What is it?
Workshop is one way Moodle staff can set up student peer marking in Moodle. The activity can accommodate different kinds of submitted work and can happen before, alongside, or independently of the standard tutor assessment. Workshop lets staff set up forms, rubrics or marking guides to support students in making judgements. Student assessments can be feedback-only or feedback and numeric grades. Tutors can optionally assess the students' submissions and assessments, and weight their own judgements in relation to students'. Peers' submissions and assessments can be kept anonymous if needed. Workshop also allows self-assessment.
The basic stages of a Workshop activity - set out in more detail below - are:
- Staff set up a Workshop assignment to which students will submit their work. This includes authoring instructions for submission and marking, and setting how many pieces of work each student marks, by when, how work is allocated, whether marking is anonymous, and whether it is feedback-only or numerically graded.
- Students submit their work.
- Students mark each other's work.
- Tutors review the submissions and assessments, and optionally give their own marks for these.
- Students receive the feedback and/or marks given by their peers and (if used) a mark from their Tutor for the assessments they have given.
- A Tutor closes the activity, at which point the marks and feedback are revealed.
Why use it?
Peer marking requires students to make judgements about the work of their peers. Peer markers may be expected to give feedback only (which we'll refer to here as 'peer feedback'), or feedback along with a numeric mark (which we'll call 'peer assessment').
A well-conceived, well-enacted peer assessment activity can advance:
- Students' ability to understand and work with assessment criteria.
- Students' ability in the authentic academic practice of peer review.
- Insights into how students can critique and improve their own work, gained through articulating judgements and producing constructive feedback.
- The possibility of feedback that is quicker, more individualised, and more plentiful than tutors are able to provide.
- The possibility of feedback on students' draft work, with sufficient time for amendments before its deadline.
- Avoiding 'learned dependence' (Yorke, 2003) - students' over-reliance on tutor opinions and over-humility about the importance of their own understandings.
- Triangulation - the original submission, peer reviews and tutor assessment (not to mention self assessment where used) can be compared, giving students new perspectives on their submission, the criteria, and the reviews they have written.
- Relatedly, insights into subjectivity and governance in the assessment process.
- Also relatedly, a departure from monologic, transmissive feedback as students weigh up the differences in the reviews. This in turn promises a desirable change in the way feedback is received from simple certainties to more sophisticated, evaluative thinking (Schommer, 1990).
- An occasion for dialogue with tutors and peers about assessment.
Keith Topping (2009) suggests explaining to students,
"...that peer assessment involves students directly in learning, and should promote a sense of ownership, personal responsibility, and motivation. Teachers can also point out that peer assessment can increase variety and interest, activity and interactivity, identification and bonding, self-confidence, and empathy with others."
Won't students take each other's ideas? A widely-held concern - but Richard Milne (UCL Centre for Virology) comments on his own experience of setting up peer review activities, "I wasn't worried about students stealing each other's ideas ... you discuss a subject with somebody else and then formulate your own way of thinking about it based on the conversation you’ve had". Students can be encouraged to credit each other's ideas (and a convention can be agreed for circumstances of anonymity).
Can students at any level of knowledge carry out good peer reviews? Potentially, yes - a meta-analysis by Falchikov and Goldfinch (2000) found that peer assessment at introductory levels was as valid as at higher levels. They attribute this to good preparation. McConlogue (2014) suggests beginning early with low stakes assessment - perhaps the introduction of a draft essay - and building up student expertise over the duration of a programme.
Can peer assessment work in every subject area? Yes, it seems so - Falchikov and Goldfinch (2000) found that subject area has little effect on the validity or reliability of peer assessment. They also report that peer assessments of academic products (e.g. essays, posters) or processes (e.g. oral presentation skills, groupwork participation) tend to have more validity than those in the context of professional practice (e.g. internships). This may relate to students' greater familiarity with academic products and processes. Their research also suggests students may struggle with peer assessment across multiple unfamiliar disciplines - so it is wise to stick to one new discipline at a time.
Will students be willing to do it? That depends. Read on.
For general peer assessment design principles and case studies, see the University of Strathclyde's PEER Toolkit and contributions from Eva Sorensen (UCL Chemical Engineering), Richard Milne (UCL Virology) on UCL's Teaching & Learning Portal.
Before I start...
- The first run of any new activity needs time and careful planning - especially one this coordinated and interpersonal. As you and your students repeat the process, it becomes far quicker and easier to run than the first time.
- Peer marking activities stand or fall on the strength of the explanations and instructions you give to students. These relate to the assessment criteria and the underlying theory of peer marking, as well as to the technology.
- Though the Moodle Workshop activity presents staff with a clear dashboard list - called the Planner - of who has and hasn't completed the two stages, it doesn't currently let you generate the same kind of participation reports as other Moodle activities. So if you have a very large cohort, we suggest using groups to make the list more manageable.
How do I set one up?
Creating the Activity
- Turn editing on.
- Click Add an activity or resource, select Workshop.
- Complete settings as detailed below. Keep in mind that what you put in the text fields of the settings plays a crucial orienting role, displaying at certain times in certain places - we give guidance below.
- Workshop name - this will display as the link to the activity, so make it concise and meaningful.
Description - this displays for students on the front page of the activity just under the Planner (dashboard). Give a brief motivating introduction to the activity (instructions for each phase come later).
Example description:
Below you will be asked to submit your case study, peer mark two other students' case studies, and self-mark your own. Your overall mark will be based on the marks you receive from peer markers for your submission, and from a tutor for the feedback you give as a peer marker yourself.
You will find the precise deadlines on the Planner here.
Participation - the way we have set up this activity, only students who make a submission will be able to peer mark. Because, at the deadline, Moodle is set to automatically allocate submissions to peer markers, we cannot give extensions. If you cannot submit work then you need to go through the extenuating circumstances procedure.
Anonymity - both the work you submit and the peer assessments you give will be anonymous to fellow students - you will not know whose work you are marking nor who has marked yours. However, tutors will be able to see this information.
How you will be marked - the mark you receive for your submission will be based on the average mark you received from your peer markers, and will be 50% of your overall mark. The feedback you give as a peer marker will be marked in turn by a tutor, and an average of those marks will comprise the other 50% of your overall mark.
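The 50/50 scheme in the example above can be sketched as a quick arithmetic check. This is a hypothetical helper for illustration only (the function name and inputs are assumptions, not Moodle's own calculation):

```python
def overall_grade(peer_marks, tutor_marks_for_reviews, submission_weight=0.5):
    """Average of peer marks for the submission, plus average of tutor marks
    for the reviews the student wrote, weighted 50:50 by default."""
    submission_avg = sum(peer_marks) / len(peer_marks)
    assessment_avg = sum(tutor_marks_for_reviews) / len(tutor_marks_for_reviews)
    return submission_weight * submission_avg + (1 - submission_weight) * assessment_avg

# A student whose submission received 60 and 70 from two peer markers, and
# whose two reviews were marked 80 and 60 by a tutor:
print(overall_grade([60, 70], [80, 60]))  # 0.5 * 65 + 0.5 * 70 = 67.5
```

Changing `submission_weight` shows how shifting credit between the submission and the reviews affects the overall mark.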
Grading strategy determines the assessment form used and the method of grading submissions.
There are 4 alternative grading strategies:
- Accumulative grading - each assessment criterion has its own numeric grade along with optional weighting and optional comments; a final grade is calculated on the basis of the separate grades and their respective weightings.
- Comments - no numeric grade but feedback only, as comments either in a single field or as responses to a series of guiding questions.
- Number of errors - markers decide whether the work they are marking has passed or failed each criterion (e.g. has original ideas; answers its question).
- Rubric - generates a numeric grade based on the level of achievement markers choose for each criterion. You'll be able to define your criteria and, for each criterion, as many levels as you need. The Rubric will display with a free text field for Overall Feedback. Do note that Rubric will generate a numeric grade and cannot be set to feedback-only - in other words you will need to assign a single numeric grade to each level, and those marks must be unique. Note too that it is not currently possible to import or reuse rubrics created elsewhere.
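To illustrate how a strategy such as Accumulative grading turns per-aspect grades and weightings into a final grade, here is a minimal sketch. It assumes a weighted mean of normalised aspect grades, which captures the idea; Moodle's exact rounding and scaling to the maximum Grade for submission may differ:

```python
def accumulative_grade(aspects):
    """Weighted mean of per-aspect grades, each normalised by its maximum.
    `aspects` is a list of (grade_given, max_points, weight) tuples.
    Returns a percentage - a sketch of the idea only."""
    total_weight = sum(weight for _, _, weight in aspects)
    weighted_sum = sum(weight * (grade / maximum)
                       for grade, maximum, weight in aspects)
    return 100 * weighted_sum / total_weight

# Two aspects marked out of 10, the first weighted twice as heavily:
print(accumulative_grade([(8, 10, 2), (5, 10, 1)]))  # (2*0.8 + 1*0.5)/3 = 70%
```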
Grade for submission and Grade for assessment allow you to set a maximum grade which can be obtained on a piece of work.
The Submission grade to pass and Assessment grade to pass boxes allow you to set a minimum grade required for students to pass the assessment.
Instructions for submission display to students above the place where they submit their work. It is a good place to explain the settings and any conditions you have set up.
Example submission instructions:
Submit your work below [link to guidance]. Please note that there are no late submissions for this activity - you need to submit work before the deadline in order to be allocated others' work to mark, and in order to receive marks and feedback yourself.
Maximum number of submission attachments - if set to none then students will need to paste their submission directly into a Moodle text field and, since Workshop does not have autosave, you may need to suggest they draft elsewhere and paste into Moodle. Alternatively you can allow files, in which case let students know how many you expect and in which format.
Late submissions - if late submissions are allowed there is no way to automatically allocate peer markers to them. You'll need to manually allocate markers to these late submissions, or mark them yourself.
Instructions for assessment - this displays to students during the assessment phase and appears just under the Planner and above their allocation. Like the instructions for submission they are important for orienting and motivating students.
Example assessment instructions:
Immediately after the submission deadline you will be asked to complete a practice assessment. Once you've done this you'll be able to start giving feedback on the fellow students' work you have been assigned, until the assessment deadline.
The submissions have been assigned to you randomly and automatically.
Use self-assessment - this allows students to be allocated their own work to assess.
Overall feedback mode - provides a field for students to type summary comments which they have not made elsewhere.
Maximum number of overall feedback attachments - this may be none. Attachments come in handy if you are, say, asking peer markers to comment directly onto the work. There's no need to ask peer markers to upload a completed marksheet here, since Moodle Workshop provides you with ways to make these (see below).
Conclusion - this displays to students after the activity has been closed i.e. when they can come and collect their feedback and/or marks. It appears just under the Planner and above their feedback.
Example submissions are a practice assessment opportunity for students.
Use examples - this will allow you to give students a piece of work so they can practice peer marking without their feedback reaching a genuine peer. You will also provide an exemplary assessment for them as a reference.
Modes of examples assessment - choose when students encounter the example, and whether they have to mark it.
Availability - here you set dates for the Submission and Assessment phases.
Switch to the next phase after the submissions deadline - this begins the Assessment phase automatically at the Submission deadline (the alternative is to manually switch phases when you're ready). If you are going to switch automatically you will almost certainly need to set up Scheduled Allocation - see below.
Common module settings & Restrict access
These are the same as other Moodle activities.
You will be able to edit many of these settings retrospectively, but be aware that many of them display in the students' Planner view, and students may plan around them - so once the Submission Phase is underway it's best to leave the settings alone unless you have a chance to negotiate changes with students.
When you are ready click Save and Display; the Setup Phase page displays.
During this phase only Moodle editors are active - students aren't involved yet.
The example Planner shown below is telling us that before we switch to the Submission Phase, we first need to edit the Assessment Form and prepare Example Submissions. Under that, the Description added in the Workshop Settings should display.
Editing the Assessment Form
To set up the assessment, in the Settings sidebar block click the link to Edit Assessment Form; what displays here depends on the Grading Strategy you chose in the Settings (you can still change this during the Setup Phase).
If you chose Accumulative Grading you'll be asked to define one or more Aspects. For each aspect you'll specify the maximum points available or which scale to use - you can have different points or scales for each aspect. You'll also be asked to give each aspect a weighting in relation to the others. Peer markers will see the definition of the aspect, a place to enter the grade or scale, and a field for their comments.
If you chose Comments you'll be asked to define one or more Aspects - these are the questions or feedback points which will display to peer markers, with a blank text field under each. This is a feedback-only marking strategy so there are no numeric grades or weightings to set.
If you chose Number of Errors you'll be asked to define one or more Assertions, along with the terms peer markers will use to indicate whether each has been met or not met. You'll also be asked to give each assertion a weighting in relation to the others. Save And Continue Editing to activate the Grade Mapping Table, which turns the weightings you have chosen into a number of possible errors and relates these to the overall number of points available for the Workshop activity - you then need to make sure that the values in each menu match their adjacent values.
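The logic behind the Grade Mapping Table can be sketched as follows. The mapping values and the function name here are invented for illustration - the weights of the assertions a marker judged 'not met' are summed into an error count, which is then looked up in the table:

```python
def errors_grade(failed_weights, grade_map):
    """Number-of-errors sketch: sum the weights of the failed assertions,
    then look up that error count in the grade mapping table."""
    errors = sum(failed_weights)
    key = min(errors, max(grade_map))  # clamp to the table's largest entry
    return grade_map[key]

# Hypothetical mapping: 0 errors -> 100%, 1 -> 80%, 2 -> 60%, 3 or more -> 0%
grade_map = {0: 100, 1: 80, 2: 60, 3: 0}
print(errors_grade([1, 1], grade_map))  # two weight-1 assertions failed -> 60
```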
If you chose Rubric you'll be asked to define your criteria and levels of achievement. You need to assign a unique grade to each level. Extra levels will appear 'on demand' - first complete those that display by default, then click Save And Continue Editing - when the page reloads it will display extra blank levels. You will be able to choose whether the rubric should display to students as a (vertical) list or (horizontal) grid (list is best where there are many levels).
When you are ready you will next need to manually enable the Submission Phase - during this phase students submit their work. Only after the Submission Phase deadline do they begin to peer mark.
Adding Example Submissions
Still in the Setup Phase - if (in the initial Workshop Settings) you enabled an example submission, the Planner now prompts you to add one by clicking the button.
You are then prompted to paste or type an example submission directly into Moodle, which you then 'mark' in an exemplary way according to your chosen grading strategy. This becomes a reference for students. If you want to show students a range of work, you can add more than one example - but if you do add more than one it is probably not a good idea to force the student to assess them all before being able to proceed.
When a student assesses the example, their assessment does not accrue marks. They are shown the example submission and the Assessment Form set up earlier, and after completing the form and saving the assessment, they are presented with the tutor's reference assessment for comparison.
Students have the opportunity to re-assess the example.
Make students anonymous, both as authors and markers
Students tend to ask for anonymity, both for their submissions and assessments. Our evaluations of Workshop have indicated that without anonymity (and depending on the experience and trust among students) "everyone is holding back" on passing judgement on their peers' work. This is to be expected - at least while they are inexperienced at peer marking. Anonymity is something to negotiate with students and be very clear about.
Here is how to set anonymity for both Author and Marker (which Moodle calls 'Reviewers').
The default is that Authors' identities can be seen by Reviewers (markers). Staff can change this as follows and as illustrated below:
- From the Workshop's Settings menu, select Permissions; the Permissions page for that particular Workshop activity displays.
- Filter by the word: view.
- From the shortened list, find the permission for View Author Names and delete the Student role from it by clicking the adjacent 'x' icon.
- Note the permission for View Reviewer Names doesn't include Student - no need to do anything here.
Finally return to the Workshop front page by clicking the Back To Workshop... link at the bottom.
Starting the Submission Phase
Even if you enabled a start date in the Workshop Settings, you will still need to go to the Planner and manually click the lightbulb icon in the Submission Phase column, as illustrated, so it appears shaded green.
It often helps to send a News Forum message reminding students that submission is underway, when it closes, and where to find it.
To monitor submissions, refer to the Planner to see how many students have submitted work so far; below that you can filter by Group (if you have set groups up).
Allocate submissions (before the Assessment Phase)
Allocation of submissions to peer markers could be done as student submissions arrive, but it is probably easiest to wait until after the deadline - and easier still to set Moodle Workshop up to do this automatically.
Scheduled allocation (automatic after the Submission Phase ends)
You may not want to do this if you are expecting students to submit late. However, it does offer the benefits of moving things smartly along and allowing students to peer mark while their own work is still fresh in their minds.
The most automated this gets is as follows - in the Workshop initial settings, in its Availability section:
- Ensure you have set a Submissions Deadline date;
- and that you have checked the box for Switch To Next Phase ...;
- and that you have set an Open For Assessment From date.
- and (since these dates display to students) that you have set a Deadline for Assessment.
- Save and Display; the Planner displays.
There's one more thing to do for these settings to be enacted.
- From the Workshop's Settings sidebar block, select the Allocate Submissions link and configure as follows.
- Click the Scheduled Allocation tab; its settings display.
- Check the Automatically Allocate Submissions At The End Of The Submission Phase checkbox.
- Decide on the number of reviews (assessments) which should be made (per peer marker or per submission, depending on your preference). How many is enough? You need to predict whether all or just some students will do the peer marking - you want to avoid overloading students while ensuring that everybody gets some peer feedback. Our evaluations suggest students place a higher value on the feedback they receive than the feedback they give - even though the latter is considered more educationally potent.
- Decide whether students should be able to peer mark if they themselves have not made a submission.
- Decide whether students should self-assess. Checking the Add Self-Assessments box will assign peer markers their own submission in addition, so keep students' workload in mind when you set the number of reviews.
- Save Changes, and then click on Allocate Submissions again to check things are as you intended. There should be some ticks and confirmations. Set up as illustrated below, you will not have to manually progress the activity from Submission Phase to Assessment Phase.
Manual allocation
You are given a list of the participants (students on the course) and then a left column to assign them a reviewer and a right column to assign them a reviewee. A participant must submit something before they can be assigned a reviewer; however, you can assign someone to review someone else's work even if they have not submitted themselves.
Random allocation
The other way you can allocate reviewers and reviewees is by random allocation. If you have groups set up on the Workshop activity then you can set the allocations to distribute among group members. You set how many students you want to review each piece of work. You can choose to override any previous allocations - for example, if you have some extra submissions you may wish to re-do the random allocations. There is also the option to allow those who have not submitted to be reviewers; if you do not tick this, anyone who has not submitted will not be allocated a reviewee.
Once you have selected the relevant settings click Save changes.
When the random allocation has completed you will get a confirmation screen like this, which will also note any problems, such as not being able to allocate to individuals who have not submitted. Note that the example screen below looks more complicated than it would on a first allocation run, as it is the result of changing allocations as well as applying new ones.
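As a rough illustration of what an allocator has to achieve - every submission receiving the set number of reviews, and nobody reviewing their own work - here is a simplified circular scheme. This is an assumption-laden sketch for intuition, not Moodle's actual algorithm:

```python
import random

def allocate(authors, reviews_per_reviewer, seed=None):
    """Circular allocation sketch: shuffle the submitting authors, then
    give each one the next k authors' submissions around the circle.
    Every submission gets exactly k reviews and nobody reviews their own."""
    rng = random.Random(seed)
    order = list(authors)
    rng.shuffle(order)
    n = len(order)
    return {
        order[i]: [order[(i + j) % n] for j in range(1, reviews_per_reviewer + 1)]
        for i in range(n)
    }

# Four submitting students, two reviews each:
allocation = allocate(["ana", "ben", "cam", "dee"], 2, seed=1)
```

Note this only works when the number of reviews is smaller than the number of submitting students - which is also why, in Moodle, late submitters fall outside an allocation that has already run.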
Assessment phase
Once submissions are complete and reviewers have been allocated, it is time to move into the Assessment phase. This can be done manually, but if you set it up in the initial settings then this will happen automatically at the set date.
From a tutor or course administrator perspective there is not much more to do in the assessment phase once you have activated it. This is the students' time to review their peers' work and, whilst doing so, also reflect on their own pieces.
As grades or comments are given by peers to one another, you will start to see them appearing in the table. This allows you to chase up anyone who has not yet assessed before the assessment phase closes.
Names highlighted in red have still to give feedback; names in black have completed, and where grades are given you can see the grades awarded.
Grading and evaluation phase
This phase is for tutors or course administrators: it is your chance to review the comments and grades given by students and make amendments where markers have been either too lenient or too harsh. It enables you to moderate the grading and ensure it has been done fairly. You can also allocate marks to reviewers based on the quality of their reviews, if you wish.
Once you switch to the Closed phase, students can see any grades or comments they have been left by the other students. This is the end of the process.
If you have a specific question about the tool please contact the Digital Education team.
- Setting the Workshop activity to automatically switch phase from submission to assessment, and to automatically make allocations, is one of the most labour-saving aspects of Moodle Workshop. However, it also means that if there are submissions after the allocations have been made, a tutor needs to manually allocate that work to assessors. Tutors need to work out a process to handle this, especially if the activity is compulsory and credit-bearing, because assessors shouldn't be surprised by extra allocations late in the day. One way is to build in a few days' delay before switching from submission to assessment and running the allocation. Another, complementary, way is to set the activity to allow assessment by students who have not themselves made submissions - that way the student will be allocated work to assess, and their own work can be marked by the tutor.
- To recognise and incentivise the assessments, if using numeric marks (rather than feedback only), make available a proportion of credit for the assessment as well as - or even instead of - the submission.
- Include instructions in the settings of the Workshop. Pay attention to how the instructions display for students and adjust them accordingly.
- Make diary entries to monitor the activity and (notwithstanding the Upcoming Events block) to send students reminders a few days before the submission and assessment deadlines, and after the activity is closed. If students are new to the activity, do include instructions about where they can find their allocated work and their feedback. Use the News Forum for this (unfortunately, since Moodle Workshop is multiphased, the Moodle Course Participation report doesn't help us here).
- Anonymity - do go into the Workshop Permissions and change the Permission for View Author Names so that marking is blind; this reduces the politics and social discomfort of passing judgement, and makes peer markers more comfortable about giving constructive criticism rather than platitudes.
Examples and case studies
Questions and answers
What do students do?
See this brief separate guide for students using the Moodle Workshop activity.
References
- Bloxham, S., & West, A. (2007). Learning to write in higher education: students’ perceptions of an intervention in developing understanding of assessment criteria. Teaching in Higher Education, 12(1), 77–89.
- Cartney, P. (2010). Exploring the use of peer assessment as a vehicle for closing the gap between feedback given and feedback used. Assessment & Evaluation in Higher Education, 35(5), 551–564.
- Covill, A. (2010). Comparing Peer Review and Self-Review as Ways to Improve College Students’ Writing. Journal of Literacy Research, 42(2), 199–226.
- Falchikov, N., & Goldfinch, J. (2000). Student Peer Assessment in Higher Education: A Meta-Analysis Comparing Peer and Teacher Marks. Review of Educational Research, 70(3), 287–322.
- McConlogue, T. (2012). But is it fair? Developing students’ understanding of grading complex written work through peer assessment. Assessment & Evaluation in Higher Education, 37(1), 113–123.
- McConlogue, T. (2014). Making judgements: investigating the process of composing and receiving peer feedback. Studies in Higher Education, 1–12.
- Milne, R., (2013). Peer review of virology essays. Available from: https://www.ucl.ac.uk/teaching-learning/case-studies-news/assessment-feedback/peer-review-of-virology-essays
- Nicol, D., (2007). Peer Evaluation in Assessment Review project. Available from http://www.reap.ac.uk/PEER.aspx
- Nicol, D., (2010). From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501–517.
- Orsmond, P. (2004). Self- and peer-assessment: guidance on practice in the biosciences. Leeds: Centre for Bioscience, Higher Education Academy.
- Sadler, D. (2010). Beyond feedback: developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550.
- Saito, H., & Fujita, T. (2004). Characteristics and user acceptance of peer rating in EFL writing classrooms. Language Teaching Research, 8(1), 31–54.
- Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82(3), 498–504.
- Sorensen, E. (2013). Experiences of using peer assessment in a 4th year design module. Available from: http://www.ucl.ac.uk/teaching-learning/case-studies-news/assessment-feedback/peer-assessment-chemical-engineering
- Topping, K. J. (2009). Peer Assessment. Theory Into Practice, 48(1), 20–27.
- Yorke, M. (2003). Formative assessment in higher education: moves towards theory and the enhancement of pedagogic practice. Higher Education, 45, 477–501.