Published online: July 26, 2016
This article reports on tutor and student experiences of moving from paper-based marking and feedback to electronic marking of student coursework at a small specialist university in the UK. The account is informed by a tutor focus group and a student survey, and takes a grounded approach. This research contributes to a growing body of literature on electronic marking and feedback and provides new insights from the perspectives of the two major stakeholder groups. Tutors primarily used GradeMark, but those more comfortable with Microsoft software opted to work with Microsoft Word's Track Changes feature. The study found that tutors could mark electronically with success, but had unresolved concerns over matters such as accountability and the practical application of rubrics. Through the pilot, tutors re-examined some of their existing practices. Students reported that electronic approaches brought better alignment between criteria and mark, improved legibility, and greater convenience. However, they expressed frustration with pre-prepared comments and with submission protocols that did not align with the digital experience. The conclusions show that the experience of markers and students is mixed, and that the process requires further optimization to take full advantage of its potential pedagogical benefits. The exercise also acted as a catalyst, encouraging markers to re-evaluate their assessment practices.
Keywords: Assessment, Marking, Online, Feedback, Usability, Annotation, Legibility
The International Journal of Assessment and Evaluation, Volume 23, Issue 3, September 2016, pp. 35-47.
E-Learning Technologist, Technology Enhanced Learning, Harper Adams University, Newport, Shropshire, UK
Educational Developer, Harper Adams University, Newport, Shropshire, UK