Authors: Denton, P.; Rowe, P.
Publisher: Taylor & Francis
Languages: English
Types: Article
Subjects: LB2300, RS

Classified by OpenAIRE into: mesheuropmc: education
Abstract: Electronic marking tools that incorporate statement banks have become increasingly prevalent within higher education. In an experiment, printed and emailed feedback was returned to 243 first-year students on a credit-bearing laboratory report assessment. A transmission approach was used, students being provided with comments on their work but no guidance as to how they should use these remarks. A multiple-choice question test, undertaken before and after the return of feedback, was used to measure learning. Although returned comments included model answers, test scores showed no overall enhancement, even when students' marks for their laboratory reports were initially hidden. A negative and significant (p = .010) linear trend between relative test scores and test date suggests that even modest improvements in subject knowledge were lost over time. Despite this, students could accurately guess their mark based on emailed feedback alone, estimated and awarded marks being linearly correlated (p < .001). It is concluded that statement banks organised according to published assessment criteria can ultimately help students to appreciate how their work was graded. However, students should be encouraged to produce a structured response to received feedback if self-assessment is to occur.

