
Havemann, Leo; Sherman, S.
Languages: English
“Assessment and feedback lies at the heart of the learning experience, and forms a significant part of both academic and administrative workload. It remains however the single biggest source of student dissatisfaction with the higher education experience” (Ferrell, 2012).

In recent years, UK agencies such as Jisc (2010), the NUS (2010, 2011) and the HEA (2012) have dedicated time and resources to researching and supporting the enhancement of assessment and feedback. The NUS has argued that institutions must consider ways of making university administration more accessible through technology, including online submission of assessment (NUS, 2010), and that the use of technology in institutional administration will simplify and improve processes, including assessment and feedback (NUS, 2011). For Jisc (2010), a holistic approach which considers the technical, administrative and pedagogic elements of assessment and feedback is necessary.

The HEA’s 2012 paper “A Marked Improvement” presents a strong pedagogic argument for updating and transforming assessment methods in higher education. The paper concludes that “it is clear that there is need for institutions to continue to adopt robust technological solutions to support assessment and feedback”, since these have an impact both on supporting administrative processes and, moreover, on the student experience (HEA, 2012).

It is evident that online assessment and feedback methods and processes are increasingly essential to the student experience and must be regarded as a priority for education institutions. This session considers the experiences of the members of the Bloomsbury Learning Environment (BLE) consortium, a group of geographically close HE institutions based in London, where there has been increasing interest – and concern – in this area. For example, low scores in the National Student Survey relating to feedback (common across the sector) provided a driver to place greater attention on improving assessment mechanisms (technology, process and quality). Learning technologies play an important role in offering efficient, accessible ways of managing assessment and providing feedback, but the affordances of specific technologies can also constrain practice.

This presentation describes a cross-institutional project conducted by members of the BLE, which aimed to explore and improve the online assessment and feedback processes, practices, opportunities and technologies available to the member institutions. In the project, we sought to identify the expectations and goals of key stakeholders (for example academic administrators, teaching staff and learning technologists) regarding the deployment and use of e-assessment and feedback. We assessed and evaluated appropriate technologies to support e-assessment across the consortium, producing documentation and case studies. We also arranged practice-sharing and CPD events for academic and support staff. By working collaboratively, we aimed to facilitate a collective sharing of knowledge, expertise and resources, so that the individual partners could learn from each other and improve practice.

The presentation will report and comment on the approaches taken to disseminate lessons learned, and will highlight areas of good, innovative and interesting practice. We will also discuss the question of impact: some projects are implemented with very specific, quantifiable targets in mind, but for the more pedagogic aspects of this project there is less evidence to go on. When evaluating a collaborative project of such wide scope, understanding how to identify and quantify improved practice is an ongoing challenge.

References

Ferrell, G. (2012) A View of the Assessment and Feedback Landscape: Baseline Analysis of Policy and Practice from the JISC Assessment and Feedback Programme.

Jisc (2010) Effective Assessment in a Digital Age: A Guide to Technology-Enhanced Assessment and Feedback [online]. Bristol: HEFCE. Available: www.jisc.ac.uk/media/documents/programmes/elearning/digiassass_eada.pdf

HEA (2012) A Marked Improvement. Available: https://www.heacademy.ac.uk/resource/marked-improvement

NUS (2010) Student Perspectives on Technology. Available: https://www.hefce.ac.uk/media/hefce/content/pubs/2010/rd1810/rd18_10.pdf

NUS (2011) NUS Charter on Technology in Higher Education. Available: https://www.ucisa.ac.uk/~/media/Files/about/news/2012/ICT_Charter_Final1.ashx
Contact: Leo Havemann (@leohavemann); Sarah Sherman (@BLE1)
