In summer 2020, MTU received approval from the National Forum for the Enhancement of Teaching & Learning in Higher Education for a project entitled 'RAFT – Reimagining Assessment and Feedback Together' as part of the Strategic Alignment of Teaching and Learning Enhancement Funding in Higher Education Call 2020.
RAFT took a strategic approach to enhancing Assessment and Feedback practices, building on the successful SATLE 2019 project, the Building Assessment Literacy Initiative (BALI), in which students and staff collaborated to understand challenges and identify enhancements to student and staff experiences of the assessment process.
Our vision for MTU’s RAFT project was to enhance the Assessment and Feedback (A&F) experience for students and staff by providing structure, time and expertise to staff to explore (and hopefully solve!) an A&F challenge.
- Structure was provided through an online, flipped-classroom, professional development programme utilising Action Research to develop an intervention for a module.
- Time was provided through a one-hour timetable alleviation running across Semester 1 (to identify, plan and prepare an intervention) and Semester 2 (to implement, monitor and gather evidence of the intervention’s impact).
- Expertise was provided by the participants, course team, external inputs (Bovill (2020), Carless (2017), Nicol (2007)) and, critically, student partners (those who had experienced, or were experiencing, the module).
RAFT PROJECTS
The following RAFT projects centre on several thematic areas, including rubrics, peer review/feedback, oral assessment, plagiarism, feedback literacy and authentic assessment.
Click under Staff or Project Title to review further details on each RAFT project:
| THEME | STAFF | DEPARTMENT | PROJECT TITLE |
| --- | --- | --- | --- |
| Co-creation | Dr Con Burns | Sport, Leisure & Childhood Studies | |
| Peer Assessment | Dr Alison Merrotsy | Applied Social Studies | Implementing Peer Feedback in a First Year Social Care Module |
| Student Experience | Brigid Walsh, Grainne Daly and Ruth Farrell | Tourism & Hospitality | Year 4 Student Experience of Assessment & Feedback in Hospitality Management |
| Co-creation | Deirdre Ryan | Architecture | Impact of Student Voice, Co-Creation, Feedback & Peer-Assessment on Student Engagement & Learning |
| Co-creation | Dr Aisling O'Gorman | Process, Energy and Transport | |
| Co-creation and Peer Assessment | Catherine Murphy, Denise McSweeney, Elaine O’Brien & Michelle Collins | Accounting and Information Systems; Marketing and International Business | |
A cross-discipline, multi-campus approach enabled the 23 participants to learn from diverse experiences as they piloted 16 interventions (some as teams) in 14 departments.
An Action Research approach resulted in innovative, evidence-informed interventions that directly impacted over 500 students.
These thematic groups enabled a collaborative approach, providing accountability partners, support networks and research collaborators as the interventions developed.
Focus groups, surveys and student/staff reflections evidence the interventions’ impact and form part of a suite of case studies. A symposium is planned to showcase the activity and explore how the interventions translate to different disciplinary settings.
Staff participating in the RAFT programme have noted its significant impact on their own delivery and learning, as well as how it positively challenged the way they collaborate with students in A&F settings – hear more about the programme and their experiences in this short summary clip:
What types of interventions were trialled?
Interventions developed across many areas of A&F practice with small teams forming around core themes including:
- Peer Assessment and Feedback
- Standardised Rubrics
- Co-creating criteria with students
- Feedforward and Exemplars
- Programme level change
How did the Action Research approach work for such a diverse group of participants?
We adopted an Action Research approach running over two semesters, with the aim of exploring and, ideally, solving an assessment and feedback issue through an intervention informed by the literature.
During Semester 1, we ran a weekly, online workshop for the 23 RAFT participants (drawn from 14 different departments across multiple campuses) to deliver on the following activities:
- Identify an A&F issue and explore solutions
- Plan, with student input and consultation, an intervention for execution in Semester 2
- Prepare for evidencing impact and evaluating if the intervention worked
Throughout Semester 1, small tasks were set to ensure that participants maintained momentum; these tasks also modelled different feedback practices that participants could incorporate into their own learning settings. To continue into Semester 2 of RAFT, participants were required to submit an action plan, incorporating a mini literature review, to evidence the development of their planned intervention.
During Semester 2, the focus shifted away from whole-cohort meetings to individual activity, including:
- Implementing the planned intervention
- Monitoring and responding to issues
- Gathering evidence and reporting on the process.
Ongoing cohort support was still provided through five workshops and individual consultations.
All workshops were delivered online using a flipped-classroom approach, which supported the development of an online community of practitioners who supported and challenged one another. Sessions drew on a range of inputs to develop participants’ knowledge and understanding of A&F practices and the action research process, including:
- David Carless’s work on scaling up assessment and designing feedback for student uptake
- Cathy Bovill’s research and practice on co-creation in learning and teaching
- David Nicol’s work on principles of good assessment and feedback
- The purpose of, and approaches to, undertaking action research projects
- Submitting for Ethics approval.
Given that a key aim of RAFT was to evidence and report on the efficacy of the interventions, considerable effort from the core initiative team went into supporting Ethics submissions and building Scholarship of Teaching and Learning (SoTL) practices with the participants. This did mean we had to delay the planned analysis of the interventions’ impact, as Ethics approval required that data analysis take place after exam boards.