Digital Feedback for Progression

Can digital feedback accelerate student progression?

Introduction

This was a collaborative project, sponsored by the Education and Training Foundation, between Abingdon and Witney College and Newham College, exploring the extent to which providing students with digital feedback could accelerate student achievement and reduce teachers' total marking time.


Across the partnership we worked with 102 learners and 24 practitioners including:



  • Ten subject specialists
  • Six E-learning/Digital learning specialists
  • Two Heads of Teaching and Learning

Our Approach

We were really pleased to take part in this project, but with time not in our favour we broke the project into five stages, which allowed us to complete it in a relatively stress-free way.

Stage 1 - Set Up

The set-up phase of the project consisted of establishing initial relationships for collaboration, signing a partnership agreement, setting project outputs, and scheduling meeting and deadline dates between the two project partners. Skype meetings were scheduled early in the project, and both colleges collaborated well using this method.

Stage 2 - Research

Both project partners felt it would be beneficial to have a research phase, in which a range of digital tools for feedback could be trialled to ascertain the best ones to use for the action research stage of the project. During this phase, five of the nine digital tools considered were selected for trial.


Each of the tools was trialled by teachers from both colleges against a mutually agreed set of criteria, outlined below:



  • Accessibility – across platforms, technical requirements, training needs
  • Free/low cost
  • Innovative/engaging
  • Flexible/adaptable for different levels
  • Sustainability
  • Reliability/consistency – are results accurate, and what data does the resource give us?

By testing possible digital tools, we were also able to establish a baseline of our students' attitudes towards feedback. We asked them:



  1. How do you normally receive feedback from your teachers either at school or in college?
  2. What worked? How did the feedback you received help you to improve your work?
  3. What didn’t work?
  4. What impact did the feedback have on your progress?
  5. Why do you think this is?
  6. Would you like to receive digital feedback to help you progress in college?
  7. What type of digital feedback would be suitable for you?


Overwhelmingly, our learners indicated that they had previously received only written feedback on their work. Most startlingly, 40% of respondents to the initial survey stated that they had read the feedback but had not understood it. This suggested that a significant number of our learners are unable to use the feedback they receive to progress, and that an additional or alternative way of delivering feedback is needed.


Stage 3 - Training and Development

We arranged for training sessions on MS Forms, OneNote and Go Formative to be delivered over the summer break for all teachers involved in the project. Across the partnership, ten subject specialists received training on the digital tools.


Marking time logs were also introduced for staff to record marking time.
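As an illustration of how such logs could be analysed (the field names and figures here are hypothetical assumptions, not the project's actual log format), marking-time entries might be aggregated per feedback method like this:

```python
from collections import defaultdict

# Hypothetical marking-time log entries: (teacher, tool, minutes spent).
# Values are illustrative only, not data from the project.
log = [
    ("T1", "MS Forms", 0),   # auto-marked quiz, no manual marking time
    ("T1", "paper", 45),
    ("T2", "OneNote", 30),
    ("T2", "paper", 50),
]

def total_minutes_per_tool(entries):
    """Sum logged marking minutes for each feedback method."""
    totals = defaultdict(int)
    for _teacher, tool, minutes in entries:
        totals[tool] += minutes
    return dict(totals)

print(total_minutes_per_tool(log))
```

A simple aggregation like this makes it easy to compare, at the end of the trial, how much marking time each method actually consumed.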

Stage 4 - The Trial

Learners' starting points were measured using the percentage score achieved in their first formative assessment. Learners were then divided into control groups, which did not receive digital feedback, and trial groups, which did. Teaching was not affected by this split: all learners continued to receive the same quantity and quality of teaching, and the only difference was the type of assessment (digital or paper-based).
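A minimal sketch of how progression could then be compared between the two groups, using each learner's first and final formative percentage scores (the figures below are illustrative assumptions, not the project's data):

```python
def mean_gain(scores):
    """Average improvement from first to final assessment, in percentage points."""
    gains = [final - first for first, final in scores]
    return sum(gains) / len(gains)

# (first %, final %) per learner — hypothetical figures for illustration.
control = [(55, 60), (48, 52), (70, 73)]  # no digital feedback
digital = [(54, 63), (50, 58), (68, 75)]  # received digital feedback

print(mean_gain(control))  # → 4.0
print(mean_gain(digital))  # → 8.0
```

Comparing the mean gain of the two groups against the same starting measure is what lets any difference be attributed to the feedback type rather than to teaching.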

Stage 5 - Monitor and Review

During the monitoring phase of the project, both teachers and learners completed mid and end point reflections, with questions following the same format as the start-point survey.

What We Learnt

Learner Attitude to Feedback

Written feedback is not universally read and understood by our learners, and if we ignore this we will not be providing the best possible support for all of them. Learners prefer digital feedback, but the difference is not as pronounced as might be expected: at the end of the project 52% of learners said they preferred to receive digital feedback, but 21% said they would rather not. Digital feedback can make a difference by engaging learners, but replacing traditional methods of feedback entirely may disadvantage some learners.

Saving Teacher Time

We set out to test whether digital feedback would save teacher time, and what we have learned is that, yes, it can, but only in the right conditions. Teachers saved time on marking: the MS Forms quiz did the marking for the teacher and instantly gave students their percentage score. Using MS Forms eliminated the need for marking; however, this type of tool works best for closed, true/false or yes/no questions. Results for longer, open-ended questions were more mixed at the end of the trial. Teachers were evenly split: 33% reported that digital tools saved time, 33% felt there was no difference, and 33% felt it took longer.

Reflections


  • Taking the time to research possible digital tools and assess their suitability was, we think, vital to the success of the project and we would encourage anyone thinking of trying this to make time to thoroughly work through the array of options, conducting mini trials if needed.



  • Central to making the research period work was establishing clear, robust criteria for deciding which digital tools to use. For us, having those criteria and the opportunity to engage with colleagues in mini trials was vital to the success of the project.