New ways of giving feedback
• Audio and video feedback
• Screencasts
• Recycling written comments
• Online and e-feedback
• Whole-class feedback
• Using clickers (PRS)
One approach to enhancing feedback is to experiment with novel ways of giving it — 'novel' in the sense of methods that weren't technologically feasible a quarter-century ago, or that have at least become much more common now that new technologies can deliver them effectively.
One good example of the latter is 'generic' or whole-class feedback, which has recently come to the fore as an invaluable form of post-exam feedback for students who may no longer meet for timetabled classes and/or may have begun their vacation. Email or website postings offer a rapid and economical form of communication between examiners and students. A second example is the use of screencasts to give students in large first-year courses speedy access to more detailed guidance on commonly occurring problems.
Other novel means of giving feedback are even more closely bound up with new technologies. Using clickers is a fast and systematic electronic means of checking, during a lecture or other large class, how securely the students have grasped a difficult concept or issue. Automated feedback enables students to self-test with online multiple-choice questions in a form that gives them feedback on incorrect answers. Software has also been developed which makes it possible to recycle written comments (so that not every feedback comment has to be drafted from scratch) or which enables a tutor to shift from writing comments to providing audio and video feedback. Either possibility can save time, and so offers the prospect of a more manageable feedback workload or the opportunity to make fuller comments than would otherwise have been feasible.
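To make the idea of automated self-test feedback concrete, here is a minimal sketch in Python. The question, options and comments are invented for illustration, and the code stands for the general technique rather than any particular quiz package: each wrong option carries its own explanatory comment, so an incorrect answer returns targeted guidance rather than a bare 'wrong'.

# Hypothetical, minimal sketch of automated multiple-choice self-testing with
# feedback on incorrect answers. All question content is invented for illustration.

QUESTION = {
    "stem": "Which measure of central tendency is most affected by outliers?",
    "options": {
        "a": ("Median", "The median depends only on rank order, so extreme values barely move it."),
        "b": ("Mean", None),  # the correct option needs no remedial comment
        "c": ("Mode", "The mode is simply the most frequent value and ignores magnitudes."),
    },
    "correct": "b",
}

def give_feedback(question, chosen):
    """Return immediate feedback tailored to the option the student chose."""
    label, comment = question["options"][chosen]
    if chosen == question["correct"]:
        return f"Correct: '{label}'."
    return f"'{label}' is not right. {comment}"

if __name__ == "__main__":
    print(QUESTION["stem"])
    for key, (label, _) in QUESTION["options"].items():
        print(f"  {key}) {label}")
    # A student choosing option (a) gets a targeted explanation, not just 'incorrect'.
    print(give_feedback(QUESTION, "a"))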
FURTHER READING
Hounsell, D. (2008) The Trouble with Feedback: new challenges, emerging strategies. TLA Interchange Issue 2.
JISC (2007) Effective Practice with e-Assessment. An overview of technologies, policies and practice in further and higher education. Bristol & London: JISC.
Audio and video feedback
The MP3 player is one widely used technology that has been exploited in recent years to provide a new method of giving feedback to students: the podcast. Podcasts are often used in combination with other types of feedback, and those who use them find them an informal method which allows a good deal of feedback to be given quite quickly, much as it would be in a face-to-face meeting with a student. Several surveys have been carried out of students' responses to receiving their feedback via podcast, and most students appear to find it a positive experience, giving them detailed feedback they can listen to more than once and in their own time, while seeming more personal than written comments. A further development has been to use screen capture to provide video feedback, with the added advantage that students can see the part of their assignment which the staff member is referring to in their audio comments.
CASE EXAMPLES
France, D. and Wheeler, A. (2007) Reflections on using podcasting for student feedback. Planet, 18. Higher Education Academy Subject Centre for Geography, Earth and Environmental Sciences.
Hill, D. (2008) The use of podcasts in the delivery of feedback to dissertation students. Higher Education Academy Subject Centre for Hospitality, Leisure, Sport and Tourism Case Study.
An example of using podcasts on a distance learning MSc in Occupational Psychology, with 'lessons learned' and 'advantages gained'. http://www.jisc.ac.uk/digiassess
Jordan, J. (2004) The use of orally recorded exam feedback as a supplement to written comments. Journal of Statistics Education 12(1).
King, D., McGugan, S. and Bunyan, N. (2008) Does it make a difference? Replacing text with audio feedback. Practice and Evidence of Scholarship of Teaching and Learning in Higher Education 3(2), 145-163.
Lunt, T. and Curran, J. (2009) 'Are you listening please?' The advantages of electronic audio feedback compared to written feedback. Assessment & Evaluation in Higher Education iFirst.
McLaughlin, P. (2009) eFeedback gets personal. Centre for Bioscience Bulletin, No. 28, p.3.
Merry, S. and Orsmond, P. (2007) Feedback via MP3 audio files. Centre for Bioscience Bulletin, No. 22, p.5.
Merry, S. and Orsmond, P. (2008) Students' attitudes to and usage of academic feedback provided via audio files. Bioscience Education 11.
Micklewright, D. Podcasting as an alternative mode of assessment feedback. Higher Education Academy Subject Centre for Hospitality, Leisure, Sport and Tourism Case Study.
Nortcliffe, A. and Middleton, A. (2008) A three year case study of using audio to blend the engineer's learning environment. Engineering Education 3(2), 45-57.
Rodway-Dyer, S., Dunne, E. and Newcombe, M. (2009) Audio and screen visual feedback to support student learning. Paper given at ALT-C Conference, September 2009, Manchester.
Stannard, R. (2007) Using screen capture software in student feedback. Higher Education Academy English Subject Centre Case Study.
Case study - providing audio comments. Massey University: Innovations in Assignment Marking.
A Word in Your Ear, Sheffield Hallam University, 18 December 2009
FURTHER READING
Savin-Baden, M. (2010) The sound of feedback in higher education. Learning, Media and Technology 35(1), 53-64. This article explores recent research on, and practices used for, podcasting assignment feedback (PAF). It argues that PAF should be based on the principles of dialogic learning.
Screencasts
Screencasting is a technology that allows academics to demonstrate to students how things are done, in the way a master might show an apprentice. A screencast records the actions on a computer screen, so it is particularly useful for demonstrating, for example, how to write or use software, or stages in a calculation, as it shows the process by which something is done. It can also provide a model answer or an exemplar of a particular kind of problem. Since multiple students can access a screencast, it can be used to provide useful feedback on common problems which students encounter in an assignment.
CASE EXAMPLES
Cassidy, S. (2007) Screencasting, Blogs and Feedback. Macquarie University Learning and Teaching Centre Podcast Series on Engaging Students.
Recycling written comments
Individualised written feedback can be very important in helping students to learn. However, it is time-consuming, and increased student numbers have put more pressure on the staff time available for producing these comments. The papers in this section describe methods of "recycling" comments that lecturers find themselves frequently making on common issues in student work. In some cases comments are recycled using specialised software, and in others standard word-processing packages. The past comments are then redeployed on new students' work, edited as appropriate and often blended with other tailor-made feedback. The benefits of this approach, it is argued, include greater consistency in providing feedback and more effective use of staff time.
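As a concrete illustration of the recycling idea, the following minimal Python sketch assumes a simple tag-and-retrieve workflow: the marker tags the issues spotted in a script, stock comments are pulled from a bank and then blended with a short tailor-made remark. The bank entries and tags are invented for illustration and do not describe the specific software used in the case examples below.

# Hypothetical comment bank: stock remarks keyed by the issues markers find
# themselves raising again and again. All entries are invented for illustration.
COMMENT_BANK = {
    "referencing": "Several sources cited in the text are missing from the reference list; please check every citation against the list.",
    "structure": "The argument would be easier to follow with a brief signposting paragraph at the start of each main section.",
    "evidence": "Claims in the discussion need to be supported with data or literature rather than simply asserted.",
}

def assemble_feedback(issues, personal_note=""):
    """Combine recycled comments for the tagged issues with a tailored note."""
    recycled = [COMMENT_BANK[tag] for tag in issues if tag in COMMENT_BANK]
    if personal_note:
        recycled.append(personal_note)
    return "\n".join(f"- {comment}" for comment in recycled)

if __name__ == "__main__":
    print(assemble_feedback(
        ["referencing", "structure"],
        personal_note="Your use of the survey data in section 3 was particularly effective.",
    ))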
CASE EXAMPLES
Balfour, J. (2007) Some light at the end of the feedback tunnel? CEBE Transactions 4(2), 54-66.
Brown, J. Annotating electronic assignment copies with comments. Massey, Victoria, Otago universities and UCOL: Innovations in Assignment Marking project.
Juwah, C. et al. (2004) Enhancing effectiveness and efficiency in student feedback. Case Study 4 in: Enhancing Student Learning through Effective Formative Feedback. Higher Education Academy: Student Enhanced Learning through Effective Feedback project. Staff teaching final-year Accounting and Finance used grade-related criteria and a bank of feedback statements to provide quick and detailed feedback.
Pezdek, K. (2009) Grading student papers: reducing faculty workload while improving feedback to students. Association for Psychological Science Observer 22(9).
Online and e-feedback
Many of the innovations cited in this section are ways of providing students with feedback on online tests which they log on to in their own time. One of the advantages of this type of feedback is that it is immediate and can be accessed by students at a time of their choosing. And while the main type of question used tends to be multiple choice, it is also possible to design short-answer questions (Jordan and Mitchell, 2009). The feedback can be more or less sophisticated, with software able to go beyond simple yes-or-no responses to feedback which provides constructive suggestions for improvement.
While it presents a number of technical difficulties, online feedback does have the advantage of flexibility, and the possibility of links to other online resources. However, most of its proponents suggest that it should not be the only source of feedback that a student receives.
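As a deliberately simplified illustration of online feedback that goes beyond a bare right-or-wrong response, the following Python sketch checks a short answer for a handful of expected points and turns each missing point into a constructive suggestion. The expected points and hints are invented, and this crude keyword check is not the tailored free-text marking described by Jordan and Mitchell (2009).

# Hypothetical marking scheme: each expected point is paired with a suggestion
# that is shown only if the point is missing from the student's answer.
EXPECTED_POINTS = {
    "sample size": "Say something about how the sample size affects the reliability of the estimate.",
    "random": "Mention whether the sample was randomly selected, and why that matters.",
    "bias": "Consider possible sources of bias in how the data were collected.",
}

def mark_short_answer(answer):
    """Return immediate, constructive feedback on a short-answer response."""
    text = answer.lower()
    missing = [hint for point, hint in EXPECTED_POINTS.items() if point not in text]
    if not missing:
        return "Good: your answer covers all the points we were looking for."
    return "Partially there. To improve your answer:\n" + "\n".join(f"- {hint}" for hint in missing)

if __name__ == "__main__":
    print(mark_short_answer("The survey only asked volunteers, so there may be bias."))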
A rather different approach uses video cameras linked to a computer system to enable staff in a medical school to provide feedback without being in the room (Hughes et al. 2008).
CASE EXAMPLES
Balfour, J. (2007) Some light at the end of the feedback tunnel? CEBE Transactions 4(2), 54-66.
Esendal, T. and Dean, M. (2009) An online tool to give first-year programming students pre-assessment feedback. Italics 8 (2) 36-44. Higher Education Academy Subject Centre for Information and Computer Sciences e-journal.
Golden, K., Stripp, C. and Lee, S. (2007) Encouraging student use of feedback, reflection and engagement through web-based learning support. MSOR Connections 7(2) 7-10. Higher Education Academy Maths, Stats & OR Network newsletter.
This project evaluated how a range of technical interventions might encourage students to engage with feedback, and identified a series of recommendations around the use of technology in giving feedback. http://www.heacademy.ac.uk/resources/detail/ourwork/evidencenet/Technology_Feedback_Action
Hepplestone, S. et al. (2010) Using technology to help students engage with their feedback. A best practice guide for academic staff. Sheffield Hallam University. download leaflet
Hepplestone, S. et al. (2010) Using technology to help students engage with their feedback. A ten minute guide for senior managers. Sheffield Hallam University. download leaflet
Hughes, C., Toohey, S. and Velan, G. (2008) eMed-Teamwork: a self-moderating system to gather peer feedback for developing and assessing teamwork skills. Medical Teacher 30(1), 5-9
Jordan, S. and Mitchell, T. (2009) e-Assessment for learning? The potential of short-answer free-text questions with tailored feedback. British Journal of Educational Technology 40(2), 371-385
Khan, K., Davies, D. and Gupta, J. (2001) Formative self-assessment using multiple true-false questions on the internet: feedback according to confidence about correct knowledge. Medical Teacher 23(2), 158-163
Montague, B. Building up an electronic collection of marked assignments. Massey University: Innovations in Assignment Marking Case Study.
Murray, S. Feedback and engagement. Macquarie University Learning and Teaching Centre: Engaging Students - A Podcast Series.
Nix, I. and Wyllie, A. (2009) Exploring design features to enhance computer-based assessment: learners' views on using a confidence-indicator tool and computer-based feedback. British Journal of Educational Technology Early View
Price, G. (2006) Computer aided assessment and feedback - can we enhance students' early experience at University? New Directions 2. Higher Education Academy Subject Centre for Physical Sciences.
Tong, R. and Beynon, C. (2008) Can formative computer aided assessment assist student learning? Higher Education Academy Hospitality, Leisure, Sport and Tourism Network Case Study.
Whole-class feedback
It's tempting to cast 'generic' or whole-class feedback in the role of the permanent poor relation to individualised feedback. Yet as the table below suggests, it needn't be seen as an option of last resort. It has real advantages as a speedy means of emailing feedback on end-of-course exam scripts to students who no longer meet for timetabled classes. And as the table also indicates, it has the edge over one-to-one feedback in the greater elbow-room it offers the feedback-giver. The enlarged space for comment can be used in various constructive ways: to offer fuller explanations of aspects of the subject-matter that many students had not adequately grasped; to review the alternative approaches that could be taken to tackling a particular question or problem; or to pick out for praise especially good features of students' answers. In other words, whole-class feedback needn't be a glum post-mortem, but can widen students' grasp of what counts as good work in the subject. In this respect, it merits more general use as a complement to individualised feedback.
CASE EXAMPLES
Harland, J. (2007) Feedback to large practical classes. Centre for Bioscience Bulletin No. 22, p. 7.
Using clickers (PRS)
The use of personal response systems (PRS) or electronic voting systems (EVS) – often simply known as 'clickers' – can enhance students' learning experience in a variety of ways. At one time familiar largely from TV quiz programmes like 'Who Wants To Be A Millionaire?', handheld clickers with a choice of buttons for responding to questions are becoming more widespread. Clickers can be used in lectures to encourage engagement with the lecture content, and how well this is achieved depends partly on designing questions that test understanding as well as knowledge. Bates et al. (2006) argue that 'A good question is one where a spread of answers might be expected or where it is known that common misconceptions lurk.'
The system provides immediate feedback to students on how well they have understood the question asked, as well as giving the lecturer an immediate picture of how many students have understood a particular concept, which can be used to address any problems or to start group discussions. The system has also been used in peer feedback (see Barwell and Walker, 2009), where the advantages of anonymity in responding can be beneficial. While clickers are sometimes used simply to break up a lecture, when used well they rest on sound pedagogy rather than being a novelty whose appeal soon wears off.
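The core of what a clicker system does with a batch of votes can be sketched in a few lines of Python: tally the responses and show the spread, so that lecturer and class get immediate feedback on how well a concept has been grasped. The votes and options below are invented; a real system collects them from the handsets.

from collections import Counter

def summarise_votes(votes, options, correct):
    """Tally clicker votes and report the percentage choosing each option."""
    counts = Counter(votes)
    total = len(votes)
    lines = []
    for option in options:
        share = 100 * counts.get(option, 0) / total if total else 0
        marker = " (correct)" if option == correct else ""
        lines.append(f"{option}: {share:4.1f}%{marker}")
    return "\n".join(lines)

if __name__ == "__main__":
    # A spread of answers like this signals a common misconception worth discussing.
    votes = list("ABBBCCBBAD" * 3)
    print(summarise_votes(votes, options="ABCD", correct="C"))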
A less high-tech version is based on IF-AT (immediate feedback assessment technique) forms which work like scratchcards.
CASE EXAMPLES
Barwell, G. and Walker, R. (2009) Peer assessment of oral presentations using clickers: the student experience. Proceedings of the 3rd HERDSA Annual Conference.
Bates, S., Howie, K. and Murphy, A. (2006) The use of electronic voting systems in large group lectures: challenges and opportunities. New Directions Issue 2, 1-8. Higher Education Academy Subject Centre for Physical Sciences journal.
Beekes, W. (2008) 'Ask the audience' in lectures. BMAF Magazine 4, pp.3-4.
Cotner, S., Fall, B., Wick, S., Walker, J. and Baepler, P. (2008) Rapid feedback assessment methods: can we improve engagement and preparation for exams in large-enrollment courses? Journal of Science Education and Technology 17(5), 437-443.
de Jong, T., Lane, J., Sharp, S. and Kershaw, P. (2009) Optimising personal audience response systems technology to enhance student learning in teacher education lectures. Proceedings of the 3rd HERDSA Annual Conference.
Draper, S. (2009) Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology 40(2) 285-293
Premkumar, K. and Coupal, C. (2008) Rules of engagement - 12 tips for successful use of 'clickers' in the classroom. Medical Teacher 30(2), 146-149
Robinson, C. and King, S. (2009) Introducing electronic voting systems into the teaching of Mathematics. MSOR Connections 9(1) 29-33. Higher Education Academy Maths, Stats and OR Network newsletter.
Russell, M. (2008) Using an electronic voting system to enhance learning and teaching. Engineering Education 3(2) 58-65. Higher Education Academy Engineering Subject Centre.
Wit, E. (2003) Who wants to be ... the use of a personal response system in statistics teaching. MSOR Connections 3(2) 14-20. Higher Education Academy Maths, Stats and OR Network newsletter.