Educational Technology Blog

Why the way students perceive feedback is important for your classes

We know that feedback is important, and a fairly recent study (https://doi.org/10.1080/03075079.2013.855718) has broken feedback down into a series of student conceptualisations that can guide how we think about feedback and how we might change the way it is delivered to our students.

Feedback as telling

Feedback can be equated with information transference. At times a student needs to be told that they are doing something wrong (or, indeed, right). The students involved in the study described this as a passive form of feedback, experienced in the present moment, tied to the immediate task at hand, and most often associated with tests and assignments.

This style of feedback can be thought of as traditional feedback, and it is the style that teachers often need to have (that hideous term) evidenced.

Feedback as guiding

Rather than simply telling students what to do, feedback can guide them in the right direction. This framing can carry the perception that there is a single right way of completing a task, which may restrict creativity and deep thought. However, because the feedback is by nature guiding, it encourages students to think about how they can apply the advice in their future work.

Within the study it was noted that this type of feedback offers an opportunity to extend the learning environment beyond the classroom.

Feedback as a means of developing understanding

Creating exploratory feedback that guides students towards why something is wrong, as well as the fact that it is wrong, has clear advantages from a student perspective.

This is a basis for giving students information about What Went Well and What To Do Next.

Feedback to offer different perspectives

Student feedback can be grounded in information from the real world rather than restricted to the school context. By suggesting how students can improve their work using these different perspectives, there is an opportunity for students to enhance their whole world view.

This moves students away from simply having a fixed perspective, and allows them to develop in ways that they might not have thought possible.

Discussion

It should be noted that the concepts were drawn from an undergraduate group of students, and they may not be directly applicable to your context.

However, students have varying perspectives on the application of feedback. This is probably coloured by their experiences of the education system (although all participants in the study were undergraduates, suggesting they were able enough to get to university) and by the teachers who have taught them.

We should think more about feedback, and about whether we are giving students the assistance they deserve to improve their work and really get to where they should be.

 

 

New year, new ideas

As results day marks the completion of one cycle and the beginning of a new one, it is a good time to reflect on the opportunities a new school year brings.

While many of the Twitterati are attempting to use the platform less, I’m finding it, on the whole, a productive way to gain some time for CPD and to think about how my practice might improve through the ideas of others.

Differentiation regularly crosses my mind as a way to boost the prospects of students who need extra support, or who perhaps need to be stretched further. As a result, the products published by StudeApps will focus on providing differentiated lesson plans and resources to assist teachers in delivering the best quality lessons possible.

As we begin to update our App portfolio for 2019-2020, accessibility will become a focus for the same reasons. The New Computing GCSE App has a host of great content for students, and we will work to provide a great experience for all who use the platform.

Not to forget, I hope all of your students have had a successful set of results!

Onwards and upwards!

Are Terminators already here? The Today programme makes us think so…

Today the BBC fell into the common trap of unnecessary hyperbole and concern around AI.

It started by confusing robotics and AI (two vastly different subsets of computing), and ended with an interviewee asserting that Israel has some form of killing machine that can identify enemy targets without human intervention.

For a start, there was no mention of war conventions (human intervention is required in war!), and we should remember that machines that kill without human intervention have been around for years. We might call such machines “bombs” and “mines”.

When the interviewee spoke about how these killing machines could be “scaled up like Google”, we were talking less about the robot AI of the Terminator films and more about the mines scattered across warzones over the last hundred years.

The reason this matters is that coverage like this leads people to develop a fear of AI, of what is possible, and of what is going to change.

We don’t expect BBC reporters to know everything about each topic covered on the Today programme, but surely they can do better than this?

 

Are multiple choice question formats appropriate for YOUR classes?

 

An academic research paper (Palmer and Devitt, 2007) discussed multiple choice questions in a very positive light:

“Well-constructed peer reviewed multiple choice questions meet many of the educational requirements and advocate that this format be considered seriously when assessing students”

Some may doubt the validity of this paper, as it is over a decade old. However, newer studies agree that well-crafted multiple-choice tests and quizzes are of benefit to staff and students alike (Abdulghani et al., 2017).

Indeed, the ‘wrong’ answers in a multiple choice test have their own jargon (“distractors”), and even the questions themselves have their own name (the “stem”). However, what is absolutely clear is that we need to develop our questions to test higher levels of cognitive reasoning, so that they discriminate between high and low achieving students.

 

Xu, Kauer and Tupy (2016) feel there are ways to optimise the development of multiple-choice questions. These ideas are summarised below:

When formulating your tests you should consider the following improvements:

Develop questions that test higher order thinking

Discourage guessing from students, and tell students beforehand what is being tested

Learn from your exams by analysing question results (a simple analysis sketch appears at the end of this post)

Use clearly written questions that cover a range of topics

Use questions that test the taught content

Plan feedback to be immediate, or delayed depending on the context. Consider instant digital feedback using computing resources (see the sketch after this list)

Give students the chance to self-correct

Utilise students’ opinions on the tests you set

Avoid negative questions (pick the option that is NOT…) and all/none of the above questions

Use 3-choice items

Avoid hard-to-parse compound answers such as “A and B, but not C”

Choices should be similar in length

Questions should be as short as possible

Randomize answer positions

Help students to understand why cheating does not help them

Randomize question order and answer positions to minimize cheating, and change the questions each year
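
To make a few of these tips concrete, here is a minimal sketch, in Python, of what randomised question order, randomised answer positions, three-choice items and instant digital feedback might look like. The two-question bank is invented purely for illustration; this is a sketch of the ideas above, not a production quiz engine.

```python
import random

# A tiny illustrative question bank: each item has a stem, one correct
# answer and two distractors (three-choice items, as suggested above).
QUESTIONS = [
    {
        "stem": "Which data structure works on a first-in, first-out basis?",
        "correct": "Queue",
        "distractors": ["Stack", "Binary tree"],
    },
    {
        "stem": "Which logic gate outputs 1 only when both inputs are 1?",
        "correct": "AND",
        "distractors": ["OR", "NOT"],
    },
]

def run_quiz(questions):
    score = 0
    # Randomise question order so no two attempts look the same.
    for q in random.sample(questions, len(questions)):
        options = [q["correct"]] + q["distractors"]
        random.shuffle(options)  # randomise answer positions
        print(q["stem"])
        for label, option in zip("ABC", options):
            print(f"  {label}) {option}")
        choice = input("Your answer (A/B/C): ").strip().upper()
        picked = dict(zip("ABC", options)).get(choice)
        # Instant feedback on every question.
        if picked == q["correct"]:
            print("Correct!\n")
            score += 1
        else:
            print(f"Not quite - the answer was {q['correct']}.\n")
    print(f"You scored {score}/{len(questions)}")

if __name__ == "__main__":
    run_quiz(QUESTIONS)
```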

So using multiple choice questions is a good idea, but only if we do them well. We need to think about how we will develop them to be beneficial for both students and teachers.
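
The tip about learning from your exams can also be made concrete. The sketch below, again in Python and using an invented grid of marked results (each row a student, each column a question, 1 for correct and 0 for incorrect), computes a facility index (the proportion of the cohort answering each question correctly) and a rough discrimination index (how much better the top third of the cohort did on a question than the bottom third). Questions with a facility close to 0 or 1, or a discrimination near zero, are candidates for a rewrite next year.

```python
# Invented results for six students on a four-question test:
# 1 = correct, 0 = incorrect.
RESULTS = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]

def analyse(results):
    n_students = len(results)
    n_questions = len(results[0])
    totals = [sum(row) for row in results]
    # Rank students by total score, then compare the top and bottom thirds.
    ranked = [row for _, row in sorted(zip(totals, results), reverse=True)]
    third = max(1, n_students // 3)
    top, bottom = ranked[:third], ranked[-third:]
    for q in range(n_questions):
        # Facility: proportion of the whole cohort answering correctly.
        facility = sum(row[q] for row in results) / n_students
        # Discrimination: how much better the strongest students did.
        discrimination = (sum(row[q] for row in top)
                          - sum(row[q] for row in bottom)) / third
        print(f"Q{q + 1}: facility {facility:.2f}, discrimination {discrimination:.2f}")

analyse(RESULTS)
```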

References

Abdulghani, H., Irshad, M., Haque, S., Ahmad, T., Sattar, K. and Khalil, M. (2017). Effectiveness of longitudinal faculty development programs on MCQs items writing skills: A follow-up study. PLOS ONE, 12(10), p.e0185895.

Palmer, E. and Devitt, P. (2007). Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper. BMC Medical Education, 7(1).

Xu, X., Kauer, S. and Tupy, S. (2016). Multiple-choice questions: Tips for optimizing assessment in-seat and online. Scholarship of Teaching and Learning in Psychology, 2(2), pp.147-158.

 

 
