Personalised Feedback
What is the Action?
To use personalised feedback in computing assignments.
Personalised feedback is a personal message that accompanies a grade or mark, as opposed to generic feedback or no feedback at all. It addresses a student’s strengths and weaknesses in a specific piece of submitted work. For example, addressing the class after an assignment and pointing out the most common issues, or sending every student the same brief message “Rename your functions for clarity”, is considered generic feedback. More personal feedback would be a comment such as: “Well done on getting your code working. However, the naming of your functions could be clearer. For example, you could rename function_1 to count_to_four.”
Quick Facts to Support this Action
- Research has shown that personalised feedback positively influences students’ persistence in computer science.
- Personalised feedback contributes to female students feeling that they belong in the field of computing.
- Email-based personalised feedback provided to top-performing students in introductory computing classes has increased intentions to persist among female students.
- Female students in science courses are more likely to act upon feedback.
- Personalised feedback in computing labs has helped reduce the gap between the lower and the higher-performing students.
- Personalised feedback in online computing courses has contributed to better student engagement, which is correlated with better performance and can therefore help target at-risk students.
Personalised Feedback in Computing Assessment Study (TU Dublin)
Ten lecturers teaching computing modules were interviewed to find out how they provide personalised feedback and whether they perceive gender differences among students receiving it. The majority of lecturers provided text-based feedback through a VLE; some also used video recordings, peer feedback, email feedback, GitHub-based feedback, automated feedback, or direct commentary in assignments.
Nearly every lecturer felt that face-to-face and/or video feedback would benefit their students the most. Lecturers found it challenging to strike a balance between available time and good-quality feedback, especially those teaching large classes. Template-based feedback was thought to be useful as long as it is appropriately personalised. According to the lecturers interviewed, female students need more attention than male students, female students tend to appreciate the attention given, and one-to-one verbal or video feedback is thought to be more useful to students than email/text-based feedback.
Approximately 800 students from the modules that these ten lecturers teach received a short survey on feedback and on their career and future study aspirations. Respondents included undergraduate and postgraduate cohorts on full-time and part-time courses, some delivered fully online. There were 80 valid responses. Some findings from the survey are below:
A selected pool of four female students was interviewed to get an in-depth understanding of what they thought of the personalised feedback provided to them and of their overall experience in computing modules, considering that they belong to an underrepresented gender group. All of the interviewees said that they were happy enough with the feedback; half said that they do not feel excluded as women, but that they had surrounded themselves with other female friends. One participant expressed the following: “Sometimes I feel like I don’t know enough and some guys in my class finish the tests in 20 minutes… I only started coding when I was 15, some people in my class started when they were like 8.” – indicating that she felt she should focus on her own tasks rather than comparing herself with her male classmates.
Email feedback provided to top-performing students (North Carolina State University)
This study – Increasing Women’s Persistence in Computer Science by Decreasing Gendered Self-Assessments of Computing Ability – used personalised emails sent to the top 50% of students by performance in an introductory computing class. After their assignment, these students were divided into three groups based on their grades: those in the top 10%, those in the top 11-25%, and the remainder in the top 26-50%. All of these students received an email from their lecturer containing personalised feedback alongside their grades. The feedback message stated which performance group the student was placed in, along with a positive note about ending up in a top-performing group. A GIF was attached to the email to reinforce the positive feedback. There was evidence that this feedback intervention positively influenced the persistence of women in computer science. A template email with sample wording of personalised feedback is available to use in the relevant section below.
Personalised emails based on student performance in online or hybrid modules (North Carolina State University)
This study – Increasing Students’ Persistence in Computer Science through a Lightweight Scalable Intervention – investigated the effects of email-based personalised feedback provided to 800 students separated into three performance groups after two major assignments during an introductory computing class. The groups were top, middle, and bottom performers, based on the results in the assignments. Data was collected from online and hybrid modules. While this research was not directly focused on female students, gender data was collected, and results showed that the intervention improved self-assessment of computing ability, as well as positively influencing students’ intentions to persist in computing fields, which could both contribute to improved retention among female students.
Students received emails from the lecturers containing contextualised feedback alongside their grades. All three types of email were phrased positively, and their content encouraged students to continue their good work or to do better, depending on their results. Low performing students were also offered support and resources to help them succeed.
Evaluation Approach
There are a number of ways to assess the impact of personalised feedback for students in computing disciplines. The most common data collection method is pre- and post-intervention surveys (see the studies by Fisk et al., Akram et al., and Voghoei et al.). Some examples and ideas on what was done, or what can be done, to demonstrate impact are listed below:
➤ Confidence & Sense of Belonging
➤ Effects of Personal Feedback
To encourage the completion of surveys, additional credit can be offered to students who complete the survey (as in the studies by Fisk et al. and Akram et al.), or completion of the survey can be made a course requirement.
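As a minimal illustration of comparing pre- and post-intervention surveys, the sketch below computes the average per-student change on a single Likert-style item (e.g. a 1-5 confidence rating). The item, the sample responses, and the function name are illustrative, not taken from any of the studies cited above.

```python
# Sketch: average per-student change on one pre/post survey item.
# Responses are paired by student; the data here is illustrative only.
from statistics import mean

def mean_change(pre: list[int], post: list[int]) -> float:
    """Average per-student change on one survey item (paired responses)."""
    assert len(pre) == len(post), "responses must be paired by student"
    return mean(b - a for a, b in zip(pre, post))

# Illustrative 1-5 confidence ratings from five students
pre_confidence = [2, 3, 3, 4, 2]
post_confidence = [3, 4, 3, 4, 3]
print(mean_change(pre_confidence, post_confidence))  # 0.6
```

A positive mean change suggests the item moved in the desired direction, though with small cohorts it is worth reporting the individual changes alongside the average.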
Next Actions to Consider
Consider using this action along with some others, such as Pair Programming or Class/Lab Dynamics.
– Would you be happy to share your success story? Please get in touch! –
Azcona, D., Hsiao, I.H. & Smeaton, A.F. 2019. Detecting students-at-risk in computer programming classes with learning analytics from students’ digital footprints. User Model User-Adap Inter 29, 759–788.
Akram B., Fisk S., Yoder S., Hunt C., Price T., Battestilli L., and Barnes T. 2022. Increasing Students’ Persistence in Computer Science through a Lightweight Scalable Intervention. In Proceedings of the 27th ACM Conference on Innovation and Technology in Computer Science Education Vol. 1 (ITiCSE ’22). Association for Computing Machinery, New York, NY, USA, 526–532.
Fisk S. R., Wingate T., Battestilli L., and Stolee K. T. 2021. Increasing Women’s Persistence in Computer Science by Decreasing Gendered Self-Assessments of Computing Ability. In Proceedings of the 26th ACM Conference on Innovation and Technology in Computer Science Education V. 1 (ITiCSE ’21). Association for Computing Machinery, New York, NY, USA, 464–470.
Iraj H., Fudge A., Faulkner M., Pardo A., and Kovanović V. 2020. Understanding students’ engagement with personalised feedback messages. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (LAK ’20). Association for Computing Machinery, New York, NY, USA, 438–447.
Voghoei, S., Tonekaboni, N. H., Yazdansepas, D., Soleymani, S., Farahani, A., & Arabnia, H. R. 2020. Personalized feedback emails: A case study on online introductory computer science courses. In Proceedings of the 2020 ACM Southeast Conference, 18–25.
Personalised Feedback Student Survey - TU Dublin
Contextualized Feedback Student Survey
Personalised Feedback - Lecturers Interview Questions - TU Dublin
Interview Questions Lecturers
1. What is your assessment strategy in modules where you give personalised feedback?
2. How do you give personalised feedback?
3. How frequently do you give personalised feedback?
4. What type of personalised feedback do you think would benefit your students the most (e.g. automated, verbal, video, or textual feedback)?
5. Why do you provide personalised feedback?
6. Is there any formal evidence that feedback benefits students in your modules, or any anecdotal feedback received from your students?
7. Did you identify any differences in terms of how personalised feedback is perceived by students of different genders? If so, what were they?
(A short verbal introduction to TechMate and the action on personalised feedback). Please take a look at the action in TechMate dealing with personalised feedback https://ascnet.ie/techmate/delivery-techniques/personalised-feedback/ (demonstrate the page of the action).
8. In terms of content, are there any useful ideas here on how you could implement personalised feedback in your classes (for example, the use of templates based on student performance)?
9. If yes, what are they?
10. If not, why not?
11. Is the guidance and the suggestions straightforward to follow?
12. Do the inline links (references) that support the guidance work for you? If not, how would you suggest they be presented instead?
13. Is the section “Evaluation approaches” useful to you as an academic implementing this action?
14. Is the section “Supporting Resources” useful? If not, why not?
15. Is there anything you would improve in the way that this action is presented, and if so, what would it be?
16. If you had more time, would you improve anything in the way you give personalised feedback? If so, what would it be and why?
Thank you very much for your time.
Personalised Feedback - Students Interview Questions - TU Dublin
Interview Questions Students
1. Please tell me about the module (and its assessment) where you received personalised feedback.
2. What is the nature of the assessment where you received personalised feedback – e.g. large project, weekly graded labs etc.?
3. What did you find helpful in terms of assessment and the feedback provided to you?
4. What did you not find helpful?
5. What would you improve in the feedback you received, particularly considering that you are a student from an underrepresented gender group in computing? Why?
6. Were there any other types of personalised feedback in other modules that you found particularly helpful? What were the differences to what you received in this module?
Thank you very much for your time.