Evaluation

Evaluation of the Toolkit


Pilot evaluation of TechMate took place as part of the personalised feedback study at TU Dublin.


In this study, lecturers in the computing department were interviewed, and their views on the usability and usefulness of the personalised feedback action were collected. Following this pilot, TechMate is being updated, and a formal evaluation will follow. It is proposed that this evaluation will assess nine usability and usefulness attributes in a facilitated session, using a think-aloud method for user tasks together with a semi-structured interview.

Evaluation of Actions in the Toolkit


Different actions may need to be measured in different ways to show impact. For instance, many policy-related actions rely on measuring enrolment or retention data.


Indeed, when creating an all-female course or recruiting new female faculty staff, the most straightforward way to see whether the action worked is to compare recruitment and retention numbers before and after implementation.
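A before/after comparison of this kind is a simple calculation. The sketch below uses entirely hypothetical cohort figures (the numbers and the `retention_rate` helper are illustrative, not drawn from the study) to show the form such a comparison might take:

```python
# Illustrative sketch with hypothetical figures: comparing retention rates
# before and after an intervention such as recruiting new female faculty.

def retention_rate(enrolled, completed):
    """Fraction of enrolled students who completed the year."""
    return completed / enrolled

# Hypothetical cohort numbers before and after implementation.
before = retention_rate(enrolled=120, completed=84)    # 70% retention
after = retention_rate(enrolled=130, completed=104)    # 80% retention

change = after - before
print(f"Retention before: {before:.0%}, after: {after:.0%}, "
      f"change: {change:+.0%}")
```

In practice the same comparison is usually done in a spreadsheet; the point is only that the evidence is a numeric difference between two cohorts, not student feedback.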

Actions such as one-off outreach events, workshops, and industry visits can use feedback as an evaluation approach. Measures such as perception of the field and interest in computing can be assessed before and after the event. Alternatively, feedback on enjoyment of, or satisfaction with, an event can be collected.
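The pre/post assessment described above amounts to comparing the same measure at two points in time. A minimal sketch, assuming a hypothetical 1-to-5 Likert rating of "interest in computing" collected before and after an outreach event (the scores below are invented for illustration):

```python
# Illustrative sketch with hypothetical data: pre/post-event feedback on a
# measure such as "interest in computing", rated on a 1-5 Likert scale.
from statistics import mean

pre_scores = [2, 3, 3, 4, 2, 3]    # ratings collected before the event
post_scores = [4, 4, 3, 5, 3, 4]   # ratings collected after the event

shift = mean(post_scores) - mean(pre_scores)
print(f"Mean interest shifted by {shift:+.2f} points on a 1-5 scale")
```

A positive shift suggests the event moved the measure in the intended direction, though with small groups the change should be interpreted cautiously.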

Actions which take place in the classroom, such as pair programming or changing the class or lab dynamics, can use the same evaluation approaches as actions that take place during the academic term outside of class. The latter include buddy systems or personalised emails from lecturers with feedback on assessment. If a short-term impact is required, these actions can be evaluated with a sense-of-belonging survey or with feedback on one or more of the measures listed above.

Actions such as advertising campaigns or amendments to marketing materials would benefit from examining the statistics afterwards, such as enrolment or recruitment figures, rather than asking students for feedback, opinions, or perceptions.

Our current work-in-progress suggestion is to select actions for evaluation where evidence of impact is possible in the relatively short term, that is, where the impact can be seen soon after implementation. These actions could include many in-class initiatives, as well as some mentoring programmes.