By Stephanie Baines and Pauldy Otermans, Senior Lecturers (Education) in Psychology at Brunel University London.
The rapid advances in generative AI have changed the landscape in HE. In the Division of Psychology, our approach has been to embed the use of generative AI tools into our teaching and assessment strategy. Generative AI is not going away, and the ability to utilise these tools skilfully, and to apply a critical eye to the output they provide, will be crucial for our students in their post-university journey. To enable our students to understand the uses and limitations of generative AI, we introduced teaching sessions at all levels of our undergraduate programme. In these sessions, we introduce students to different tools and their uses, explain the pitfalls they need to be aware of, and show them how to use the tools ethically. Students are then given tasks to practise using the tools with the guidance of academic staff. To build on the skills learned in these teaching sessions, we have woven the ethical use of generative AI into our assessment strategy.
In one example, students are required to use a generative AI tool such as ChatGPT to generate a short passage of text reflecting on how their academic and employability skills have developed over the course of their first year at university. They are then required to paste the output into Word and use comments and tracked changes to demonstrate how they would improve and refine it to produce a stronger paragraph. Students are shown how to do this in a teaching session, and completing this part of the assessment allows them to practise creating effective prompts, generating outputs, and critically appraising and improving those outputs. In particular, this assessment highlights common issues with generative AI output, such as vague or inaccurate details.