13 November 2024
How does AI support graduates of the future? An update on project progress
Authors
Written by the team behind the Collaborative Enhancement Project: Exploring the opportunities that generative artificial intelligence (AI) offers for higher education.
Catching up after the summer break gave our Collaborative Enhancement Project partners the opportunity to discuss developments in their approaches to generative AI, share their emerging practices and reflect on the potential challenges and priorities for the coming year. In this blog we provide an update on some of the key issues that have been exercising our collective minds.
- Rethinking our purpose: Our last blog ended by considering the challenges that generative AI is posing to how we define our purpose and measure quality in higher education, particularly around academic writing. These debates continue to gain momentum and, in the light of the pervasiveness of generative AI functionality in standard university systems, are compelling us to rethink previously ‘taken for granted’ orthodoxies, such as how we address proof-reading and translation in academic integrity policies. Whilst seemingly trivial, changes to these practices could make a significant difference to some student groups.
- To detect or not to detect is not the right question: Debates around the use of generative AI in our institutions have inevitably focused on detection, whether assisted by detection software or not. There is still significant concern about identifying the (in)appropriate use of generative AI and dealing with it effectively. However, we are also encouraged by the fact that the focus on detection has prompted a wider engagement with good practice in assessment design. This includes a drive to explicitly define the skills, as well as the knowledge, being developed in modules and to ensure that these are effectively translated into the assessment criteria. The question becomes ‘Is this assessment authentic and does it build the skills that are required in an AI-enabled world?’
- Mandatory digital upskilling: Even with the significant developments in our digital ecosystems, universities have generally not mandated minimum levels of digital competency for either staff or students. However, with the rapid developments in generative AI, consideration is being given to the compulsory updating of digital skills for staff.
- Big tech: Global IT companies have consolidated their place in our universities’ digital ecosystems through the widespread use of cloud-based digital infrastructure that was not explicitly developed for education. Consequently, updates are being made to these technologies without due consideration of the academic cycle, which can make it challenging to embed generative AI tools into learning, teaching and assessment when features are changed, updated or withdrawn with little or no warning.
- Having conversations: All our partners have acknowledged the central role that continual dialogue is playing in understanding how students are engaging with generative AI and what their key concerns are. Students’ AI literacies are clearly wide-ranging, but many students remain extremely anxious about its (mis)use and about ‘getting it wrong’. Students are also concerned about reduced job prospects as AI becomes ubiquitous in the workplace. These concerns need to feed into our strategic approaches to generative AI.
- The moral backlash: Whilst still embryonic, a backlash is emerging among both staff and students who are choosing not to use this technology, based on ethical concerns including bias, environmental impacts, and the infringement of intellectual property.
- Creating a strategic approach: It is becoming increasingly clear to all partners that adopting an institutional position on AI more broadly, and generative AI more specifically, helps the various functions within universities to adopt (generative) AI tools with purpose, clarity, and effectiveness - although, as yet, only some partners have done so.
Next steps
Through sharing our experience, we continue to develop our toolkit to help universities take advantage of the opportunities that generative AI affords and mitigate any risks.