Key Considerations for Senior Leaders When Supporting Students with the Use of Generative AI
Generative AI (genAI) is increasingly embedded in everyday tools, raising awareness and driving use among students. Many students today use genAI in some form, and that use continues to evolve.
So it comes as no surprise that the use of genAI within education continues to grow in popularity, meaning trying to avoid or ignore it is no longer feasible - nor, importantly, what students want.
Students are looking to their institutions to embrace genAI and support them on their journey through education to employment in an AI-enabled world.
Initial concerns
When conversations around genAI really took off in 2022 with the launch of ChatGPT, the use of genAI tools in education was understandably met with some trepidation by many senior leaders and practitioners, with concerns centred mainly on opportunities for misuse and the impact on academic integrity and assessment.
Some of the initial unease around genAI stemmed from a lack of understanding of the technology's capabilities - a fear of the unknown. To combat this, Jisc introduced the generative AI primer, providing support and regular updates to leaders and practitioners as they navigate the challenges and opportunities of genAI at their institutions.
We’ve also produced a guide to essential resources to help cut through the noise and to support our members as they develop ethical genAI strategy.
As the UK’s digital, data and technology agency for tertiary education, Jisc supports our members in their accelerated adoption of generative artificial intelligence in a responsible way through the use of pilots, advice and guidance, events and dedicated AI communities of practice.
Generative AI at your institution
GenAI continues to evolve rapidly as the technology becomes more advanced and our understanding of how to get the best from these tools grows. When it comes to genAI adoption in education, we know there are many factors - including IT system procurement, staff/learner IT skills and pedagogy - that determine how and when an institution can use genAI effectively in teaching and learning.
At Jisc, we’ve produced an AI maturity model, a tool to frame conversations around an institution’s journey in adopting genAI. It focuses on five key stages of genAI adoption: approaching and understanding, experimenting and exploring, operational, embedded, and optimised/transformed.
Most institutions are at either the ‘approaching and understanding’ or ‘experimenting and exploring’ phase of implementing genAI, which makes this an ideal time to start developing responsible processes and guidance for how you want users to interact with genAI.
As these tools proliferate, and guidance for ethical and responsible use becomes increasingly embedded, opportunities to harness genAI to improve teaching and learning will only increase.
Student expectations of using genAI
In spring 2023 Jisc ran our first series of student discussion forums to gain greater insight into student/learner perceptions of using genAI across UK further and higher education. The outcomes provided valuable insight into areas including learner usage, key concerns and expectations around the role genAI could play in the educational experience.
The findings showed that students were leveraging genAI in various areas to enhance both their academic and personal lives, and they believed embracing genAI empowered them to contribute meaningfully to an AI-driven world.
Students also had clear expectations of what they needed from their institutions to help them use genAI effectively, with the emphasis on advice and guidance around the skills they need and the tools they should use. Consultation was also a key consideration as students were keen to be involved in decision making around the use of genAI in their own education, asking for open dialogue and collaboration on the subject.
We also asked why students were attracted to using genAI in education, and found that what appealed most was that genAI doesn’t just give you answers: it simplifies and expands content, can change voice and tone, and can discuss material so it is explained in the way that best suits the user. And, importantly, genAI is always available.
We have just launched the next ‘student perceptions of generative AI’ report, this time having spoken to over 200 students in colleges and universities across the UK, and it is clear that student expectations around their use of genAI have matured.
They now express a desire for genAI to be embedded across their education, and expect competent usage by staff.
There is a stronger call for clear, consistent policies on genAI use, emphasising the need for guidance on plagiarism, copyright, and equitable treatment. Concerns for accessibility and the equitable provision of genAI tools have increased, highlighting growing disadvantage among students based on financial ability.
Students are also more aware of the biases perpetuated by genAI and want to feel confident these are not being reproduced within their institutions.
Overall, students want a safe, responsible, accessible and adaptive genAI experience that’s uniformly applied across their education.
Questions for senior leaders
One of the most important questions senior education leaders can ask today is: what are we doing to ensure students are prepared for the genAI-enabled world they will be working in?
New technology always leads to new types of jobs, and genAI is no different. It is estimated that 10-30% of all jobs could be automated by AI. This is a daunting figure, but new career options are regularly emerging as employers realise the opportunities available to them from new technologies.
Wider than this, though, most jobs will likely involve some element of genAI skill in the future, with AI literacy already high on the list for many interviewers alongside essential human skills such as critical thinking, adaptability and responsible decision-making – the skills that set us apart from the machines.
When considering this, it is clear that supporting students to use genAI ethically, as well as improving equity and accessibility to reduce the risk of digital inequality, should be as high a priority for leaders and practitioners as it is for students.
There is no one-size-fits-all blueprint that guarantees successful development of genAI literacy, improves equity and prevents overreliance so that students are prepared for the genAI-enabled workplace - but there are steps that can be taken by all institutions as they navigate their genAI journey.
The first is to ensure the publication of relevant, unambiguous policies and guidance that best fit the individual institution and support the responsible and ethical use of genAI for staff and students. For higher education, the Russell Group has developed a set of principles for the use of AI in education which Jisc supports.
The next step is to make sure essential human skills such as critical thinking, creativity and key employability skills (such as communication and decision making) are fully embedded into the curriculum, as these are what make candidates stand out from the crowd.
Finally, it is essential to develop an institution-wide genAI strategy that emphasises equipping students with the genAI skills they need to succeed in their chosen profession. For this we recommend taking a whole-institution approach using the AI maturity model described above.
By involving staff and students in the operational process, diverse needs and perspectives can be addressed, fostering an environment of collaborative learning and adaptation. This holistic approach helps in navigating the ethical and practical challenges associated with the deployment of genAI in an educational setting.
About the author
Sue Attewell is Head of AI and Co-Design at Jisc, where she co-leads the organisation’s artificial intelligence activity. Jisc focuses on supporting its members to adopt AI responsibly, providing thought leadership, practical advice, guidance and training alongside piloting relevant AI products.
Before joining Jisc, Sue led skills development initiatives as Head of Skills at the West of England LEP in collaboration with employers. Her previous work with Jisc has included leading their edtech and apprenticeship delivery.
Find out how you can sign up for Jisc’s communities of practice here.