How do HE providers manage the fast-paced developments in AI?

As I start to write this blog, I am resisting the temptation to use an Artificial Intelligence (AI) platform such as ChatGPT to produce it for me in seconds. I am sure I am not the only one, and the temptation for our students to use it to produce coursework and assignment submissions is certainly huge, particularly when under pressure from deadlines. Although AI technology has been around for a while, recent developments seem to be advancing at an unprecedented pace.

This technology is here to stay, and as education providers we have a responsibility to embrace this change in how we operate, or we risk education losing its credibility. I am sure maths teachers had similar concerns when calculators were first introduced; now we expect students to use them in lessons. History demonstrates that change is constant and that we adapt accordingly.

How do we do this? At Leeds College of Building, we have started by looking at three key areas to address both the advantages and disadvantages of AI software in education and the workplace. A working group is being formed to implement and disseminate information and training for students, lecturers, managers, and support staff.


Development of AI policies for staff and students

We have started work on two policies, one for staff and one for students. It is important to have clear guidance on the benefits, limitations and appropriate, ethical use of AI. The student policy has a particular focus on academic misconduct and plagiarism, protecting academic integrity and setting out the consequences of misuse and the appeals process.

Training for lecturers, management and support staff in the use of AI

AI is already a very useful tool for creating all kinds of lesson plans, resources and assessment material, saving time and improving efficiency. At present, however, it is not perfect, and staff need training on its capabilities, responsible use, and the risk of errors and misinformation. The aim is to embed AI as a useful teaching tool benefitting both staff and students.

Student research and course work submissions

This, in my opinion, is the most difficult area to address. We know students are already using AI software to help produce the coursework they submit. When does using AI as a convenient, useful research tool cross over into plagiarism? I would suggest there is a fine line. How do we develop new ways of assessing students that maintain academic integrity and credibility? Plagiarism-checking tools with AI identification, which I imagine most higher education institutions have, go some way towards identifying AI-generated work, but is this reliable enough? Assessment methods need to be developed which test students' knowledge and skills along with their analytical and critical thinking. It would be great if good practice and ideas were shared through Go Higher West Yorkshire's networks.

To conclude, as educators we ignore AI at our peril. If we can incorporate AI into the way we work, develop resources, deliver content, and assess student work and competence in an ethical and responsible way, it has the potential to reduce staff workload and increase efficiency while enhancing the student experience and preparing students for modern industry working environments.

Government policy may also influence how we use AI in the future.

If you haven’t tried AI, set up a free account with ChatGPT and ask it anything – it will change your working life. You might also look at the imminent release of Microsoft 365 Copilot.

I promise I didn’t use AI, as you can probably tell.

Chris Tunningley, Assistant Principal for Adult & Higher Education, Leeds College of Building