06. January 2024 · Categories: courses

I’m teaching a new course this term, called Confronting the Climate Crisis. As it’s the first time I’ve taught since the emergence of the latest wave of AI chatbots, I decided the syllabus needed a policy on the use of AI tools. Here’s what I came up with:

The assignments on this course have been carefully designed to give you meaningful experiences that build your knowledge and skills, and I hope you will engage with them in that spirit. If you decide to use any AI tools, you *must* include a note explaining which tools you used and how you used them, along with a reflection on how they have affected your learning process. Without such a note, use of AI tools will be treated as an academic offence, with the same penalties as if you had asked someone else (rather than a bot) to do the work for you.

Rationale for this policy: In the last couple of years, so-called Artificial Intelligence (AI) tools have become commonplace, particularly tools that use generative AI to create text and images. The underlying technology uses complex statistical models of typical sequences of words (and elements of images), which can instantly create very plausible responses to a variety of prompts. However, these tools have no understanding of the meanings that we humans attach to words and images, and no experience of the world in which those meanings reside. The result is that they are expert at mimicking how humans express themselves, but they are often factually wrong, and their outputs reflect the biases (racial, gender, socio-economic, geographic) that are inherent in the data on which the models were trained. If you choose to use AI tools to help you create your assignments for this course, you will still be responsible for any inaccuracies and biases in the generated content.

More importantly, these AI tools raise fundamental questions about the nature of learning in higher education. Unfortunately, we have built a higher education system that places far too much emphasis on deadlines and grades, rather than on learning and reflection. In short, we have built a system that encourages students to cheat. The AI industry promotes its products as helpful tools, perhaps no different from using a calculator in math, or a word processor when writing. And there are senses in which this is true – for example, if you suffer from writer’s block, an AI tool can quickly generate an outline or a first draft to get you started. But the crucial factor in deciding when and how to use such tools is what, exactly, you are offloading onto the machine. If a tool helps you overcome some of the tedious, low-level steps so that you can move on faster to the important learning experiences, that’s great! If, on the other hand, the tool does all the work for you, so you never have to think about or reflect on the course material, you will gain very little from this course other than (perhaps) a good grade. In that sense, most of the ways you might use an AI tool in your coursework are no different from other forms of ‘cheating’: they provide a shortcut to a good grade, by skipping the learning process you would experience if you did the work yourself.

This course policy is licensed under a Creative Commons Licence CC BY-NC-SA 4.0. Feel free to use and adapt it for non-commercial purposes, as long as you credit me and share alike any adaptations you make.