<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=1005154772848053&amp;ev=PageView&amp;noscript=1">

Responding to Robot Writing: There’s Only One Option for Universities

Jack Goodman


Feb 23, 2023

 

The arrival of automated essay-writing software has sent shockwaves through the global higher education sector. Academics and administrators are urgently debating how to respond to a technology that could make cheating a run-of-the-mill, free, and potentially acceptable behaviour for millions of university students.

 

Just last year Australia’s higher education regulator, TEQSA, was busy blocking access to scores of essay mills: websites that offer to write essays for students, usually for a few hundred dollars, with turnaround times of 24 hours to two weeks. That response now feels like it came from a bygone era in the face of the game-changing ChatGPT, a new AI tool that can respond to nearly any prompt by spitting out “original” text right before one’s eyes.

 

At a gathering of education leaders in Sydney in January, the tension in the room was driven entirely by AI. Across academia, from North America to Europe to Oceania, responses tend to fall into two camps. At one end of the spectrum is a group best described as “the enforcers.” This group sees some form of punishment as the only logical response to any breach of academic integrity rules. Students who break the rules must face consequences, and, if needed, universities should revert to “unhackable” assessments: face-to-face exams completed with pen and paper.

 

At the other end of the spectrum are “the accommodators” who see the inevitable rise of artificial intelligence and conclude that fighting it is pointless. Better to accept the arrival of our computer overlords and try to think about ways to collaborate with them for educationally constructive purposes.

 

But this is a false dichotomy, one that misunderstands both how we should think about ChatGPT and how we should respond to its arrival.

 

One useful way to think about technologies is to categorise them based on the type and level of intellectual activity they have sought to enable humans to “offload” from their cognitive burden. A timeline might look something like this:

Technology     | Time period | Intellectual activity offloaded
Calculator     | 1970s-80s   | Arithmetic and calculations
Spell check    | 1980s       | Spelling
Grammar check  | 1990s       | Grammar, sentence structure
Thesaurus      | 1990s       | Vocabulary
Search engines | 2000s       | Memory, discipline knowledge
AI / ChatGPT   | 2020s       | Deep cognitive work

 

When viewed through this lens, it’s clear that part of the arc of technology over the last 50 years has been a progressive increase in the complexity of the intellectual effort it has subsumed on behalf of its human masters. Simple arithmetic and spelling are often seen as chores (what grade-school child hasn’t moaned about spelling lists and flash cards, and what parent hasn’t secretly, or not so secretly, sympathised with them?).

 

But educators have known for decades - and we all understand intuitively - that having a corpus of facts in one’s head allows us to see patterns, make connections and come up with new ideas and ways of seeing the world. That’s why we all have to ask ourselves: Do we want to offload the intellectual burden of writing an essay - or even just a first draft? Writing is how we discover what we think about whatever topic we have been studying. There is nothing more fundamental about learning - and no skill more important to most knowledge-economy careers - than producing a coherent, well-argued, grammatically correct piece of writing.

 

Writing is also one of the hardest skills to learn, which is why watching ChatGPT produce text in real time is mesmerising and, for those of us who struggle to get words onto the page, jealousy-inducing. It is also deeply troubling, because the makers of ChatGPT acknowledge that the tool has no understanding of truth and is unreliable, often giving different answers to the same question. In short, there is no “intelligence” in ChatGPT. There is only imitation.

 

How should we respond as educators? Already some universities are going down the “enforcer” path: seeking to block access to ChatGPT, defining the use of any computer-generated content as a breach of academic integrity rules, and signalling that students who use it may be severely punished. While this is an understandable response, it deals only with the consequences of academic dishonesty, not its causes. Perhaps a tool will be developed that can identify AI-generated content, but a new bot will surely come along to defeat it, and the spy-versus-spy arms race will continue ad infinitum.

 

Better to start with the causes of academic dishonesty. If we can mitigate them, then students will be far less likely to turn to the dark side - whether that means copying text off the internet, paying a third party for an essay, or using an AI bot to slap 1,500 words together in a matter of seconds.

 

Why do students cheat? The late Tracey Bretag, one of Australia’s leading researchers in the field of academic integrity, identified three factors that influence academic dishonesty.

 

  1. Students whose native language is not English: international students, students from culturally and linguistically diverse backgrounds, and first-generation immigrants.

  2. Dissatisfaction with the teaching and learning environment: students who feel ignored, or who are in large-enrolment courses with little or no individual or personalised communication.

  3. The perception that there are “lots of opportunities to cheat.” In this regard, there is no greater opportunity than ChatGPT - the contract-cheating equivalent of an invisibility cloak that makes detection potentially impossible.

 

Put another way, students who feel cheated by their institution are more likely to cheat.

 

Enrolments at universities in much of the western world have grown dramatically, and perhaps peaked - a result of widening participation policies and a huge spike in international students. 

 

We now have a massified higher education system in which investment in the student experience has failed to keep pace with technology and student needs. We know this because the most reliable student experience data - such as Australia’s Quality Indicators for Learning and Teaching (QILT) - show a sector-wide shortfall in learner engagement and student satisfaction.

 

Why does an improved student learning experience - including satisfaction metrics, staff wellbeing, and critical skills scaffolding - matter so much? Because students who feel their teachers know them and care about them are far less likely to take a shortcut to pass a unit or cheat their way to a degree, especially when faced with enormous financial or societal pressure to simply pass.

 


 

It’s understandable why universities are viewing the arrival of robot writing as an existential crisis for the sector. Given the current size and scale of most of our universities, it’s quite possible that enormous numbers of students will be tempted to cross the line with a tool like ChatGPT. The need for the sector to up its investment in teaching and learning, student well-being, and belonging has existed at least since the start of QILT a decade ago. There is no more time to waste.

 

Jack Goodman founded Studiosity 20 years ago, with a vision to make the highest quality academic study support accessible to every student, regardless of their geographic or socio-economic circumstances.

 
