Nathan Nobis, Philosophy, Morehouse College, nathan.nobis@morehouse.edu
Spring 2023 saw the arrival of “ChatGPT” and other artificial intelligence (AI) tools that are able to produce different forms of writing in response to prompts. While ChatGPT can be used in many contexts and for many purposes, my discussion here concerns its use by students in higher education.
For a bit of factual background: ChatGPT enables students to enter typical essay-writing assignments (and other types of assignments that involve writing), along with their requirements, and the AI will quickly produce organized, informed, sometimes “thoughtful,” and even “stylish” original writing that generally meets those requirements. The AI can also be prompted to revise that product, repeatedly, in light of whatever requests for revision the user has.
Users can also manually revise the output (and then seek further feedback from the AI, if desired). A student recently wrote in the Chronicle of Higher Education that many students are using ChatGPT, far more often and in ways that few professors suspect. And there are also reports of students “feeding” ChatGPT their own previously written papers so that it will produce new writings that appear to be “written” in their own unique writing styles.
So the problem is this: ChatGPT enables students to cheat more effectively on writing assignments. It lets them manufacture submissions that are usually at least passable while engaging in few, if any, of the learning activities that writing assignments are intended to support: developing a topic, organizing information, developing a thesis, finding and organizing evidence, presenting arguments, responding to objections, thinking about how best to reach one’s audience, and so on.
Since ChatGPT enables cheating in these ways and lets students evade the educational process, here is our question: what are we going to do about ChatGPT?
Preliminaries
Let us begin with some initial observations:
In situations where calculators are used to do things that students generally could not do themselves, there is a legitimate educational reason for that: the computation being outsourced is not the skill the course aims to develop. That would not be the case with basic writing, since there are no tasks there that students are better off outsourcing to AI.
So a tool is used illegitimately in educational contexts when it is used to complete a task that the student could not complete on their own. If and when ChatGPT is used to circumvent learning activities that require students to work at developing the skills of understanding and successful communication (and much more), its use is illegitimate: if students use ChatGPT to produce writing that they could not produce on their own, given their current level of understanding and skill, that use is illegitimate. This suggests a memorable guide for when ChatGPT use is illegitimate:
Anything students can’t do, ChatGPT shouldn’t do for them.
One open question is which educational goals can legitimately be met with, or benefit from, the use of ChatGPT: there may be some, and these potentially positive uses should be identified. A concern about these potentially positive uses, however, is that they can often be met in other ways. For example, although ChatGPT could review materials for students, simplifying them in various ways, this is also a task that the instructor could do, or that students could do together in groups. ChatGPT can also review students’ self-created writings and projects to suggest improvements: again, this could be done by other students, with benefits for all the students, and/or by the instructor, with benefits for the student-teacher relationship. So just because a benefit can be achieved with ChatGPT doesn’t mean that’s the best way to seek that benefit: other routes may be equally or more beneficial. And there’s also a real concern about “slippery slopes”: perhaps students using ChatGPT to “check their work” will lead to their using ChatGPT to effectively do their work, or too much of it.
The above thoughts about when ChatGPT use is illegitimate suggest a related principle: if ChatGPT is used by someone who could create that writing product themselves, then that use may be legitimate.
I, however, cannot identify any substantive writing-related activity that almost any undergraduate student is better off outsourcing to ChatGPT: students are not better off, in terms of improved understanding and skills, if ChatGPT finds a topic for them, creates an essay outline for them, generates a thesis for them, assembles support or evidence for them, finds and responds to objections for them, and so on. Since students lack the expert-level knowledge and understanding required for legitimate ChatGPT use, their using it is typically illegitimate: they cannot discern whether ChatGPT’s outputs are of low or high quality or where any errors are. Their use could be akin to a student “using” a calculator and then saying of the final answer, “Look, I really have no idea whether this is correct or not, or why: it says what it says.” And, again, students using ChatGPT for almost any potentially legitimate purpose is an easy slippery slope to illegitimate use.
(An aside: I do not find the use of spell- and grammar-checking software, including Grammarly.com, problematic, although it may seem to be ruled out by the principles above. Some differences explain this, however. Sometimes students could check spelling and grammar themselves, so the software is like a calculator doing things the student could do on their own. Sometimes students genuinely cannot [yet?] do these things and the software can help them learn to, whereas I don’t think ChatGPT is usually going to help students become better writers. And I don’t think that finding spelling and minor grammatical errors is as “constitutive” of thinking and reasoning as the processes involved in, say, essay and presentation writing are. Another issue is that, for many students, if we required a high level of grammatical and spelling proficiency before moving to higher-level learning tasks, we might never get to those tasks, or we’d be waiting too long.)
Given these preliminaries (preliminary preliminaries, since there surely are more background issues to be engaged!), again, what are we going to do about ChatGPT?
We can begin with our course goals and assignments: which of them might students use ChatGPT to circumvent? How can “ChatGPT-proof” goals, assignments, and assessments be created? Here are some suggestions, although none is perfect and each can likely be defeated by students determined to cheat:
It is acknowledged that implementing these strategies is surely harder in larger classes, where there is less personal attention.
It must also be recognized that some of these suggestions may be challenging for some students with some disabilities. And some students would not perform as well on these types of assignments as on other, more “traditional,” forms of assessment. That, however, is true of all kinds of assignments: some students do better on some kinds of assignments than on others, so a mere change in the typical assignment format need not be unfair. The best response, as it’s always been, is to provide some variety in the types of assignments.
Concerning the specific teaching of writing, and the processes involved in effective writing, here are some suggestions, although none is perfect and each can likely be defeated by students determined to cheat:
Here are two probably bad ideas for responding to ChatGPT:
In sum, the issue here is not merely how we might reduce a new type of cheating that involves new AI tools. The issue is much more profound and fundamental, and it is this: for many reasons, our societies, and our world, need people who are able to learn about complex issues, understand them, communicate that understanding, present arguments for their perspectives, and productively engage contrary points of view. Simply put, we need educated people. And we do not need people who appear to be educated in these ways but really are not, because they cheated using ChatGPT. ChatGPT makes distinguishing these two categories of persons, of citizens, harder, and so its negative impact must be resisted, for the good of all.