
Naomi Girson | opinions editor
Jeff Lambert, assistant director at the Center for Teaching Excellence and a philosophy professor at Duquesne, has been experimenting with AI since its introduction in 2021.
For him, AI is replacing simple skills the way GPS replaced paper maps.
“The research is showing that when we use it for a lot of things like writing out arguments or summarizing a meeting, that the mental muscles that we have to do that, the critical thinking that we have, starts to atrophy,” Lambert said.
Lambert appreciated much of what AI can bring to the table, while also recognizing what is lost when we adopt large language models to do tasks we could do ourselves.
Lambert talked about his experience with hallucinations, a phenomenon in which a large language model identifies patterns or objects that don't exist, producing nonsensical or inaccurate outputs that he said stem from a misinterpretation of data.
“I want to thicken this custard. What should I put into it? It’ll be like ‘put some glue in there,’ right? Well, yeah, glue will definitely thicken it, it’s just not going to be edible. And so it’s not thinking about it that way,” Lambert said.
Jim Purdy, the director of the Writing Center and professor of English and writing studies, does not see an end to hallucinations anytime soon.
“My sense is the way that the technology works, because it feeds on its own outputs, and it trades off of its own outputs … like it eats its own garbage,” Purdy said.
Lambert said AI can also erode academic individuality in writing through linguistic flattening, the monotonous, formulaic answers large language models give time and time again.
“It’s the idea that, as AI is trained on how to write academically, and then it’s helping people write academically, more and more things are going to look exactly the same,” Lambert said. “There’s something to be said [for] preserving your voice. And a lot of us who identify ourselves as writers think that that’s a pretty significant part of our identity.”
Purdy conducted a study examining generative AI as a writing assessment tool. He and his collaborators prompted different large language models to give feedback on academic papers, asking each model to look at the writing through a different lens: as a coach, a grader, a peer reviewer and an editor. Across every prompt and every platform they tested, the models found only about half of the errors a human editor was able to find.
He also said the different provisions they gave the large language models either helped or harmed the models' ability to give proper feedback. In most cases, providing a rubric improved the models' corrections, except in the editor role.
AI in the classroom
Purdy said he believes AI should only be used for the brainstorming process and the final edit.
“Are there things that are lost by not doing your own brainstorming and editing? Sure, but I do think it can work more as that prosthetic intelligence and collaboration at those endpoints rather than in the middle,” Purdy said.
University of Pittsburgh English professor Stephen Quigley has his students use AI to help them recognize patterns in writing, such as word choice, passive voice and sentence composition. Students will ask a large language model for 10 versions of one sentence to see how many different ways there are to say one thing.
He believes AI needs a project manager and an editor to make it worthwhile for him and his students.
“I’m less concerned about the reading and writing, I’m more concerned that people are using these technologies without requisite literacy,” Quigley said.
He said there should be an ethical framework at play, and he sees no reason to use AI in academic settings if students cannot understand the rhetoric of the outputs they are using.
“It’s easy to dismiss sometimes that the language we use, or the care that we need to use for our language is important, but I think AI shows us it’s all the more important because we’re seeing the mistakes it makes, and we want to be better people, not worse,” Lambert said.
Naomi Girson can be reached at girsonn@duq.edu
