ChatGPT Goes to College
May 14, 2023
As ChatGPT grows increasingly popular among students and educators alike, UCSD is navigating its consequences in innovative and thoughtful ways.
If you open up ChatGPT and ask it to write a fairy tale, it might give you something like this:
“The king reached the base of the volcano and saw that it was guarded by fierce dragons and molten lava. Despite the danger, the king pressed on and climbed up to the top of the mountain, where he finally laid his hands on the magical flower.”
Is this excerpt good? Does it draw you in and make you wonder about this king and his journey? Perhaps, but odds are there are better, more interesting ways to tell this story. UC San Diego lecturer Tina Hyland tasked her students with turning it into something more.
In Hyland’s ChatGPT activity for CAT 125R, a public rhetoric class, students read a fairy tale written by ChatGPT and analyzed it, looking for its triumphs and shortcomings. They then adapted the tale based on that analysis, adding depth, detail, or anything else they thought the story was lacking. Hyland wants students to examine where ChatGPT misses the mark.
“This fairy tale doesn’t do any sort of character development, it doesn’t describe anything,” Hyland said. “It’s organized very strangely. It takes away the sort of catharsis, or ‘reveal’ moments that make something pleasurable to read.”
ChatGPT can easily take a prompt and produce an answer based on the AI’s interpretation of a fairy tale plot. That is its job, after all. But without the human touches that give stories life, we’re left with an insubstantial shell of what could be a real fairy tale.
In late 2022, research laboratory OpenAI released ChatGPT, a language-model AI chatbot that can answer nearly any question you throw at it. ChatGPT isn’t just any AI: its humanesque qualities, versatility, and ability to compile information set it apart from resources that previously existed. Ever since, students, educators, and many others have been grappling with its implications. For UCSD teaching staff and cybersecurity administrators alike, ChatGPT poses real and pressing questions.
“I will say the issue of artificial intelligence or large language models has been a major concern for folks in cyber security for a number of years,” said Michael Corn, chief information security officer at UCSD.
What’s so intriguing about ChatGPT is its ability to compile information in a succinct and accessible manner. As a result, it can be easy to believe that the content ChatGPT produces is completely original and always accurate. But in reality, ChatGPT is an amalgamation of information drawn from a vast dataset — both the trustworthy and the unreliable.
Accordingly, ChatGPT poses threats to students. It can be easy to trust something that gives you information so confidently and instantaneously. But even when ChatGPT is accurate, it doesn’t always tell you where its information came from, or that it might be directly quoting an article or reading that someone else wrote. Corn sees this unsupported trust as a cause for concern.
“I worry that with ChatGPT, people use it as an oracle of fact instead of just distillation of information that’s online,” Corn said. “I think it masks the fact that many of its sources are just wrong or are opinion, and because it does such an elegant job of writing them, you’re fooled by it.”
Students might fail to challenge or work to understand the information they’re given. They might end up plagiarizing something without even knowing it.
Teachers have taken different approaches to the challenge of dealing with ChatGPT in their classrooms. Some may ignore it completely, others may ban it, and some may choose to embrace it. Hyland was excited about the opportunities that ChatGPT offers for educational settings. She even uses AI technology like ChatGPT in her own work, whether to gather sources, spark inspiration, or create a rendition of the life of one of her favorite philosophers. She finds value in the interactive potential it offers.
Educators must find a balance between adopting and limiting the use of AI technology. If they reject ChatGPT entirely, they risk ignoring new pathways to creation; if they embrace it and trust their students, they risk encouraging the use of a tool we barely understand.
Hyland chooses to embrace ChatGPT. Her use of it in the fairy tale activity was a lesson to herself and her students about what it means to be human, to write, and to tell stories. A more engaging, more human story might have described the dragon, how the king used different resources to approach the volcano, or what emotions were involved in the journey.
So at what point does a tool for learning become a platform for falsification, for plagiarism, for the absence of learning? Hyland isn’t sure there’s a clear line. Though she trains her students to write above and beyond what ChatGPT seems capable of producing, she wasn’t even sure that she could tell the difference between authentic, start-to-finish student writing and a product whipped up by ChatGPT. Her ability to detect the use of ChatGPT would depend on how students chose to manipulate and change what the AI gave them.
But Hyland posed the question: “If they edited and worked with the text so much that I couldn’t tell, it’s almost theirs, anyway, right?”
In the STEM context, the use of ChatGPT can become more complicated. Shannon Ellis, an associate professor of data science and computer science, has adapted her class in response to ChatGPT.
Ellis said that ChatGPT can pass every exam she’s ever written for an intro programming class. If Ellis were to give her students the take-home exam that she developed during the COVID-19 pandemic, they wouldn’t necessarily have to put in the work it takes to truly learn the code. She wants to ensure that students are learning, and she can’t if there is an AI available to complete a take-home exam instantaneously. She describes the changes she’s made not as a shift in philosophy, but in method.
“I would say, my philosophy has not changed at all, but the way I assess has to, as a result,” Ellis said. “What I want out of students is the same; I want students to learn.”
For her upper-division coding courses, it’s a different story. With much more complex concepts, and students who have far more coding experience, the content of her coursework surpasses some of ChatGPT’s capabilities.
“ChatGPT can get you most of the way, but you really need to understand what you’re doing to get the code all the way correct,” explained Ellis.
Ellis actually encourages her students to use ChatGPT as a tool for learning, but not as the end-all-be-all answer. If you enter code with errors into ChatGPT, it can help you understand what you did wrong and what changes you might need to make to fix it.
“If they use ChatGPT, then go back and say, ‘Oh, that’s not right,’ and then ask ChatGPT another [question] and then get to the answer, that is the learning process,” she explained.
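As a concrete (and entirely hypothetical) sketch of that back-and-forth, here is the kind of small Python bug an intro programming student might paste into a chatbot and then reason through. The function names and numbers are invented for illustration, and Python is assumed only because Ellis teaches programming and data science courses.

# Buggy version a student might submit: floor division (//) silently drops the decimal.
def average(scores):
    total = 0
    for s in scores:
        total += s
    return total // len(scores)   # bug: floor division truncates the result

# Corrected version the student might arrive at after asking why the average of 80 and 95
# comes back as 87 instead of 87.5: sum the scores, then use true division.
def average_fixed(scores):
    return sum(scores) / len(scores)

print(average([80, 95]))        # prints 87   (truncated)
print(average_fixed([80, 95]))  # prints 87.5 (correct)

The learning, in Ellis’s framing, happens in the middle step: noticing the answer is wrong, asking why, and understanding the fix rather than just pasting it in.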
ChatGPT becomes an impediment when students rely on it before they rely on their own knowledge. And if students rely too heavily on ChatGPT, they may not leave class with the knowledge that they need to succeed in their future careers.
“If you don’t learn how to learn while you’re in college, you’re gonna have to figure it out on the job,” Ellis said. “And I don’t want students going out into the job market, not feeling confident, not knowing things they should know.”
The way that ChatGPT interacts with users changes the process by which they evaluate information.
“When you listen to ChatGPT … it feels like you’re talking to a person, in which case you’re gonna treat it more like a person and be less questioning of it. And that worries me,” Corn added. “I believe that the most important thing you learn at a university is how to challenge problems, how to think through problems, how to compare different solutions to problems.”
Though it can be difficult to address the questions that arise alongside new technology and resource development, making changes to her course was not a burden for Ellis. If anything, it was just a natural part of the constantly changing and evolving world of education, particularly in the context of data science and coding.
Even from the more technical side of things, Corn, too, sees a need for further exploration of ChatGPT, its risks, and its capacity to help or hinder.
“There are some big open questions around the data handling that we need to dive into and understand,” he said.
There’s no single answer to the complexity of teaching in a world where AI can ace tests. This new tool raises the question of whether educators choose to trust their students with it. Whether teachers adapt their coursework or not, there’s never going to be a way to guarantee that students don’t use ChatGPT. So the question becomes, are they going to use it responsibly?
Hyland thinks so.
“I do trust my students to use the technology responsibly … and innovatively and interestingly, and all of the other things,” Hyland said. “I think there is cause to be excited about new technologies that help the ability to change the way that we relate to one another and the world.”
In the world of coding, Ellis also sees her students as responsible actors in their own education.
“Do I think students care about their education? I do,” she said. But Ellis also recognizes how difficult it can be for students to balance all of their responsibilities, and how that balancing act can put individual effort in tension with resources like ChatGPT. “But I also think students are stressed, and when students are stressed and busy and against a time crunch, even the most honest students can make a bad decision,” Ellis added.
The challenge of balancing the beneficial and detrimental uses of ChatGPT will persist, and it’s up to all of us to continue reimagining education in its wake.
“In five years, it’s gonna be too late to have those conversations,” Corn said.
A machine that could emulate human writing, or write perfect code, poses real questions about what it means to create, write, and educate. ChatGPT has the capacity to both build up and break down the learning process for students. It’s up to all of us to question the information we receive, and to use it innovatively but also thoughtfully. At this point, ChatGPT offers great opportunities for education, but it cannot replace learning.
“I find change to be exciting, and I like to see where things go. If I had a time machine, I wouldn’t go 100 years in the past, I’d go 100 years in the future,” Hyland said.
In 100 years, AI could be writing the most influential novels in history, or writing code that sends rockets to the moon. Change can be scary, but it doesn’t have to be a bad thing.
Photo Courtesy of Ralph Ordaz from Getty Images