Faculty saw a downturn in student engagement after ChatGPT launched in 2022.
Sanjay Madhav, an associate professor of technology and applied computing practice, used to see 10 to 15 students at his weekly office hours seeking help on some of his course’s harder coding assignments. But starting three years ago, Madhav said, only one or two students have been showing up at most, which makes him question why he commits to four hours of office hours every week.
“One of the ways I used to get to know the students very well was when they came to office hours and worked through solving the problems,” Madhav said. “It’s harder now to get to know the students if only one or two people are showing up to office hours.”
Madhav attributes the decrease in students to ChatGPT’s launch in 2022. Rather than asking him for help, Madhav said students are now asking chatbots to fix their coding errors.
From tempering student-professor communication to compelling professors to restructure their classes, the meteoric rise of ChatGPT and artificial intelligence tools has reshaped how the University and its faculty are thinking about higher education.
So much so that USC bought an annual institutional subscription to ChatGPT Edu, which launched last January and cost the University $3.1 million, according to Provost Andrew Guzman. Now, active USC students, staff and faculty can use ChatGPT’s most advanced models for free, a development that has received pushback from some professors.
In the Dornsife College of Letters, Arts and Sciences, professors such as Taly Matejka, an assistant writing professor, have started to change their classes to avoid using technology altogether. Matejka teaches “Writing and Critical Reasoning,” a mandatory intro course for USC’s writing requirement, and said she felt discouraged in 2025 because she didn’t know how to respond to students using ChatGPT in their assignments.
“It was really disheartening being in a class full of students and having all their heads down into their laptops, trying to get a conversation going and being met with silence,” Matejka said. “They’re not actually here. They’re checking the NASDAQ in class or looking at sports scores or shopping, and that sucks because some of my most formative experiences have been in classroom discussion.”
In response to these observations, Matejka and several Dornsife colleagues collaborated on redesigning methods of teaching their curricula. Matejka’s classes are now screen-free as students take notes on readings with pen and paper and get a guaranteed B if they show up and do the work.
“One of the big things that goes into [redesigning the curriculum] is trying to help them not worry so much about the grade they’re going to get, because that’s a big issue for students,” Matejka said. “If your writing seems ChatGPT-like — if it’s sparkly and polished and perfect — it’s likely not going to get the A because I’m going to suspect [AI was involved]. But, if your voice is coming through, and what I see is that you’re wrestling with something and thinking about it, you’ll get the A.”
Similar to Matejka, Helen Choi, an associate professor of technical communication practice, teaches a required advanced writing course for engineering students in the Viterbi School of Engineering. After the 2022 Thanksgiving break, Choi said she felt “disempowered” after noticing a substantial portion of her students using ChatGPT on their assignments.
Choi said she missed interacting with students on a “human level.”
“There’s also that profound sadness of being not able to do your job. My job is not to count essays; my job is to assess whether or not students have gotten anything out of the class or learned anything,” Choi said. “It’s very difficult to do that if all you have as a written artifact of their learning is not something that they produced.”
After Choi switched her class from take-home essays to in-class writing, she said her students feel a “great deal of freedom” because they no longer have to compete with students who would inevitably use ChatGPT for their essays.
One of Choi’s writing assignments was a prompt asking students about their p(doom) — a personal estimate of the probability that AI will wipe out humanity. She presented this assignment in a Center for Excellence in Teaching faculty showcase on AI in teaching in March 2025 to highlight how engineering students felt about AI’s implications for the future of humanity.
Three semesters ago, Choi said, the average p(doom) hovered at 5% or less, but now most of her engineering students put the probability of impending doom above 50%.
Madhav, who has worked in the video games industry for about two decades, said he would “rather quit than use generative AI,” but admits that he is an outlier within the applied computing faculty.
The AI doom hits close to home for Madhav. For his first assignment in his video games class, which asks students to code the simple video game “Pong,” Madhav warns his students to not use the book he wrote because the assignment’s solution is very similar to the first example in the text. But three years ago, a student came into his office hours and showed their code, which Madhav said looked like the code in his book. The student said they didn’t use the book, but instead admitted to using GitHub Copilot — a generative AI coding companion.
After a bit of digging, Madhav discovered that if students use GitHub Copilot to work on the “Pong” assignment, the exact code from his book would auto-complete the rest of the game, finishing the assignment. Madhav said he published all programs from the book onto GitHub under an open-source license that requires attribution.
“The cool thing is my book is in the Anthropic settlements,” Madhav said. “I’m hoping I get a few thousand dollars for that at least.”
Anthropic, the company behind Claude, a competitor to OpenAI’s ChatGPT, agreed in September 2025 to pay $1.5 billion to settle a copyright infringement lawsuit brought by authors, covering each of the 500,000 books included in the suit.
As a professor of clinical education at the Rossier School of Education, Artineh Samkian teaches teachers how to teach. Her students are mostly working educators who are pursuing an educational doctorate.
One thing Samkian has heard from both her students and her students’ students is that they don’t know what form of AI usage is considered cheating. Samkian said it’s an educator’s responsibility to tell students what is and isn’t proper use of generative AI for their assignments.
Rather than implementing a standardized AI policy, Samkian clarifies ethical AI usage guidelines for each assignment in her class. She decides how much AI use is appropriate by weighing how much of the thinking behind students’ work the tool would be doing for them.
For example, she assesses whether the chatbot should help with menial tasks, like creating a bibliography in APA format, or if it negatively affects students’ learning by doing their thinking for them.
When one of Samkian’s students used ChatGPT to create an image for a conceptual framework, she said she felt “torn” because the image did not show how the student viewed the relationship between different concepts. The image the student submitted was a graphic that was supposed to represent parent engagement, but the way ChatGPT produced the image was “questionable.”
“[The graphic elements] didn’t add to the ideas. It was just a lot of — as Edward Tufte would call it — ‘wasted pixels,’” Samkian said. “It was a lot of pictures and colors, and it took away from what the student should have been trying to communicate.”
Ingrid Steiner, the director of CET and one of Samkian’s previous students, consults faculty during this time of what she calls “positive disruption.” Steiner said that the rollout of ChatGPT gives educators an opportunity to rethink how they assess student learning.
“[AI] came on very quickly, kind of hot and heavy,” Steiner said. “It just appeared, and all of a sudden everybody was using it and all the schools got on board and started purchasing it.”
CET offers individual consultations to faculty, graduate students and teaching assistants across the University. The center also runs teaching institutes, including its AI in education teaching institute, a six-week series that covers using AI for teaching efficiencies and integrating AI into assignments, among other topics, Steiner said.
Steiner also clarified that CET does not take a stance on AI.
Through CET, Steiner consulted a professor who ran one of their writing prompts through ChatGPT, which produced a B-minus paper. The professor said they were concerned students would do the same and lose the expertise and critical thinking skills the assignment was meant to nurture.
After sitting down and thinking about the assignment with the professor, Steiner helped redesign the paper into a group presentation. Rather than writing an essay at home, the students would instead present information from scholarly articles, which ChatGPT might not have access to, and lead a question-and-answer period without technology.
But switching to presentations is not the only adaptation Steiner has seen. Steiner said some faculty require students to document how they used ChatGPT alongside their thought process. Other faculty, going the opposite direction, have returned to the blue book.
Steiner said there isn’t one solution to stopping students from inappropriately using ChatGPT on assignments because problems arise from specific classes, disciplines and course expectations.
“We don’t have a magic fairy wand. There’s no silver bullet,” Steiner said. “It’s very much a conversation, sitting down and saying, ‘What are your learning objectives? What do you want students to learn in 15 weeks?’ And then when they’re done, ‘How are you going to know that they’ve learned that?’ Then designing around that.”
On the University-wide ChatGPT subscription, Samkian said that USC providing AI resources to its faculty, staff and students is the University’s way of saying it believes AI will not go away, and therefore wants to let the community leverage its use.
“USC’s decision to make the ChatGPT Edu available to everyone is telegraphing to everyone in the USC community that we see the importance of AI and want to give people access to it,” Samkian said. “That is a very strong message.”
Referencing the more than $250 million deficit USC faced in the 2025 fiscal year, Choi said the University did not consult faculty before spending $3.1 million on the ChatGPT Edu subscription amid the budget crisis.
“When [USC] decided to invest in this technology, I don’t think the University really did much in terms of asking how it might impact learning, mostly because no one really does know how it impacts learning,” Choi said. “Students can be very creative and figure out ways that it can enhance their learning, but in terms of evidence-based research [that is] peer-reviewed, we don’t really know yet.”
Although Matejka has completely cut off screens from her classes, she said she sees room for ChatGPT in education because the University has to be on the cutting edge of technology. However, she also sees ChatGPT as detrimental to the University’s job of teaching students how to interpret information and be reflective.
Matejka, who is 50, said her generation of professors took for granted an education that emphasized critical thinking, an emphasis she sees being challenged in this era of “new ease.”
“I want my students to have the opportunity to have access to the education that I had, and I don’t want [ChatGPT] to rob them of that,” Matejka said. “A lot of us are just feeling like we have to step in and take back the classroom.”