AI: INSIDE AND OUT
Can artificial intelligence write your academic paper? It depends
A closer look at how courses are approaching AI, ChatGPT and generative tools in the classroom.
At first glance, artificial intelligence seemed to be a professor’s academic nightmare.
With ChatGPT generating college students’ essay drafts, Midjourney designing inventive visual artwork and machine learning effectively debugging code, these systems have raised concerns about job replacement, copyright and overdependence. And while concerns about the ethical use of artificial intelligence remain very real, some undergraduate curricula are reimagining learning with generative AI.
General university guidelines give professors discretion in setting course policies for AI. Students may find syllabi that prohibit the use of AI tools between mentions of Turnitin and Academic Affairs, or, in contrast, assignments that incorporate artificial intelligence into academic work and independent projects.
Preparing students for careers that integrate AI into neuroscience, biotechnology and pharmacology, instructors at the Davis School of Gerontology are incorporating machine learning into their syllabi. In the upper-division course “Physiology of Aging,” John Walsh, professor of gerontology, encourages students to develop skills in artificial intelligence programming. Students create a Wix website with algorithm-generated content on a cancer specialization, including prevention and risk factors, advanced diagnostics, treatments and clinical trials.
“My goal is [for students to] be competitive in the professional world that is completely embracing AI,” Walsh said. “People are getting really incredibly lucrative careers based on knowing how to communicate with AI and [using] the right search terms.”
At the Viterbi School of Engineering, artificial intelligence is arguably redefining computer diagnostics, advancing early diagnosis of Alzheimer’s disease. In cross-disciplinary work, researchers are training AI to detect blood markers of the neurodegenerative condition. Such innovation at the intersection of biomedical research and computer science has prompted the introduction of AI training into undergraduate education.
But artificial intelligence is still (machine) learning: USC Libraries warns students of current model limitations, noting that generated information may be false, outdated or biased. For students, relying on this sometimes unreliable computer-generated information for assignments could lead professors to flag their work as plagiarized or entirely fabricated. Such reminders reflect that artificial intelligence remains remarkable, yet imperfect.
Recognizing these flaws, Walsh requires his students to fact-check algorithm-generated content. After generating their websites, undergraduates must cross-reference credible sources to confirm the accuracy of the data. Such policies ensure students correct misinformation and acknowledge “algorithmic bias,” or the implicit bias embedded in the data used to train these systems. When left unaddressed, automated bias can perpetuate disparities in representation, exacerbating existing structural inequities.
While artificial intelligence may be to blame for computer-generated errors, students should be aware that they are ultimately responsible for work they create or endorse.
“Even though it’s AI generated … you’re still presenting it as yourself,” Walsh said. “You are now the advocate for … that tool, so you have to justify it.”
While most undergraduate schools have yet to standardize expectations for AI usage, professors are encouraged to state their individual course policies clearly.
Mike Ananny, an associate professor at the Annenberg School for Communication and Journalism, takes a reflective approach to the question of AI use in his syllabus and, more generally, to the learning environment he wants to foster with his students.
“Generative AI can be an opening to other ways of learning and other ways of critical thinking, [and this critical engagement], to me, is the whole point of a learning environment. It’s a matter of how generative AI can fuel that,” Ananny said.
Ananny said one of the biggest mistakes in the rhetoric surrounding AI is the personification of, and agency ascribed to, what is essentially a mathematical algorithm. Much of the fear and hype surrounding generative AI centers on its potential to replace human labor. But as students prepare for a workforce saturated with fast-paced technological innovation, he hopes his classroom can equip them with an important “metaskill”: an understanding of the greater politics of labor organization and technological distribution.
“I am most deeply interested in the politics of technology,” Ananny said. “Why are some systems made and other systems not made? What kind of image of people or humanity or success or public life — those big, huge humanistic concepts — end up being baked into these systems?”
In the School of Cinematic Arts’ course “AI and Creativity,” Professor Holly Willis has already seen increased awareness of the ethics of AI among her students. Using generative AI tools such as ChatGPT and Midjourney for language, still-image and motion generation, students have also taken on a reflective practice, critiquing the biases and limitations of the datasets these tools are trained on, which are sourced from existing databases that may contain racial, class, gender and geographic skews.
In this period of flux, professors face the challenge of designing course policies that balance concerns about academic integrity with opportunities to use AI reflectively and creatively in the classroom.
“The knee jerk reaction that I find most troubling is the idea that students just want to cheat, and they’ll be using these tools to get through their classes quickly and easily without doing the actual work themselves,” Willis said. “I kind of push back against that and recognize that students are often very serious about why they’re here and what they want to do as they’re learning.”
At the Marshall School of Business, guidelines mirror the sentiment that fundamentals are essential. In foundational and lower-division courses, students are not allowed to consult AI for work involving analysis, critical thinking or innovation. Such restrictions give students the chance to practice these skills independently before moving on to in-depth analysis and application.
After developing those fundamentals in lower-division courses, students can apply AI to enhance their craft in upper-division courses. Ananny said AI tools can be integrated into the “journalistic workflow” to provide different starting points for projects that would otherwise require longer, more tedious preparatory steps. Likewise, in Willis’ upper-division course, generative AI has offered students different mediums for visual storytelling.
“The myth is that it is super easy to generate something, [but] to create a body of work that shares an aesthetic and that feels continuous really takes skill and hard work and a kind of visual imagination that I think is exciting,” Willis said. “There still needs to be a kind of sensibility and attentiveness to storytelling — all the things that go into traditional forms of art making.”
Regardless of the tools students use, both Ananny and Willis emphasized the importance of the creator as the driving force in the relationship between creator, creation and the subsequent commercialization of the craft.
The integration of new technologies in USC classrooms continues to change in tandem with the broader societal impacts of AI as the University aims to foster future leaders in creative and technological fields. Across USC’s schools, carving out space in the syllabus to combine the frontier of technological innovation with humanistic inquiry aims to prepare students for ethical advancements in AI.
“This is a moment that is about the politics of technology,” Ananny said. “And if we see it as an opportunity to better understand the politics of our technologies, then this can be a really productive, exciting, positive moment.”