AI could be the enemy of social progress
As artificial intelligence rises, we must be conscious of systemic human biases.
As artificial intelligence becomes more developed, it holds the potential to become a powerful tool for enhancing daily life. But with great power comes great responsibility. AI developers need to start considering the negative implications that AI software can have on social progress.
The term “AI Revolution” has been coined to describe the rise of AI technology, garnering comparisons to the Industrial Revolution.
“There will be a big impact on jobs and that impact could be as big as the Industrial Revolution was,” said Lord Patrick Vallance, UK minister of state for science, to the House of Commons’ Science, Innovation and Technology committee last year.
The societal implications of AI are concerning because the proliferation of AI is the capitalist’s wet dream. The New Yorker author Gideon Lewis-Kraus cautions that Silicon Valley tech companies are consolidating power “on a scale and at a pace that is both unprecedented in human history.”
AI companies rely on underpaid gig workers like data labelers, delivery drivers and content moderators. AI researchers Adrienne Williams, Milagros Miceli and Timnit Gebru wrote in an essay published in Noema Magazine, “unlike the ‘AI researchers’ paid six-figure salaries in Silicon Valley corporations, these exploited workers are often recruited out of impoverished populations and paid as little as $1.46/hour after tax.”
In this era of radical technological growth, we must beware of the over-commodification of AI putting exponentially more money into the hands of fewer members of society.
Government use of AI is also a cause for concern. Across the United States, AI software is being used in courtrooms to inform criminal sentencing. Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS, is a software program that predicts recidivism, the likelihood that a person will commit another crime.
A study led by Julia Angwin found that COMPAS is remarkably unreliable in forecasting violent crime: only 20% of the people predicted to commit violent crimes actually went on to do so.
In an interview with NPR, internet scholar and UCLA professor Safiya Noble explained how the information fed into AI creates inherent biases.
“What is used to determine these kinds of predictive AIs are things like histories of arrests in a certain zip code,” Noble said. “So if you live in a zip code that has been overpoliced historically, you are going to have overarresting. And we know that the overpolicing and the overarresting happens in Black and Latino communities.”
Data from Angwin’s study revealed that Black defendants were twice as likely as white defendants to be misclassified as a higher risk of violent recidivism, and white recidivists were misclassified as low risk 63.2% more often than Black defendants.
If courts base sentencing decisions on inaccurate, racially biased, AI-generated recidivism predictions, that is institutionalized racism. The use of these kinds of computational tools in institutional decisions poses an urgent need for regulation.
On an interpersonal level, AI products are becoming a normal part of everyday life.
One might use an AI tool to generate images for a presentation or try an AI filter that transforms ordinary photos into artistic portraits. While AI cat memes and trending AI Disney filters may seem harmless, it is important to consider that any image produced or modified by AI can reinforce problematic and noninclusive social narratives.
Popular AI-powered TikTok filters have been criticized for “whitewashing” facial features, removing same-gendered partners from photos and altering users’ bodies to look thinner.
AI-powered filters are changing how we look, and even who our partners appear to be, to conform to noninclusive narratives. Without intervention, AI has the power to teach problematic ideas about beauty, sexuality and more to impressionable audiences.
The Washington Post revealed that Stable Diffusion XL, an AI image generator, also succumbed to many tropes when prompted to create photos. When prompted to create photos of a person playing soccer, Stable Diffusion XL generated images of primarily darker-skinned male athletes. When prompted to create photos of a person cleaning, the program generated images of only women.
Inspired by The Post, I decided to personally test the DeepAI Image Generator. Without further specification, the prompt “Asian person” generated images of people only with East Asian features. Similarly, the prompt “couple” produced images of only heterosexual couples.
When AI generators cater to the appearance of dominant groups or fail to give us diverse, inclusive images, they teach stereotypes that are harmful to members of every community.
AI technology is flawed because humankind is flawed. As AI inevitably penetrates all dimensions of modern life, we must remain cognizant of its relationship to social progress. AI developers, policymakers and community members must all advocate for an ethical future with AI.