OPINION: Media literacy is critical in the age of online platforms

It seems that no one can get through a year, or even a week, of college without at least one YouTube binge. These sessions start with a video sent by a friend or linked in an article, then turn into strings of “up-next” and “recommended” clicks that lead you far afield from where you began. You finally resurface, hours later, your brain feeling like a bloated sponge, wondering how you got from an innocuous Buzzfeed clip to “Why Illuminati Replaced Nicki Minaj with Cardi B – THE TRUTH.”

The answer is YouTube’s recommendation system, a closely guarded algorithm that queues up videos based on user history and machine learning. The algorithm is designed to keep the viewer watching videos and generating ad revenue for the site, and it is remarkably effective. It exploits the human desire for novelty and entertainment, recommending and autoplaying videos indefinitely until the user chooses to stop. To keep us watching, the algorithm tends toward extremity in all things, from $1,000 makeup hauls and all-fruit diets to the more dangerous realms of conspiracy theory, dubious science and fake news.

In this arena, YouTube poses a danger far greater than a few wasted hours. After the Las Vegas mass shooting in October, the algorithm was found to be promoting conspiracy theories calling the shooting a hoax. Just a few weeks ago, a video alleging that the Parkland, Fla., high school shooting was staged by “crisis actors” shot up in views through the recommendation algorithm, and the resulting spike in popularity landed it on the site’s coveted “trending” page. The algorithm seems to favor videos that peddle obviously false, conspiracy-minded news narratives, and someone exposed to enough of these ideas over time can develop a considerably distorted view of reality.

Further, the algorithm appears biased toward right-wing and far-right content. Former YouTube engineer Guillaume Chaslot conducted a statistical analysis of the algorithm just before the 2016 presidential election by seeding accounts with pro-Hillary Clinton and pro-Donald Trump searches. A combined 86 percent of recommended videos contained damaging messages about Clinton, regardless of the seeded search. This does not mean that a right-wing bias is written into the site’s code, but that over years of aggregated user data, videos with right-wing viewpoints have gained precedence over their liberal counterparts, so the algorithm tends to recommend them when all other variables are kept consistent.

Leaders of alt-right and white supremacist groups have taken full advantage of this fact, and YouTube is crawling with right-wing pundits who can act as a gateway drug for vulnerable young people into the vitriolic and racist world of the alt-right. Transcripts of conversations in alt-right chat rooms even show members coordinating to create thousands of fake accounts and downvote videos with more leftist viewpoints, further gaming the algorithm in their favor. In an interview with The Daily Beast, Southern Poverty Law Center representative Ryan Lenz said that “the alt-right is incredibly adept at navigating and otherwise exploiting loopholes in social media platforms.”

As college students develop and refine their opinions, it is important to acknowledge the power of platforms like YouTube. In their mission to maximize ad revenue, these companies can lead users down dangerous paths of radicalization, recommending increasingly extreme content and driving viewers on both sides of the political spectrum even deeper into their ideological echo chambers.

So far, YouTube’s responses to these controversies have been inconsistent and ineffective, relying on a handful of human moderators and demonetization to remove only the most offensive channels. Still, the site’s artificial intelligence brought a conspiracy video about Parkland to the trending page as recently as last month. There may be something deeply wrong with this algorithm, but because it remains incredibly effective at retaining user engagement, YouTube seems reluctant to make any substantive changes.

In a world where the average American adult spends 10 hours per day consuming media, platforms like YouTube can shape college students’ worldviews as much as any professor or class. But at the end of the day, they are businesses that traffic in our attention, not educational institutions. It is important to push for improvement at YouTube (and its parent company, Google), but consumers must also develop moderation and critical thinking in their media consumption, as difficult as that can be.