Facebook is relatively transparent about why you see certain advertisements. Any user can go into “Ad Preferences” and see how Facebook targets its ads. For example, Facebook thinks I will be interested in, and therefore more likely to click on, ads that concern Beyoncé, Postmates and irony. The New York Times reported this week that the ad preferences page also labels the political party to which a user is most likely to belong. For me, right under the “Lifestyle and Culture” tab is a little box that says “US Politics (Very Liberal).”
It’s not surprising that a website that knows the places I’ve lived, the pages I’ve clicked on, my age, race and more would be able to determine my interests and opinions with at least some accuracy. And while I would never complain about Facebook showing me more ads related to Beyoncé, the labeling of users as conservative or liberal has worrisome implications. By pigeonholing users on one end of the political spectrum or the other, Facebook is inadvertently increasing the partisan sentiment that characterizes much of the nation’s political discourse. When Facebook’s terms and conditions, especially those governing allowed and disallowed content and censorship, are taken into account, the combination of the two could easily pose a real threat to open, unadulterated and bipartisan political discourse.
This year, Pew Research Center conducted a survey on “news use across social media.” The researchers determined that 62 percent of adults in the U.S. use social media sites to get news. Facebook’s contribution to this growing trend is significant. In fact, 44 percent of adults use Facebook in particular as a news source. If you narrow the group to just millennials, the reliance becomes even more prominent; according to the American Press Institute, 88 percent of millennials regularly get news from Facebook.
The labeling of users as conservative or liberal in order to customize ads should be concerning. According to the Pew Research Center, “more than half of Democrats (55 percent) say the Republican Party makes them ‘afraid,’ while 49 percent of Republicans say the same about the Democratic Party.” Intense political partisanship has characterized much of the 2016 election and shows no signs of ending soon. Intense, legitimate ideological differences are understandable, even desirable. However, isolating groups from seeing the views of others only serves to heighten feelings of mistrust and animosity.
But turning away from traditional news sources doesn’t necessarily mean the death of healthy democratic deliberation. Due to the user-driven nature of social media, news that would have been looked over or deliberately censored by the mainstream media has been able to garner mainstream attention. Social media also has the power to propel lesser-known candidates to the main stage (see Bernie Sanders). Despite these obvious benefits, however, the ubiquity of Facebook makes it easier to contain individuals within narrower and narrower ideological bubbles, while also forcing the general public to trust that the often secretive company will not censor certain groups.
Accusations of bias and censorship in Facebook’s policies and algorithms have originated from every corner of the political spectrum. Matt Orfalea, whose Facebook profile picture and cover photo both prominently feature Jill Stein, claims that a video he posted of PBS censoring Jill Stein was removed from Facebook without explanation. Last week, two extremely popular libertarian Facebook groups were shut down. Facebook later restored the accounts, claiming they had been removed in error. Black Lives Matter activists have cited the temporary deletion of Korryn Gaines’s livestream as evidence of Facebook’s censorship, or even complicity in police cover-ups.
Whether these incidents have been caused by human error, technical glitches or legitimate violations of Facebook’s policies, these claims of censorship should not be taken lightly. Facebook is a private company that can subject its users to its own terms and conditions. But too many people rely on Facebook for news, political debates and information sharing for the site to arbitrarily decide which political groups it will allow and which must be shut down.
Lately, my Facebook feed has been bluer than usual. I see the same few people posting the same few articles that all espouse views I generally agree with. I see ads from The Nation, Mother Jones and ThinkProgress more often than anything else. While I may not click on a Breitbart article if one were ever to show up on my newsfeed, the ideological bubbles Facebook is forcing users into skew perceptions of the most important issues the world faces today.
Lena Melillo is a senior majoring in philosophy, politics and law and gender studies. Her column, “Pop Politics,” runs every Thursday.