The Social Media Bubble

Facebook isn’t a neutral site for sharing a variety of viewpoints; it’s more like an echo chamber.


By Jasmine Wu

As of the third quarter of 2016, Facebook had 1.79 billion monthly active users. What started as a site to connect with distant relatives and old friends has become something much more. Forty percent of the United States population now gets its news from Facebook, which acts as a convenient medium for news publications to share their articles and bring attention to current events. Yet given the proliferation of ads and fabricated news stories shared on the site to get clicks and generate revenue, should Facebook, as a for-profit corporation, have so much influence on how people access their news?

The theoretical premise of Facebook as a platform for users to exercise their First Amendment rights may be empowering for many, but it is important to take note of the company’s own incentives. Facebook generated $9.1 billion in advertising revenue in North America during the first nine months of this year; 95.2 percent of its total revenue came from advertising in 2015. The company’s dependence on ads means that it gives information about its users to advertising agencies, slicing each person into more than 50,000 unique categories. Ads are shown to people based not only on standard demographics such as age, gender, and location, but also on numerous other factors, such as whether a user is an international traveler or leans liberal in U.S. politics.

One such category that warrants special attention is “ethnic affinity,” which Facebook claims is different from race. Ethnic affinity is inferred from an individual’s interests and activities on the site. In giving advertisers the ability to target specific groups, Facebook has also given them the ability to exclude specific groups by their “ethnic affinity.” To prove that this could actually happen, ProPublica bought an ad targeting Facebook members who were house hunting and excluded those who had an “ethnic affinity” for African-American, Asian-American, or Hispanic people. A civil rights lawyer denounced this numbingly anachronistic option as a blatant violation of the Fair Housing Act of 1968. Although Facebook originally defended the practice, it later said it would no longer allow advertisers of housing, employment, and credit-related products to target ethnic groups.

Though this practice has stopped, the fact remains that ads play into people’s worst preconceptions and stereotypes about who their customers are and who’s interested in what, all in the name of making money. Ethnic targeting is still available outside those three categories, and who you are, your friendships, and your beliefs are still being extracted and divvied up for advertisers. Ads, by nature, are a service only when they match the potential buyer, but this method has reached the point of being harmful. An advertiser that wants to reach the largest possible market for a skateboard while spending as little as possible will target only boys ages 14 to 20 and may exclude a 40-year-old aunt who wants to buy her nephew a skateboard, because her demographic is too small to be worth buying into. This inflexible method, in its worst case, has the potential to devolve into discrimination and perpetuate harmful societal norms. Taking this to the political realm: if you’re a Republican, you will see only advertisements and “sponsored content” congruent with your beliefs, because you are more likely to click on them. Facebook wants to pursue its mission to make the world more connected and open, but how is that possible when its revenue is tied to something it can control: what people click on?

Ads are only one kind of content that reinforces people’s already cemented beliefs. According to a BuzzFeed News analysis, in the last three months of the presidential campaign, the top 20 fake news stories on Facebook generated more engagement—shares, likes, and comments—than the top 20 stories from real news websites. One such story came from a tweet claiming that protesters in Austin, TX, were paid to be anti-Trump. Even after the Twitter user acknowledged it was false, the original tweet had been retweeted and liked more than 5,000 times, and articles about it were shared on Facebook more than 44,000 times. Fake news stories like this one are so much stickier than the truth because people want to believe them. They want to read what they already believed, for self-verification, for continued resonance.

Instead of a site where all voices are heard, Facebook has become a site where the popular, but not necessarily true, voice is heard most. It’s where people share that Obama was born in Kenya and that Pope Francis endorsed Donald Trump; even though these stories aren’t true, they proliferate because the audience wants to believe them. Facebook has started to become an echo chamber, with both ads and fake news allowing people to embrace prejudice and reject information that challenges them. It has made it harder for people to be exposed to the world, instead showing viewers ads preselected to suit their demographics and news, fake or not, endorsed by like-minded friends. Facebook is a chilling illustration of power, one that Mark Zuckerberg himself has acknowledged can be shaped into a tool to improve society—but nothing will happen if it is left as it is, with clicks accumulating and beliefs cementing.

Jasmine Wu is a second-year majoring in philosophy and economics.