Like, Share, Lie: How Facebook Enables False and Misleading News

Illustration by Christina Carlson

In addition to partisanship, rhetoric and suspicious hair, the most recent election was notable for a massive increase in news stories spread, and generated, by social media. According to Pew Research, 63 percent of Americans get at least part of their news from social media sites such as Facebook, YouTube or Reddit. Social media has clearly become an essential platform for politicians and marketers alike to spread their messages and reach their audiences. For news organizations, this has meant dramatically restructuring how news is delivered as they compete to attract the most desirable audiences. But because platforms like Facebook make it so easy to target specific audiences, news and entertainment outlets often ignore large sections of the population that disagree with their message and therefore would not generate any ad revenue. The result is a set of echo chambers: like-minded groups are isolated from competing and critiquing voices, which allows sensationalistic, misleading or outright false news to flourish.

A core issue is that fake news attracts enormous attention. For example, a fake story in the final days of the election claiming that an anti-Trump protest had been staged generated over 350,000 shares on Facebook, gaining momentum as right-wing Facebook groups spread it further. While the original story energized the internet, the legitimate reporting offering the truth never took off. The Snopes article debunking the claim drew fewer than 6,000 views, less than 3 percent of the fake story's share count alone. This alone points to a lack of critical thinking and skepticism among social media users, who should be skeptical of anything they read from an unverified news source.

Most of the people who saw the Snopes article were not among the 350,000 who shared the original fake story. Because conservative and liberal media pages are followed overwhelmingly by people who already share their biases, a partisan story quickly gains traction within these groups without meeting any critical pushback. And since those Facebook users tend to have friends with similar political views, a false message spreads easily across an entire online community. To make matters worse, Facebook knows this and deliberately boosts such political posts on your homepage in the hope that you will click the link, generate ad revenue and share the post.

New media operations have figured this out and profited from users' gullibility. The people running this new breed of site recruit writers in foreign countries, such as Macedonia, to dream up sensationalist headlines and publish them on semi-factual websites, collecting ad revenue that amounts to a hefty sum in a developing economy. These profiteers admit they hold no particular political views; they simply recognize the economic value of manufacturing fake "news" and spreading it across like-minded groups. With no critiquing voices in the way, these stories are clicked, read and shared without any independent fact-checking or sourcing.

All of this produces an echo chamber that is nearly impermeable. As ideas spread through a community, the most vitriolic and sensationalistic ones reach an ever-larger audience. Manufactured outrage, such as posts claiming the "other side" is up in arms over a position you hold dear, is particularly damaging because it widens the divide between you and "the others." These appeals convince readers that they are either "with us or against us," leaving no middle ground for moderate voices. Over time, the result is intense polarization: groups with virtually no contact between them, though plenty of contact within each group devoted to demonizing the other.

Given the massive rise of false political messages spreading across its site, Facebook must strike a delicate balance between removing clearly untrue news and not censoring users' posts. Building an algorithm to sort through posts and check their veracity would be technically challenging, and it would be troubling if Facebook began removing personal posts, even clearly fake ones. What is certain is that Facebook's algorithmically curated trending bar is easily manipulated into promoting fake stories, raising the further concern that Facebook profits from this unethical behavior through the ad revenue it collects. As one of the largest sources of news for the current generation, Facebook needs to act like a media company with integrity and awareness, not merely a tech company. With proper research and manpower, it can reliably and consistently remove false news stories from its trending bar and reduce the ranking of stories from pages flagged as misleading. Google and Facebook have already taken some steps to cut off the ad revenue these websites generate, but more action is needed to encourage reasonable debate and moderate voices across the entire political spectrum.
