Facebook and Google have been under fire this week for failing to control the flow of misinformation about the 2016 election through their platforms. In the months leading up to the election, widely shared articles reported fake celebrity endorsements of now-President-elect Donald Trump, including ones claiming to be from Pope Francis and actor Denzel Washington. The Washington Post reported that, at one point, a Google search for election results surfaced an article falsely claiming that Trump won the popular vote.
These are only a few examples, brought to light by diligent news outlets. In light of the scrutiny, Facebook and Google have each addressed the issue, albeit with different approaches.
In a Facebook post on Saturday, Facebook CEO Mark Zuckerberg claimed that “more than 99 percent of what people see is authentic” and argued that it’s “extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
Others connected to Facebook disagree. BuzzFeed reported that, after seeing Zuckerberg’s comments, a group of Facebook employees covertly formed a task force that will draft and send a list of recommendations to senior management about how to move forward.
Former Facebook employee Antonio Garcia-Martinez told NPR that Zuckerberg is being “more than a little disingenuous” about the notion that Facebook could not affect an election outcome, arguing that just by reminding certain people to vote and selling political ads, the company can have a significant effect.
Zuckerberg’s denial of Facebook’s effect on this year’s election aligns with his earlier insistence that Facebook is not a news source.
In a New York Times op-ed, University of North Carolina sociology professor Zeynep Tufekci said Zuckerberg is in denial. Tufekci points to several times Facebook has used its platform to run experiments on how it can influence its users’ behavior.
It’s not clear whether Zuckerberg’s 99 percent figure is an estimate or a verified assertion. But even setting aside fake news, it’s known that every Facebook user’s content experience is unique. The social media site uses algorithms that tailor what each user sees based on their individual interests.
A Wall Street Journal interactive attempts to show this by displaying what “Liberal Facebook” and “Conservative Facebook” look like, side-by-side. The graphic demonstrates how social media can put users in echo chambers that prevent them from seeing views contrary to their own.
Some argue that it’s not just about echo chambers. NSA whistleblower Edward Snowden says that relying solely on Facebook for news is downright dangerous.
“We have one company that has the ability to reshape the way we think. I don’t think I need to describe how dangerous that is,” Snowden told Fusion on Tuesday.
Google took a more direct approach when faced with flak about fake news. The search giant announced on Monday that it would ban websites that publish fake news from using AdSense, Google’s advertising service. Facebook changed its advertising policy along the same lines hours after Google’s announcement, according to the New York Times.
Google CEO Sundar Pichai told BBC News this week that he believed the spread of fake news on popular websites could have swung the election, given the narrow margins. He also stressed that his company, as well as others, needs to be more responsible about the information it distributes.
A Pew Research Center study from earlier this year showed that 62 percent of U.S. adults get news from social media and that 44 percent get news from Facebook.
With numbers like that, it’s easy to forget that Facebook hasn’t been around forever. While social media, in general, is now one of the main drivers of news consumption, that development is only about a decade in the making. Zuckerberg founded Facebook in 2004 and made it open to the general public in 2006, just two years before President Obama was elected.
Since then, people have found ways to exploit Facebook to profit off of news. Much of the Trump-leaning fake news circulating during this election was tailor-made to make money and spread on Facebook, BuzzFeed reported.
“I like to remind myself that Facebook is a product built to keep me coming back again and again,” said Peter Lauer, social marketing manager at The New York Times, in an interview with MediaFile. “What I see on Facebook is highly tailored to make me feel good about being on Facebook, which often means showing me content that affirms my point of view—that is, showing me content that I’ll ‘like’.”
To escape the online bubble of content catered to each user, Lauer recommended getting the news straight from the source: newspapers.
“The reality is that the world doesn’t always reflect what I want it to. Newspapers remind me of that in a way an echo chamber can’t.”
Melissa Zimdars, an assistant professor of communication at Merrimack College in Massachusetts, has taken matters into her own hands by compiling a list of “False, Misleading, Clickbait-y, and/or Satirical ‘News’ Sources.” In a Google document, viewable by anyone with the link, Zimdars has compiled a growing list of fake news sites, along with a list of “tips for analyzing news sources.” Zimdars does warn that the lists “reflect the opinion of the author and are for educational purposes only.”
Technology has rapidly changed the way people consume media and will continue to do so. While online platforms like Google and Facebook try to catch up with the way people are using them, there are steps users can take to be more vigilant about how they take in news.
Zimdars’s guide is a good place to start. You can also diversify your repertoire of news outlets. Reading from outlets that publish perspectives with which you might disagree can help you better understand the discourse around an issue.
Similarly, you can escape the echo chamber by checking out what your Facebook friends whose views differ from yours are posting, since the algorithms make it less likely for you to see those posts. And rather than potentially letting your news diet be swayed by Facebook’s algorithms in the first place, you can follow your favorite sources on an RSS reader like Feedly, which presents news in the order it’s published.
These are measures you can take not just to help stop the spread of misinformation, but to take back more control of the way you consume information.
Brittany Gellerman contributed reporting.