In the days immediately following the US presidential election, a quick Google News search of the phrase “final election results” netted a sketchy top result. It was an article claiming, incorrectly, that Donald Trump had won the popular vote.
It was no isolated incident this election season. Many fake news stories became hugely popular online: The New York Times estimates one false story was shared at least 16,000 times on Twitter and over 350,000 times on Facebook. And in the last three months of the campaign, fake news stories on Facebook actually outperformed real news from providers like The New York Times, The Washington Post and NBC News, according to a BuzzFeed analysis. (The same analysis concludes that the top fake stories shared on Facebook overwhelmingly skewed pro-Trump or anti-Clinton.)
Google and Facebook have both said they’ll implement measures to slow the spread of false stories, mainly by limiting advertising for fake news publishers. But Will Oremus, a senior technology writer for Slate, says stanching the flow won’t be that simple. For companies like Facebook, the problem of fake news is part of a much larger question: What’s the role of social media in the news media?
“If you ask Facebook what the goal of their news feed is, the goal is to show users what they want to see,” he says.
And if those users love reading stories, even untrue ones, which confirm their personal political viewpoints?
“That presents a little bit of a conflict for Facebook because then you have the option to either keep pleasing people by feeding them fake and misleading stories,” Oremus says, “or to try to fulfill some sort of democratic obligation to inform the public or to challenge people's viewpoints. It's not clear that it's in their interest to do that.”
That’s because, at its heart, Facebook considers itself a technology company, Oremus explains. But not everyone agrees with Facebook’s perspective.
“If you ask a lot of people in the media, politicians — Facebook has become a dominant force in the news industry,” Oremus says. “And they see Facebook as refusing to own up to the roles that a media company usually plays, which are to inform and educate, and not just to connect people, regardless of what the content is.”
In the wake of the election, Facebook CEO Mark Zuckerberg initially downplayed the possibility that fake news influenced the results, calling it “a pretty crazy idea.”
Zuckerberg’s comments drew criticism, and reports circulated that even some executives at Facebook questioned their accuracy. Since the election, Facebook has updated its advertising policies to say the company won’t place ads from fake news sites on third-party apps or websites. It’s a first step, Oremus says, but the real fix is much harder: updating Facebook’s algorithm to stop prioritizing viral fake news in users’ news feeds.
“The algorithm is geared toward what gets clicks, what gets likes, and that lends itself to sensationalism,” he says. “What Facebook will have to do is find probably an algorithmic solution if it really wants to tackle this, and to make truth a value that it optimizes for, along with all the other things that it optimizes for in the news feed algorithm.”
In a recent Facebook post addressing the controversy, Zuckerberg writes that the company already penalizes content “we can confidently classify as misinformation” in users’ news feeds, so the stories are less likely to spread. But he admits that Facebook needs more mechanisms to identify fake stories. In the post, he cites examples of projects that are underway at the company, like making hoax reporting easier, developing better automatic detection systems for false news and putting warnings on stories that have been flagged as fake.
It’s too early to know how these projects will impact the complex problem of viral fake news on Facebook — or when they’ll go live. In the meantime, Zuckerberg’s post confirms something else: Facebook likely won’t rebrand as a media publisher anytime soon.
“We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content,” Zuckerberg writes. “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”