Facebook has come out on the defensive against the criticism being directed toward its News Feed in the post-election landscape. With reports blaming the company for the spread of misinformation on its social network, and the adverse effect that has on its users, the company had no choice but to respond.
On Thursday, Facebook CEO Mark Zuckerberg addressed the accusations directly during his talk at the Techonomy conference. “I’ve seen some of the stories around this election,” he remarked. “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way … is a pretty crazy idea.”
He continued: “There is a certain profound lack of empathy in asserting that someone voted the way they did … because they saw some fake news. If you believe that then I don’t think you have internalized the message Trump supporters in this election are trying to send.”
Even before election night, reports had surfaced claiming misinformation was rife on Facebook’s largely automated trending topics feed. Then, earlier this week, BuzzFeed uncovered a fake news empire that had infiltrated the platform’s user-led forums (known as “Groups”), tracing its origins to Macedonia. Following the election result, and Donald Trump’s ascendancy to the highest seat in the land, the criticism only grew louder.
For his part, Zuckerberg claims the issue is being blown out of proportion: “All the research that we have suggests that this isn’t really a problem. We study this, we know that it’s a very small volume of anything,” said the Facebook founder.
“Hoaxes aren’t new on Facebook. There have been hoaxes on the internet and there were hoaxes before, and we do our best to make it so that people can report that,” he added.
All the same, Adam Mosseri, vice president of product management at Facebook, released a statement on Thursday noting that “there’s so much more we need to do [at Facebook]” in order to curb the spread of erroneous information. This is particularly salient given that the Pew Research Center reports that more than 40 percent of American adults get news from Facebook.
Facebook puts the onus on its users to flag content as inappropriate. Reporting a News Feed item opens a pop-up window with several options asking you to describe the content — as spam, for example.
Over the past several months, Facebook has made a number of changes to optimize content on its News Feed, taking steps to combat what it terms clickbait, and to serve up more “newsworthy” items to its users.
Zuckerberg also attempted to turn the tables on his accusers in the media by claiming that Facebook is in fact more diverse than traditional news outlets. Dismissing the idea that the News Feed algorithm creates an echo chamber (or filter bubble) of corresponding views, he argued that Facebook’s networking system exposes users to more opinions from both sides of the political spectrum. The real problem, according to Zuckerberg, is how to get people to engage with content that does not conform to their worldview.
“The content from both sides is there but the people who use the network … tune it out. I don’t know what to do about that,” he admitted. Zuckerberg concluded by noting, “Well we have a lot of work to do. But that would have been true either way.”
Updated on 11-12-2016 by Lulu Chang: Added reports that Facebook executives admitted that there is “work to do” when it comes to combatting fake news on the social media platform.