Mark Zuckerberg says the notion that fake news influenced the U.S. presidential election is “a pretty crazy idea.”
The Facebook CEO finds himself in an unusual position this election cycle. Many news organizations have come under fire for their coverage of the campaign. Now Facebook is drawing criticism too, as a modern media company that does not filter fake news out of its News Feed and that, critics argue, lets users stay in information bubbles that reinforce their existing prejudices.
He says hoaxes existed before his platform was created. They aren’t new, and people who say misinformation is why Donald Trump won simply do not get it. “There’s a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news,” Zuckerberg says.
He also says his company has studied fake news and found it makes up a “very small volume” of the content on Facebook. He did not specify whether that content is more or less viral, or more impactful, than other information.
Weeks earlier, Facebook’s algorithms had accidentally promoted to “trending news” a fake story claiming Fox News anchor Megyn Kelly had pledged her support for Democratic candidate Hillary Clinton.
Zuckerberg also said his team has studied the filter-bubble effect, and the research shows that almost everyone has some friend on the other side of the aisle. A Democrat may think there’s no Republican in his News Feed, but there likely is. The social network’s ability to connect people makes it “inherently more diverse” than the major news stations of 20 years ago, Zuckerberg says.
He says the problem right now is not that diverse information isn’t there; it’s more available than it was in the days of traditional media. The problem, he says, is that people don’t click on things that don’t conform to their worldview. And, he adds, “I don’t know what to do about that.”
Facebook is not a free speech platform. It has a long list of rules, called Community Standards, of things you’re not allowed to say or share. Naked pictures of children are a well-known example.
During this election cycle, Zuckerberg personally intervened to change the rules. When Trump called for a ban on all Muslims entering the U.S., the post was clearly hate speech as defined by Facebook’s guidelines. But the CEO ordered his staff not to take it down because it was newsworthy.
That example is just one illustration of the extraordinary power Zuckerberg has amassed to decide what counts as news. And from his talk, it’s clear he is still thinking through how to wield that power.
“When we started, the north star for us was: We’re building a safe community,” Zuckerberg says. He thought about how to control for bullies. One of the things that has shifted, he says, is that now news is a more important part of Facebook content. “We’re still working through what that means.”