News
Did Fake News On Facebook Affect Election Results? Mark Zuckerberg Calls The Idea 'Crazy'
Ever since Election Day, the world has been having a large post-mortem discussion to figure out how we all got it so wrong. Why did most predictions and polls show Democratic nominee Hillary Clinton winning the election? What role did the media play in the rise of President-elect Donald Trump? Were newspapers to blame? Are polls even reliable anymore? And, most recently, experts have been asking: did fake news on Facebook affect election results at all?
You know the kind of articles I'm talking about: A family friend or great-aunt or former classmate shares a link to an article that is flagrantly incorrect, usually accompanied by their own outraged commentary on the news. Sometimes, you'll post a link to Snopes in reply and leave a comment saying, "Actually, Great-Aunt Cheryl, great news! An FBI agent investigating Clinton didn't get murdered." But not everyone wants to get involved in shooting down fake news on social media, and regardless of whether they intervene, the made-up report usually gets passed on anyway. It can seem like a losing battle.
While fake news on Facebook may have frustrated some of the website's more sensible users, some are now saying that its effects went further than simply being irritating. On Nov. 7, while stumping for Clinton in Michigan, President Obama expressed his exasperation with the role fake news was playing in the election. "If they just repeat attacks enough, and outright lies over and over again, as long as it's on Facebook and people can see it, as long as it's on social media, people start believing it," he said. "And it creates this dust cloud of nonsense."
A study conducted by BuzzFeed seems to back up the president's claims: BuzzFeed found that 38 percent of the news posted on large, right-wing Facebook pages and 20 percent of the news on similar left-wing pages was false or misleading. But falsity didn't seem to hurt these posts' virality: one mostly false post, BuzzFeed found, was shared nearly 14,000 times, garnered over 9,000 reactions, and drew upwards of 2,000 comments from users. (Just as a heads up, the comments on that particular post are pretty awful — avert your eyes if you've had your fill of negativity for the week.)
At a Techonomy conference on Thursday, however, Facebook CEO Mark Zuckerberg disagreed with the notion that fake news on Facebook could have influenced the election. "Voters make decisions based on their lived experience," he said, according to The Guardian. "There is a profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news."
There's no way to measure the direct effect fake news had on election results, unfortunately. Regardless, Zuckerberg is likely right: fake news alone did not create the election results we saw on Nov. 8. But the sheer amount of viral fake news that inhabits Facebook (nearly 40 percent of the output of some political pages), and the fact that fake news can appear indistinguishable from real news, makes it hard to deny that misinformation did feed into the 2016 election. Luckily, that's something Facebook is working on changing.
"We take misinformation on Facebook very seriously," Adam Mosseri, VP of product management at Facebook, told TechCrunch on Thursday. He explained that the site already attempts to reduce the spread of misinformation in both its News Feed and Trending sections, but admitted that there's still plenty of work to do. "Despite these efforts we understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation. We're committed to continuing to work on this issue and improve the experiences on our platform."
Until Facebook's misinformation detection techniques improve, those who are fed up with the "dust cloud of nonsense" on social media can keep actively commenting on fake news stories with Snopes links and populating their own feeds with articles from the mainstream media. After all, despite Trump's many claims of a dishonest mainstream media, BuzzFeed found that less than 1 percent of news from the mainstream media contained unconfirmed information — making it, just maybe, a better news source than Facebook.