

Of course fake news on Facebook is a real problem

REUTERS/Dado Ruvic

At some point during the numbingly long election cycle we’ve just endured, you too may have found yourself listening to someone rant with a zealot’s absolute certainty about some clearly outrageous piece of news, and asking (maybe only to yourself): “Where in hell did you get that crap?”

Fans of traditional news reporting, the kind filtered through newspapers and websites where editors red-line out what isn’t provable or is absurd on its face, may not have given much credence to “news” heavily traded on Facebook and promoted by Google’s algorithms. But others did. Big time.

Facebook in particular is now feeling enormous pressure to use its other-worldly assets to correct what critics are calling a profound, democracy-eroding flaw in its business model: a model that rather disingenuously says Facebook, with a billion users, is merely a platform, a humble vehicle for whatever you and I want to talk about, with no responsibility for accuracy. That position is under serious attack from serious people who take facts and accuracy … seriously. People who are startled and horrified by the popularity of stories like these:


“George Soros: ‘I’m Going to Bring Down the U.S. by Funding Black Hate Groups’”

“BREAKING: Pope Francis Just Backed Trump. Released Incredible Statement Why — SPREAD THIS EVERYWHERE”

“Germany Folds to Sharia Law, Approves Child Marriages”

“No Liberals … Hillary DID NOT Win the Popular Vote, Stop With the Petitions”

Oh, and this one, too:

“IS IT TRUE!?! Wikileaks [sic] Reveals EPIC Video — Bill Clinton Having S*x with… ‘MEGAN KELLY’ [sic]” — Free Patriot Post

Each of those stunners “went viral” on Facebook over the course of the election, shared hundreds of thousands of times and likely viewed millions of times more. Another, a story about the murder-suicide of an FBI agent, supposedly a retaliation hit arranged by the Clintons, was “reported” by something called The Denver Guardian and was shared 480,000 times in a week.

By (stark) contrast, The New York Times’ story about Donald Trump claiming a nearly billion-dollar loss, and the tax avoidance it made possible, was shared on Facebook 175,000 times in a month. Those numbers come from a BuzzFeed investigation that also traced over a hundred completely bogus websites like the “Denver Guardian” to … a bunch of teenagers in Macedonia who figured out how to game Facebook’s ad system, which rewards the number of times stories are shared, for $3,000 a month in easy cash, according to one of the kids.

Reaction to Facebook CEO Mark Zuckerberg’s initial response that it is “a crazy idea” that fake news on his site could have a significant impact on the election has been, well, intense. As recently as this Tuesday, New York Times contributing opinion writer ​Zeynep Tufekci​, associate professor at the University of North Carolina School of Information and Library Science, was explaining to readers (and Zuckerberg) how influential Facebook really is and how it accelerates confirmation bias:

In 2012, Facebook researchers … secretly tweaked the newsfeed for an experiment: Some people were shown slightly more positive posts, while others were shown slightly more negative posts. Those shown more upbeat posts in turn posted significantly more of their own upbeat posts; those shown more downbeat posts responded in kind. Decades of other research concurs that people are influenced by their peers and social networks.


The problem with Facebook’s influence on political discourse is not limited to the dissemination of fake news. It’s also about echo chambers. The company’s algorithm chooses which updates appear higher up in users’ newsfeeds and which are buried. Humans already tend to cluster among like-minded people and seek news that confirms their biases. Facebook’s research shows that the company’s algorithm encourages this by somewhat prioritizing updates that users find comforting.

I’ve seen this firsthand. While many of my Facebook friends in the United States lean Democratic, I do have friends who voted for Mr. Trump. But I had to go hunting for their posts because Facebook’s algorithm almost never showed them to me; for whatever reason the algorithm wrongly assumed that I wasn’t interested in their views.
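The echo-chamber dynamic Tufekci describes, an algorithm quietly prioritizing posts that agree with a user's inferred lean, can be reduced to a toy sketch. What follows is purely illustrative; Facebook's actual ranking system is proprietary and vastly more complex, and every name and data point here is invented for the example:

```python
# Toy illustration of confirmation-bias amplification in a ranked feed
# (NOT Facebook's actual algorithm): simply sorting "comforting" posts
# to the top is enough to bury dissenting views below the fold.

def rank_feed(posts, user_lean):
    """Return posts reordered so those matching the user's inferred lean
    come first.

    posts: list of (headline, lean) tuples, lean in {"left", "right"}.
    user_lean: the political lean the system has inferred for this user.
    """
    # False sorts before True, so posts that match the user's lean rise
    # to the top; the sort is stable, so ties keep their original order.
    return sorted(posts, key=lambda post: post[1] != user_lean)

posts = [
    ("Story favoring candidate A", "left"),
    ("Story favoring candidate B", "right"),
    ("Another story favoring candidate A", "left"),
]

# A user the system has pegged as leaning "left" sees agreeing stories
# first; the dissenting post sinks to the bottom of the feed.
feed = rank_feed(posts, "left")
```

Note that nothing here hides dissenting posts outright; mere reordering produces the effect Tufekci reports of having to go hunting for friends' posts the algorithm assumed she wasn't interested in.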

Google is also under criticism for the way its algorithms fail to detect and “demote” fake news in its search results. In the past several days, both companies have made moves to prevent people like the kids in Macedonia from making money off their mischief. But Facebook management, to the consternation of some employees, is still dragging its heels on applying journalistic protocols to what it disseminates.

One line of thinking is that Zuckerberg is still wary after feeling the wrath of conservatives, who reacted loudly to news last summer that a handful of employees on its trending news desk did actually suppress factually unmoored stories, nearly all of which promoted Trump in some way. (Related: Twitter has now deleted the accounts of several so-called alt-right white nationalist groups.)

Given Facebook’s astonishing resources ($364 billion in market value, having recently surpassed Berkshire Hathaway and General Electric), tech experts aren’t buying that a software/coding fix is beyond its reach.

For Quartz, Josh Horowitz lists several ways Facebook could bridle the bull, as some of the company’s employees have described it:

Devise and list a thorough procedure for identifying and managing misinformation.

This is perhaps the most important step Facebook can take, and its biggest failure to date. Facebook remains a black box in regard to how its algorithm prioritizes not just news or memes, but nearly everything that’s shared in its main feed. With regard to truthfulness, however, it’s especially lacking. The social network has entire pages devoted to how it deals with harassment and hate speech, and a transparent way for users to report these things. It also has a page where it publishes the number of requests it has received from governments looking to obtain information about its users.

Facebook’s activities in both these areas have been criticized, but at least they exist. There is no comparable, detailed explanation of how it deals with fake information. Facebook might also consider allowing third parties to occasionally review its algorithms and procedures for how effectively they vet hoaxes (or hate speech, or pornography), and then have them release reports on how well they live up to the standards they set for themselves.

Algorithms are not Jane Kirtley’s specialty, but ethics in journalism are. Kirtley, the Silha Professor of Media Ethics and Law at the School of Journalism and Mass Communications at the U of M, and her colleague Chris Ison are astute watchdogs for the ways standards, practices and ethics can slide south.

Both say the topic of fake news traded through social media is trending in their classes. Kirtley says her students have “talked a lot about this ‘journalism of affirmation’ ” and how organizations like Facebook, “businesses that are not really journalism but are now very much in the journalism business,” are influencing our culture. “I get a little impatient with people like Zuckerberg saying this sort of thing ‘couldn’t possibly have mattered.’ Of course it matters.”

A fundamental problem, she said, is that for all the complaints consumers have about timidity in traditional journalism, non-traditional journalism platforms — companies like Facebook (with no training or experience in journalistic checks and balances) — will “always default to the path of least resistance.” Which is to say: avoid conflict wherever and however possible.

Playing “arbiter of truth” and culling out flagrant nonsense is always fraught with peril. Traditional journalists are trained to expect it and deal with it. Purely commercial enterprises don’t have that impulse. “Facebook can only engage so much in this denial of what they are and what they are doing,” said Kirtley. “They have to put editorial practices and ethics in place.”

She adds, “The ease with which so many people use something like Facebook brings a tremendous responsibility for ethical controls. And ​not from the government. I am firmly opposed to that. But from within the companies themselves.”

Which isn’t to say monolithic operations like Facebook are the only ones who need to re-examine their responsibilities in an age of viral nonsense. “Everyone criticizes Trump for having a short attention span,” she says. “But sadly that also describes so many people in the country as a whole.”

Ison, who won a Pulitzer Prize for investigative reporting when he was at the Star Tribune, says: “People reading crap is not new. Ten years ago, people were starting to realize the amount of junk that was out there and how easily it was moving around. The burden is on the reader to figure out if the sources are credible.”

Which is well and good, assuming the reader cares more that what they are reading and sharing is true than whether it simply supports what they want to believe.

“Fake news is the biggest problem facing journalism right now,” says Ison. “I really believe that. And it’s been true for a while now.” (He mentions Bill Kovach and Tom Rosenstiel’s book, “Blur: How to Know What’s True in the Age of Information Overload,” as a valuable resource for anyone interested in the problem.)

“The thing is, low-quality news can be a very subtle thing,” Ison said. “Some of the stuff we’ve seen is obviously crazy. But everyone in the news business knows there are thousand ways to shade a story to leave a particular impression. I don’t see how technical tweaks to algorithms fix that. Maybe they can. I don’t know. But there are always going to be a million ways to misinform people. The public has to learn how to tell the difference between what’s real and what isn’t.”

Comments (2)

  1. Submitted by Ray Schoch on 11/17/2016 - 12:10 pm.

    I don’t

    …hold out a lot of hope for significant change, at least in the near future. Mr. Zuckerberg’s denials of Facebook’s influence are essentially Trumpian in their self-servingness. On the one hand, ads are being sold on the basis of the number of people who will see them online, when the sole purpose of the ad is to influence the person who sees it, and on the other hand, Zuckerberg would have us believe that what appears on Facebook “couldn’t possibly” have an effect on an election, when that’s the whole point of Facebook’s business model in the first place.

    He can’t have it both ways, and neither can the public. I grew up hearing “Don’t believe everything you read in the papers,” and it seems painfully obvious that that cautionary suggestion needs an online corollary. Just because it shows up on your Facebook feed doesn’t mean it’s true. For example, despite a headline to this effect on my feed yesterday, it seems that Jennifer Aniston is not dead.

  2. Submitted by Todd Adler on 11/17/2016 - 12:34 pm.

    Information Junk-ie

    In the months leading up to the elections, I spent a fair amount of time engaging people on Facebook about voter fraud issues. The narrative ranged anywhere from lost ballots to rigged machines and felons voting in droves.

    With each person, I asked whether they had a credible news source I could reference to check out their claims. For the people who bothered to respond with anything other than vitriol (how dare I challenge their belief system!), the articles they provided were invariably spurious. The best sources would reference a story of, say, 1,129 felons who voted. Which is not to say that they did vote, but that they may have voted, and the allegations haven’t been substantiated.

    Of course the reader doesn’t make that distinction and gets offended when I point it out.

    The worst ones were blog posts posing as fact or “news” stories that are long on innuendo and short on verifiable facts. One guy claimed that “millions of illegals voted in the last election” and then quietly slunk away when I asked him for a credible source for the claim.

    The disinformation has gotten so bad that I had to sit down and chat with several people about election fraud at the polling place. I took one lady outside and spent a good twenty minutes answering her questions about poll procedures and why fraud in Minnesota is highly unlikely. We went through the whole litany of rigged polling machines, swapped ballots, compromised judges, dead people voting, and on and on.

    None of her scenarios was in the least plausible to someone who knows even a little bit about how the process works. But when people spend their days going from one outrageous headline to the next and never take ten minutes to look into the details, it’s no wonder they come away with these misconceptions.

    Maybe there’s still hope for humanity, but I won’t be putting any money on that stock.
