Of course fake news on Facebook is a real problem

Among other problems, nontraditional journalism platforms — companies like Facebook — will “always default to the path of least resistance,” says U of M professor Jane Kirtley. 

At some point over the numbingly long election cycle we’ve just endured, you too may have found yourself listening to someone rant on with a zealot’s absolute certainty about some clearly outrageous piece of news and asking (maybe only to yourself): “Where in hell did you get that crap?”

Fans of traditional news reporting (information from newspapers and websites with editors red-lining what isn’t provable and what is absurd on its face) may not have given as much credence to “news” heavily traded on Facebook and promoted by Google’s algorithms. But others did. Big time.

Facebook in particular is now feeling enormous pressure to use its other-worldly assets to correct what critics are calling a profound, democracy-eroding flaw in its business model: a model that rather disingenuously casts Facebook, with more than a billion users, as merely a platform, a humble vehicle for whatever you and I want to talk about, with no responsibility for accuracy. That position is under serious attack from serious people who take facts and accuracy … seriously. People who are startled and horrified by the popularity of stories like these:

“FBI AGENT SUSPECTED IN HILLARY EMAIL LEAKS FOUND DEAD IN APPARENT MURDER-SUICIDE”

“George Soros: ‘I’m Going to Bring Down the U.S. by Funding Black Hate Groups’”

“BREAKING: Pope Francis Just Backed Trump. Released Incredible Statement Why — SPREAD THIS EVERYWHERE”

“Germany Folds to Sharia Law, Approves Child Marriages”

“No Liberals … Hillary DID NOT Win the Popular Vote, Stop With the Petitions”

Oh, and this one, too:

“IS IT TRUE!?! Wikileaks [sic] Reveals EPIC Video — Bill Clinton Having S*x with… ‘MEGAN KELLY’ [sic]” — Free Patriot Post.

Each of those stunners “went viral” on Facebook over the course of the election, shared hundreds of thousands of times and likely viewed millions of times more. The first, the one about the murder-suicide of an FBI agent, supposedly a retaliation hit arranged by the Clintons, was “reported” by something called The Denver Guardian and was shared 480,000 times in a week.

By (stark) contrast, The New York Times’ story about Donald Trump claiming a nearly billion-dollar loss/tax avoidance was shared on Facebook 175,000 times in a month. Those numbers come from a BuzzFeed investigation that also traced the origin of over a hundred completely bogus websites like the “Denver Guardian” to … a bunch of teenagers in Macedonia who figured out a way to game Facebook’s ad system, which rewards stories by how often they are shared, netting $3,000 a month in easy cash, according to one of the kids.

Reaction to Facebook CEO Mark Zuckerberg’s initial response, that it is “a crazy idea” that fake news on his site could have had a significant impact on the election, has been, well, intense. As recently as this Tuesday, New York Times contributing opinion writer Zeynep Tufekci, associate professor at the University of North Carolina School of Information and Library Science, was explaining to readers (and Zuckerberg) how influential Facebook really is and how it accelerates confirmation bias:

In 2012, Facebook researchers … secretly tweaked the newsfeed for an experiment: Some people were shown slightly more positive posts, while others were shown slightly more negative posts. Those shown more upbeat posts in turn posted significantly more of their own upbeat posts; those shown more downbeat posts responded in kind. Decades of other research concurs that people are influenced by their peers and social networks.

And:

The problem with Facebook’s influence on political discourse is not limited to the dissemination of fake news. It’s also about echo chambers. The company’s algorithm chooses which updates appear higher up in users’ newsfeeds and which are buried. Humans already tend to cluster among like-minded people and seek news that confirms their biases. Facebook’s research shows that the company’s algorithm encourages this by somewhat prioritizing updates that users find comforting.

I’ve seen this firsthand. While many of my Facebook friends in the United States lean Democratic, I do have friends who voted for Mr. Trump. But I had to go hunting for their posts because Facebook’s algorithm almost never showed them to me; for whatever reason the algorithm wrongly assumed that I wasn’t interested in their views.

Google is also under criticism for the way its algorithms fail to detect and “demote” fake news in its results. In the past several days, both companies have made moves to prevent people like the kids in Macedonia from making money off their mischief. But Facebook management, to the consternation of some employees, is still dragging its heels on applying journalistic protocols to what it disseminates.

One line of thinking is that Zuckerberg is still wary after feeling the wrath of conservatives who reacted loudly to news last summer that a handful of employees on Facebook’s trending news desk did actually suppress factually unmoored stories, nearly all of which promoted Trump in some way. (Related: Twitter has now deleted the accounts of so-called alt-right white nationalist groups.)

Given Facebook’s astonishing resources ($364 billion in market value, having recently surpassed Berkshire Hathaway and General Electric), tech experts aren’t buying that a software/coding fix is too difficult.

For Quartz, Josh Horwitz lists several ways Facebook can rein in the bull, as several of the company’s employees have described it:

Devise and list a thorough procedure for identifying and managing misinformation.

This is perhaps the most important step Facebook can take, and its biggest failure to date. Facebook remains a black box in regard to how its algorithm prioritizes not just news or memes, but nearly everything that’s shared in its main feed. With regard to truthfulness, however, it’s especially lacking. The social network has entire pages devoted to how it deals with harassment and hate speech, and a transparent way for users to report these things. It also has a page where it publishes the number of requests it has received from governments looking to obtain information about its users.

Facebook’s activities in both these areas have been criticized, but at least they exist. There is no comparable, detailed explanation of how it deals with fake information. Facebook might also consider allowing third parties to occasionally review its algorithms and procedures for how effectively they vet hoaxes (or hate speech, or pornography), and then have them release reports on how well they live up to the standards they set for themselves.

Algorithms are not Jane Kirtley’s specialty, but ethics in journalism are. Kirtley, the Silha Professor of Media Ethics and Law at the School of Journalism and Mass Communication at the U of M, and her colleague Chris Ison are astute watchdogs for the ways standards, practices and ethics can slide south.

Both say the topic of fake news traded through social media is trending in their classes. Kirtley says her students have “talked a lot about this ‘journalism of affirmation’” and how organizations like Facebook, “businesses that are not really journalism but are now very much in the journalism business,” are influencing our culture. “I get a little impatient with people like Zuckerberg saying this sort of thing couldn’t possibly have mattered. Of course it matters.”

A fundamental problem, she said, is that for all the complaints consumers have about timidity in traditional journalism, nontraditional journalism platforms like Facebook (companies with no training or experience in journalistic checks and balances) will “always default to the path of least resistance.” Which is to say: avoid conflict wherever and however possible.

Playing “arbiter of truth” and culling out flagrant nonsense is always fraught with peril. Traditional journalists are trained to expect it and deal with it. Purely commercial enterprises don’t have that impulse. “Facebook can only engage so much in this denial of what they are and what they are doing,” said Kirtley. “They have to put editorial practices and ethics in place.”

She adds, “The ease with which so many people use something like Facebook brings a tremendous responsibility for ethical controls. And not from the government. I am firmly opposed to that. But from within the companies themselves.”

Which isn’t to say monolithic operations like Facebook are the only ones that need to re-examine their responsibilities in an age of viral nonsense. “Everyone criticizes Trump for having a short attention span,” she says. “But sadly that also describes so many people in the country as a whole.”

Ison, who won a Pulitzer Prize for investigative reporting when he was at the Star Tribune, says: “People reading crap is not new. Ten years ago, people were starting to realize the amount of junk that was out there and how easily it was moving around. The burden is on the reader to figure out if the sources are credible.”

Which is well and good, assuming the reader cares more about whether what they are reading and sharing is true than about whether it simply supports what they want to believe.

“Fake news is the biggest problem facing journalism right now,” says Ison. “I really believe that. And it’s been true for a while now.” (He mentions Bill Kovach and Tom Rosenstiel’s book “Blur: How to Know What’s True in the Age of Information Overload” as a valuable resource for anyone interested in the problem.)

“The thing is, low-quality news can be a very subtle thing,” Ison said. “Some of the stuff we’ve seen is obviously crazy. But everyone in the news business knows there are a thousand ways to shade a story to leave a particular impression. I don’t see how technical tweaks to algorithms fix that. Maybe they can. I don’t know. But there are always going to be a million ways to misinform people. The public has to learn how to tell the difference between what’s real and what isn’t.”