
Trolls, spambots and the psychology of online comments

Earlier this fall, Popular Science, a 141-year-old science and technology magazine, announced that readers would no longer be able to post comments about its articles online.

“We are as committed to fostering lively, intellectual debate as we are to spreading the word of science far and wide,” wrote Suzanne LaBarre, the magazine’s online content director. “The problem is when trolls and spambots overwhelm the former, diminishing our ability to do the latter.”

“A politically motivated, decades-long war on expertise has eroded the popular consensus on a wide variety of scientifically validated topics,” she added. “Everything, from evolution to the origins of climate change, is mistakenly up for grabs again. Scientific certainty is just another thing for two people to ‘debate’ on television. And because comments sections tend to be a grotesque reflection of the media culture surrounding them, the cynical work of undermining bedrock scientific doctrine is now being done beneath our own stories, within a website devoted to championing science.”

Incivility and polarization

Anybody who spends time online understands the frustration LaBarre expresses toward the uncivil, and sometimes astonishingly uninformed, reader comments that accompany online science articles.

Furthermore, LaBarre is justified in being concerned about the effect that these uncivil comments have on people’s perception of science. As she notes, a University of Wisconsin-Madison study published earlier this year found that the more uncivil the comments to a news story, the more polarized readers became about the issue — a phenomenon dubbed “the nasty effect” by the authors of the study.

In addition, uncivil comments tended to change readers’ interpretations of the news item itself, making them believe, for example, that the negative aspects of a technology being reported on were greater than they had previously thought.

A ban may not be the answer

But will banning online comments help?

Perhaps not, suggests psychologist Maria Konnikova in a provocative review of research on the topic that was published last week on the New Yorker’s website.

Writes Konnikova:

A ban on article comments may simply move them to a different venue, such as Twitter or Facebook — from a community centered around a single publication or idea to one without any discernible common identity. Such large group environments, in turn, often produce less than desirable effects, including a diffusion of responsibility: you feel less accountable for your own actions, and become more likely to engage in amoral behavior.

In his classic work on the role of groups and media exposure in violence, the social cognitive psychologist Albert Bandura found that, as personal responsibility becomes more diffused in a group, people tend to dehumanize others and become more aggressive toward them. At the same time, people become more likely to justify their actions in self-absolving ways.

Multiple studies have also illustrated that when people don’t think they are going to be held immediately accountable for their words they are more likely to fall back on mental shortcuts in their thinking and writing, processing information less thoroughly. They become, as a result, more likely to resort to simplistic evaluations of complicated issues, as the psychologist Philip Tetlock has repeatedly found over several decades of research on accountability.

Removing comments also affects the reading experience itself: it may take away the motivation to engage with a topic more deeply, and to share it with a wider group of readers. In a phenomenon known as shared reality, our experience of something is affected by whether or not we will share it socially. Take away comments entirely, and you take away some of that shared reality, which is why we often want to share or comment in the first place. We want to believe that others will read and react to our ideas.

Konnikova argues that what the University of Wisconsin-Madison study shows is not “the negative power of a comment in itself but, rather, the cumulative effect of a lot of positivity or negativity in one place, a conclusion that is far less revolutionary.”

Possible solutions?

Konnikova seems to support commenting policies that enable readers to vote a comment “up” or “down.” “Users can set the tone of the comments, creating a surprisingly civil result,” she writes.

Here at MinnPost, we have another system in place, of course. We ban anonymous comments and also use a dedicated team of volunteers to review comments before they are posted, thus ensuring that they follow the site’s rules. As a result, we have fewer comments, perhaps, than other websites, but the ones we receive are (usually) highly civil and often add depth and insight to the articles themselves.

I know I’ve learned a lot from many of the comments (and e-mails) I’ve received from Second Opinion readers. Of course, I receive my share of uncivil comments as well, although usually via e-mail.

Still, that’s nothing new.

“While it’s tempting to blame the Internet, incendiary rhetoric has long been a mainstay of public discourse,” writes Konnikova. “Cicero, for one, openly called Mark Antony a ‘public prostitute,’ concluding, ‘but let us say no more of your profligacy and debauchery.’”

That was actually one of Cicero’s tamer comments about Antony. One wonders what he might have done with a Twitter account.

You can read Konnikova’s essay on the New Yorker website.

Comments (7)

  1. Submitted by Nancy Hokkanen on 10/28/2013 - 12:11 pm.

    Comment censorship also blocks evidence of scientific fraud

    “As hard as corporate-biased writers and editors try, they cannot control readers’ skepticism about who determines what information and research constitutes science bedrock.

    “Despite the high-sounding defensive rhetoric of so-called “science writers,” one cannot always believe what one reads. Cognitive dissonance is inevitable when readers discover vaccine and autism research that:
    • excludes affected populations,
    • uses inappropriate placebos,
    • leverages fraudulent statistics,
    • employs questionable researchers,
    • approves products not fully tested,
    • contains amounts of metals deemed toxic by other agencies, and
    • refuses to compare health outcomes with those not using the product.”

    (Links to examples illustrating those bullet points are provided at the website.)

    Popular Science To Cease Unpopular Comments

  2. Submitted by Rosalind Kohls on 10/28/2013 - 01:23 pm.

    Where do they get the time?

What always surprises me about MinnPost’s comments is how much time the commenters invest in them. Not only are the comments often long, but there are multiple long comments from the same commenter for the same article. Most of the comments seem to be written in the middle of the day, when people work.
    I’m surprised that the commenters’ employers don’t complain. Are the commenters retired or self-employed?

  3. Submitted by Richard Parker on 10/28/2013 - 05:33 pm.

    Real names are good

    It’s Obama’s fault. No, wait…

    I agree strongly that requiring commenters to use their real names at least tones down the craziness and deters writers from getting vicious. The culture in the Star Tribune’s anonymous comments — when they aren’t disabled — has been characterized as a “cesspool.”

  4. Submitted by jason myron on 10/28/2013 - 05:51 pm.

    Since your comment was written at 1:23

    in the afternoon, perhaps you could answer your own question…are you retired or self-employed? This is a connected society…the old norms of everyone working the same hours in the same confined settings no longer apply. I don’t assume anything about the source of a comment from the time posted.

  5. Submitted by Charles Holtman on 10/29/2013 - 08:39 am.

    The most harmful proposition

    For productive online dialogue is “not B = A.”

For those who set the frames of discourse, the crux is to narrow the range of options. Put simply, instead of a society of thoughtful people contemplating a full range of political, economic, social options without artificial line-drawing, it serves vested interests that the people think there are just two options – Republican and Democrat, or conservative and liberal – that themselves constitute the poles of a very narrow spectrum of thought that doesn’t challenge those interests (captured by the Thatcher-coined acronym “TINA,” or “There Is No Alternative”). One result is that discourse is reduced to a verbal battle of two clans. The goal is no longer working out the best public policy in a complex society, but defeating the other side. And since there are only two sides, the proposition “not B = A” becomes central: one no longer has to make a cogent argument for one’s position; one can just swing wildly at the “other side’s” position. Showing a flaw in that position then counts as a victory for “your side,” even if your position may be no more valid.

    This is not a “both siderism” observation. One “side” is much more guilty of this. But the other “side,” as well, has let itself be baited into this phenomenon much more than it should.

  6. Submitted by Bill Schletzer on 10/29/2013 - 08:54 am.

    What I have noticed on Minnpost

I have noticed that in the political articles, when a right winger lobs left-hostile comments into the comment section, the level of discourse goes downhill. Instead of thoughtful comments about the article, so many people, myself included, seem to feel a need to respond to the troll-like comments from the right. Maybe that is the point: a sort of intellectual guerrilla warfare. Instead of the left/center community becoming stronger and more unified by working out the weaknesses in their collective thinking, they waste their energy responding to outlandish comments that are only posted to offend.

    Just mention global warming, Obama, ACA, creationism or many other topics and one of the reliable right wingers will post something guaranteed to incite.

    I come here to be with like-minded people. I don’t go to the Focus on the Family web site (if there is one) and make comments about how silly creationism is or how God made gays gay. There are web sites for every flavor, so why hang around somewhere that you disagree with so strongly?

For the autism commenter above, I beg to differ. Blocking comments doesn’t block evidence of fraud, because comments aren’t evidence; they are reactions. Scientific evidence, whatever its quality, is refuted by other scientific evidence gathered using the scientific method. It can be a long, arduous process, but ultimately the truth will win out. Commenters often rely on anecdotes, which are suggestive rather than evidential. Having had some personal experience with a relative on the “autistic spectrum,” itself a rather recent term, I know that knowledge about this is evolving, and when loved ones are affected it is natural to want to leap ahead to conclusions if those conclusions have a chance to help. I think that is natural, but caution is needed when leaping. That’s why scientists use the scientific method.

    I hope you post this. I’m not writing to offend.

  7. Submitted by Christopher Williams on 10/29/2013 - 10:31 am.

    Self Moderation Helps

    The self moderation features on sites like Slashdot and Reddit help to police things as well. With enough community downvotes, certain comments become hidden. Other places like CNN or Yahoo have up and down voting but it doesn’t seem to have any effect on the display of the comment. Of course this can lead to community groupthink (such as on Slashdot where anything involving Linux is good and anything with Microsoft is bad). But it’s helpful in a lot of cases to hide trolls and spam.
