
Does the field of social psychology have an anti-conservative bias?

A New Yorker author summarizes the evidence for both a “yes” and a “no” answer, but mostly focuses on findings that suggest an anti-conservative bias does exist.

Photo: Attendees of a 2012 Tea Party Patriots rally on Capitol Hill in Washington, D.C. (REUTERS/Jonathan Ernst)

In a provocative article published online last week in the New Yorker, Maria Konnikova (“Mastermind: How to Think Like Sherlock Holmes”) asks, “Is social psychology biased against Republicans?”

She summarizes the evidence for both a “yes” and a “no” answer to that question, but mostly focuses on the findings that suggest an anti-conservative bias does exist.

For example, she cites a 2012 study that surveyed 800 social psychologists. It found that a notable share of social psychologists hold conservative views on at least some issues, but that both those researchers and their ideas face significant obstacles. Here’s how the study’s two authors, then based at Tilburg University in the Netherlands, summed up those findings:

First, although only 6% [of the social psychologists surveyed] described themselves as conservative “overall,” there was more diversity of political opinion on economic issues and foreign policy. [Eighteen percent described themselves as conservative on economic issues, for example.] Second, respondents significantly underestimated the proportion of conservatives among their colleagues. Third, conservatives fear negative consequences of revealing their political beliefs to their colleagues. Finally, they are right to do so: In decisions ranging from paper reviews to hiring, many social and personality psychologists said that they would discriminate against openly conservative colleagues. The more liberal respondents were, the more they said they would discriminate.


This bias may influence the design, execution, evaluation and interpretation of research, says Konnikova. But, as she also points out, other biases may have similar influences. Over the years, research has suggested that peer reviewers — professionals within the same field who evaluate a paper to determine whether it merits publication — judge research more favorably when it comes from a famous institution than from a lesser-known one. In addition, studies authored by men tend to receive higher evaluations than those authored by women when the reviewers are male — and vice versa. (Much of the research Konnikova cites about these biases is two or three decades old. It would be interesting to see whether such biases persist today — and, if so, to what extent.)

‘A bias of belief’

But, says Konnikova, there is another type of bias, which is even less visible: a “bias of belief.” She explains:

Here, the questions aren’t about things that can be easily tested as empirical fact, like whether the sky is green or whether French fries make you skinny. They’re about the nebulous areas of philosophical and ideological leanings about the way the world should be.

One early study had psychologists review abstracts that were identical except for the result, and found that participants “rated those in which the results were in accord with their own beliefs as better.” Another found that reviewers rejected papers with controversial findings because of “poor methodology” while accepting papers with identical methods if they supported more conventional beliefs in the field. Yet a third, involving both graduate students and practicing scientists, showed that research was rated as significantly higher in quality if it agreed with the rater’s prior beliefs.

When Armstrong and the Drake University professor Raymond Hubbard followed publication records at sixteen American Psychological Association journals over a two-year period, comparing rejected to published papers — the journals’ editors had agreed to share submitted materials — they found that those about controversial topics were reviewed far more harshly. Only one, in fact, had managed to receive positive reviews from all reviewers. There was a secret, however, about that one. “The editor revealed that he had been determined to publish the paper, so he had sought referees that he thought would provide favorable reviews,” Armstrong wrote.

All these studies and analyses are classic examples of confirmation bias: when it comes to questions of subjective belief, we more easily believe the things that mesh with our general world view. When something clashes with our vision of how things should be, we look immediately for the flaws.

Of course, liberal social psychologists aren’t the only ones susceptible to confirmation bias. Conservatives are guilty of it, too. As social psychologist Jonathan Haidt, who has been quite vocal about his concerns regarding political biases in his field (and who is featured in Konnikova’s article), told BBC Radio: “If you know what a group holds sacred, you’ll be able to find where they deny science. Everybody denies science when it’s uncomfortable.”

A ‘blinding’ solution

What’s the solution? Konnikova suggests “a blinding of the peer-review system — both in terms of applicants’ names and personal backgrounds and the hypotheses (or findings) of their research.” Here’s how she says that would work:

If you want to research Democrats and Republicans, say — or any other ideologically loaded topics — call them Purples and Oranges for the duration of the paper. The methods and research structure will be evaluated without any ideological predispositions. Blind peer review in papers and grants would also solve a number of other bias problems, including against certain people, institutions, and long-held ideas. As for ideologically sensitive papers that have already been published, blind that data as well and reanalyze the premises and conclusions, pairing them with Tetlock’s turnabout tests. Is the opposite approach nonsensical? Chances are, then, that this one is, too.


As I said, this is a provocative article. You can read it on the New Yorker website. And if you want to be challenged even more on the topic, I recommend listening to a BBC Radio documentary from a couple of years back on “Political Prejudice” (hat tip: MindHacks).