People who rely on their “gut instincts” rather than on their reasoning skills or who believe that facts are politically biased are more likely to believe in fake news and conspiracy theories, a new study reports.

Conversely, people who use evidence to form their beliefs are less likely to have misperceptions about politically charged issues, such as climate change and the safety of vaccines, the study found.

“Scientific and political misperceptions are dangerously common in the U.S. today,” write the study’s two authors, R. Kelly Garrett, an associate professor of communication at Ohio State University, and Brian Weeks, an assistant professor of communication at the University of Michigan. “The willingness of large minorities of Americans to embrace falsehoods and conspiracy theories poses a significant threat to society’s ability to make well-informed decisions about pressing challenges.”

The study was published last week in the journal PLOS One.

Assessing how beliefs are formed

For the study, Garrett and Weeks analyzed data from three nationally representative surveys, which included a total of about 2,100 participants. The researchers wanted to understand two things: how people form beliefs and how that process makes individuals more or less likely to believe non-factual information.

First, they examined the participants’ responses to a dozen questions, such as “I trust my gut to tell me what’s true and what’s not,” “A hunch needs to be confirmed with data,” and “Scientific conclusions are shaped by politics.” The questions were designed to measure the participants’ faith in intuition for facts, their need for evidence and their belief that truth is political.

“These are characteristics that we expected would be important above and beyond the role of partisanship,” said Garrett in a press release. “We’re tapping into something about people’s understanding of the world, something about how they think about what they know, how they know it and what is true.”

Hot-button topics

Garrett and Weeks then compared the participants’ answers to those questions with their beliefs about four high-profile issues on which the scientific and factual evidence is clear: that human activity is contributing to changes in the global climate, that most Muslims oppose violence against Western countries, that Iraq had no weapons of mass destruction immediately before the Iraq war, and that vaccines do not cause autism.

They also compared the answers to how likely the participants were to endorse seven major conspiracy theories: that the assassination of John F. Kennedy was not committed by Lee Harvey Oswald alone; that the assassination of Martin Luther King Jr. was the result of a government conspiracy; that Princess Diana’s death was an organized assassination by members of the British royal family; that a group known as the New World Order is planning to eventually rule the world through an autonomous world government; that the U.S. government permitted the 9/11 attacks as a pretext for going to war in Afghanistan and Iraq and for curtailing civil liberties; that the AIDS epidemic was intentionally created by government agencies and purposely inflicted on black and gay men in the 1970s; and that the Apollo moon landings never happened but were instead staged in a Hollywood film studio.

The share of the participants who endorsed these conspiracy theories ranged from 45.7 percent (the assassination of Kennedy) to 15.3 percent (the Apollo moon landings).

A susceptibility to misinformation

“We find,” write Garrett and Weeks, “that individuals who trust their intuition, putting more faith in their ability to use intuition to assess factual claims than in their conscious reasoning skills, are uniquely likely to exhibit conspiracist ideation. Those who maintain that beliefs must be in accord with available evidence, in contrast, are less likely to embrace conspiracy theories, and they are less likely to endorse other falsehoods, even on politically charged topics.” 

“Finally, those who view facts as inexorably shaped by politics and power are more prone to misperception than those who believe that truth transcends social context.”

“While trusting your gut may be beneficial in some situations, it turns out that putting faith in intuition over evidence leaves us susceptible to misinformation,” said Weeks in the press release.

Paying attention to evidence

Garrett sees some hopeful news in the study’s findings.

“People sometimes say that it’s too hard to know what’s true anymore. That’s just not true,” he said. “These results suggest that if you pay attention to evidence you’re less likely to hold beliefs that aren’t correct.”

“This isn’t a panacea — there will always be people who believe conspiracies and unsubstantiated claims — but it can make a difference,” he added.

FMI: You can read the study in full on the PLOS One website.

Join the Conversation


  1. My 2¢

    In an emotional/personal relationship context, “gut instinct” may prove worthwhile, and I’ve heard mental health professionals in both public and private venues suggest that it’s worth paying attention to in those kinds of situations, though even in those cases, factual evidence ought to take precedence if it’s available. In a public policy context, however, “gut instinct” is much less useful. One’s personal experience is likely to be fairly limited, and there’s plenty of evidence that, in recent years, we’re inclined to live in informational “silos” that tend to favor our own preconceived notions, while simultaneously ignoring information or opinion that doesn’t conform to what we already believe. It’s pretty easy, in those situations, to fall prey to what we’ll politely call “misinformation,” but which, in other circumstances, would be labeled as “propaganda,” “bull***t,” or simply “lies.”
