Two weeks after the midterm elections, we’re talking about the size of the blue wave, structural problems in the voting system and the new dynamic in Washington after Democrats take over the U.S. House. One thing we’re not discussing much — after considerable hand-wringing — is Russian interference.
Did the government and Silicon Valley suddenly get good at fighting disinformation? Did Americans become a lot smarter about their news sources? Maybe the Russians just decided to leave well enough alone?
Russia did try to interfere in the election. If it was less of an issue this time, that may be because of the specifics of the last couple of elections. But experts are also clear on another point: Worse is yet to come.
Reports indicate that the Russians were up to the same tricks they used in 2016: using wedge issues to exacerbate tensions; playing both sides; seeding social media with fake stories. On election night, Facebook announced it had shut down more than 100 accounts it thought were connected to Russia’s Internet Research Agency.
The Hamilton 68 dashboard, created to monitor disinformation, reported that just prior to the election, social media accounts it tracks were “promoting divisive election issues and spreading articles intended to incite fear regarding migrant caravans in Mexico.” After the election, it said, those accounts started pushing vote fraud conspiracy theories.
It does appear that the government and social media companies have gotten a bit better about sniffing out Russian operations. In late October, the New York Times reported that the U.S. military’s Cyber Command was contacting individual Russians involved in the disinformation campaign to warn them that they had been discovered.
But maybe there was also something distinctive about this election itself. Even though the campaign was in many ways a referendum on President Donald Trump, midterms by definition lack a single clear focus. The campaign included national issues, but also hundreds of local ones. And although midterm turnout, even when high, still falls short of participation in a presidential year, many voters were deeply engaged this year.
In contrast, 2016 looks like an outlier: a national contest between two well-known and highly polarizing figures. It's simple common sense that it's easier to encourage people to believe a falsehood about someone they already dislike, or about whom they have misgivings. Remember Pizzagate, the crackpot story of Hillary Clinton's supposed child sex ring run from the basement of a Washington pizzeria.
Trump's presidency and his campaign rallies this fall actually did a lot of the Russians' work for them. Disinformation is not only about making stuff up; it's about building on what's already in the public realm. As the Hamilton 68 report suggests, Trump's incessant focus on the migrant caravan was a prime opportunity.
If it didn't have a huge impact, perhaps the messenger had something to do with that. Every public issue in the Trump era is to one degree or another about Trump, and the migrant caravan was no different. If, in the end, it's all about Trump, voters don't need help making up their minds. They already either detest him or love him.
So, do we just need to get past an era in which voters are focused on polarizing personalities? Does disinformation lose some of its zing when our candidates are boring? Not exactly.
Even before the Russians got serious about it, the U.S. already was deeply divided. Trump may be one cause; he may be an effect. He probably is some of both.
As The Times makes clear, exploiting and amplifying differences is an old KGB game, reinvigorated by Vladimir Putin and adapted to the Internet age. Russian intelligence is willing to take a very long view. It doesn't need to destroy a society; it needs only to give it a nudge now and then to help it destroy itself. None of the experts The Times interviewed expect the challenge to go away. They expect it to get worse.
One of those experts, Alina Polyakova of the Brookings Institution, notes that Russia is far behind the United States and China in developing artificial intelligence. But its spies discovered that commercial tech tools and platforms can be turned into high-impact weapons at minimal cost. That remains true as research allows for far superior audio and video manipulation, as artificial intelligence learns how to manipulate human emotions and as more sophisticated content distribution permits micro-targeted propaganda. You could be forgiven for not knowing what's true anymore, or for giving up trying to decide.
That’s precisely the point. Polyakova directs us to Hannah Arendt’s observation that a society unable to agree on core beliefs is paralyzed. “It is deprived not only of its capacity to act, but also of its capacity to think and to judge. And with such a people, you can then do what you please.”