Chris Paul of the RAND Corporation joined NeedleStack to outline his research on Russian disinformation tactics and what researchers can learn from it.

Many modern Russian disinformation techniques date back to the Cold War. Spreading propaganda and shaping people’s worldviews in ways that benefit the Kremlin and its allies has become table stakes for Russia. We sat down with Chris Paul, a senior social scientist at the RAND Corporation and professor at the Pardee RAND Graduate School, to discuss his research on Russia’s efforts to disinform, influence and persuade people within and outside its borders. Below are the highlights from that conversation.

The four pillars of falsehood 

Paul’s interest in Russian propaganda techniques was piqued in early 2016, when he and his colleagues had an opportunity to dig into Russia’s disinformation model amid widespread media reports of Russian meddling in the U.S. presidential election and other world affairs. The team coined the term “The Firehose of Falsehood” to describe the Kremlin’s efforts. The research followed obvious falsehoods that successfully influenced public opinion and became established as widely held beliefs. Paul identifies four main characteristics of Russia’s dissemination model — each deeply rooted in psychology and in the human tendency to stick to a worldview once it’s been formed.
 

  1. Volume and multichannel: Social psychology research shows that the more often people hear something and the more sources they hear it from, the more likely they are to believe it. Russia often uses multiple state-owned outlets to broadcast the same false or mostly false stories, each channel referencing and cross-referencing the others, until the story becomes ingrained in viewers’ minds and is indistinguishable from the truth.
  2. Rapid, continuous and repetitive: First impressions matter, and that has everything to do with how humans store information. Unlike computers, we don’t organize what we hear into distinct filing systems; we rely on stories. We store our life experiences as one holistic narrative, and when we are presented with a factoid, instead of cataloging it into a mental filing cabinet, we tend to bake it into our worldview. So, when conflicting information comes along and someone attempts to debunk a factoid we have accepted as part of our own narrative, it comes across as an attack on that worldview. As a result, the burden of proof grows: to be convinced that a factoid that has penetrated their consciousness is false, people have to be presented with enough facts to repair their worldview. By moving quickly and preemptively planting falsehoods in people’s minds, propaganda agents secure a hold on their worldview, making it very hard to debunk those untruths down the line.
  3. No commitment to objective reality: Effective lies don’t need to be wholly false. Maybe stories have some verifiable truths in them, maybe they are backed up by manufactured evidence, maybe they are false but presented by credible sources, or maybe they are entirely true but spun in a way that’s completely misleading. Russia has mastered the art of finding angles that support its narrative, making it difficult for a regular person to distinguish fact from fiction.
  4. No need for consistency: If one lie doesn’t stick, the Kremlin can quickly pivot and try another. They don’t need to promote a consistent message, just something to undermine the inconvenient truth. It’s quite common to see the same channel broadcasting conflicting accounts of the same events within a short period of time.

Exploiting human vulnerabilities

No one wants to think so, but humans aren’t that great at distinguishing truths from falsehoods. Paul turns to famous social psychologist Daniel Kahneman and his book, “Thinking, Fast and Slow,” to explain why. The book outlines two systems of thinking. System One is an autopilot mode — a series of fast, automatic shortcuts that require minimal effort and help us get through the day as we are constantly bombarded by information. It’s like getting in the car and driving to work: you almost don’t remember how you got there, even though you operated the car, navigated the roads and avoided collisions with other drivers. Since it’s the same route you drive every day, it doesn’t require conscious effort.
 


“A lot of the techniques that Russia uses date back to the Cold War. It’s old wine in a new bottle. It’s active measures, just the information environment is so much more ripe for easy dissemination, for planting a story one place and then using a different source to say, hey, look, somebody said that, and kind of amplifying your own baloney. And we as humans are super vulnerable to being tricked, misled, deceived and manipulated,” says Paul.

 

System Two, on the other hand, is when you are attentive and focused. It’s a slow, effortful thought process that requires concentration, like figuring out directions to an unfamiliar place or mentally rehearsing the presentation you’re going to give in the afternoon. Being in System Two is demanding and drains our mental resources, according to Kahneman, and humans simply can’t stay in this alert state all day.

 

These systems tie into propaganda strategies. Creators of disinformation rely on the fact that most of the time we are operating on autopilot. When we are tired, distracted or overwhelmed, we are more vulnerable to accepting information we see without being skeptical. So, in a scenario where you’re scrolling through social media and something comes up that looks like a news story — maybe it has a news logo, the anchor seems reputable and there’s a footer spooling at the bottom of the screen with related headlines — there’s nothing you recognize as alarming or worth skepticism. In System One, you may just accept what you’ve encountered. All these brain shortcuts make humans vulnerable to being tricked, and propaganda agents know how to exploit that to the fullest.


“It’s the fact that we are just not that good at adjudicating truth from falsehood. Famous social psychologist Daniel Kahneman has this great book called ‘Thinking, Fast and Slow,’ and in it he isolates what he calls System One and System Two thinking. System One is your autopilot. System One is the shortcuts and the heuristics that you use to get through a day bombarded by data and attention-grabbing things. So it’s anything that’s easy, you just do the easy path. It’s what you do when you’re tired. It’s what you do when you get behind the wheel and you think, oh, I’m going to work. And then you kind of zone out and 20 minutes later, oh, I’m in the parking lot at work. Good for me. System Two is when you’re really turned on and you’re attentive. That’s when you’re focused on something, you’re devoting all your attention, your concentration, all your faculties. You’re zoomed in on something. Something has provoked you and stimulated you to that higher level. And when we’re in System Two, we’re harder to trick. But being in System Two is demanding. It takes energy, and we don’t have enough energy. It’s not possible to be in System Two all day long,” says Paul.

 

Paul’s research suggests that the Russians are not even exceptionally good at this tradecraft; if they were, we probably wouldn’t notice when we are being manipulated. But even with crude techniques, Russian disinformation operatives are effective at disseminating stories, especially at times when society is ripe for discord. He compares the opportunistic moves of Russia’s propaganda strategy to a game of chess in which a player has a long-term plan but pivots depending on the opponent’s moves. If a rival makes a mistake and a new opportunity presents itself, a skilled player will take advantage of it. The Russians tend to look for opportunities around specific events: natural disasters, calamities, civil unrest or political struggles. When they see a society in crisis, already struggling with contentious divisions, it creates an opportunity to amplify them with carefully crafted messages. By stoking the fire on both sides, they aim to escalate conflict to the point where people blame each other rather than seek solutions through cooperation and compromise.

Nobody is immune, but everyone can do better 

Most people see themselves as resilient to disinformation, but that’s not what the data shows. Many of us believe we are smart enough to discern fact from fiction and will never be manipulated into believing something untrue. But in reality, everyone is vulnerable: we all have moments when we simply don’t pay close attention, and we each have our biases. By recognizing the adversary’s tactics, educating ourselves and employing our own techniques to counter disinformation, we can make headway in combating foreign influence.

“Everyone is vulnerable,” says Chris Paul.
 
One technique is called pre-bunking. Instead of debunking a falsehood after the fact, we can focus on shaping the audience’s first impression before the adversary does. By publicizing evidence before an event occurs, people can draw their own conclusions about what’s about to happen based on the facts. When an event is already unfolding, like an invasion or a separatist movement, and there’s ambiguity or confusion, the world doesn’t know how to respond. So, the next time a similar event looms, intelligence agencies that want the world to know the truth can take control of the narrative by pre-bunking the information: pointing out the movement of troops, for example, to gain the first-mover advantage and preempt the adversary from spinning the truth to support its cause.

Another important part of debunking is offering an alternative narrative. When a piece of information threatens someone’s worldview, they need a way to fill the hole it creates. It’s important not just to expose the falsehoods but to determine and present the truth as well.
 
Adversaries will never stop spreading falsehoods to support their agendas. The best way to promote the truth is to beat them at their own game: gain the first-mover advantage and recognize our own vulnerabilities.
 
Silo for Research can help researchers and analysts fact-check publicly available information. Learn how tools like the Silo Image Metadata Viewer can assist your research.
 
