Disinformation: A Modern Style of Warfare

Since the dawn of social media, one of the most prevalent problems has been the spread of disinformation. It can involve insignificant claims, such as a celebrity having said something controversial or aliens having built the pyramids in Egypt. But more recently, countries and state actors have been spreading disinformation through social media to advance their goals and influence conversations.

Earlier this summer, I attended a politics program at Salve Regina University, where I met Dr. Jim Ludes, executive director of the Pell Center for International Relations and Public Policy. Ludes has spoken and written extensively on national security, having previously served as a legislative assistant for defense and foreign policy to Senator John Kerry and as a member of President-elect Obama’s Defense Department transition team. I spoke to him about disinformation.

Note: The interview was conducted on August 8, 2024. The views and opinions expressed in this interview are those of the interviewee and do not necessarily reflect the views or positions of Polinsights.

Could you give a basic definition of disinformation? 

So it’s funny because there are different definitions that are used all the time. You hear disinformation, you hear misinformation. More recently, we’ve heard malinformation, and the key word there is information. The way I like to think of it is that misinformation with an “m” is information that’s wrong, but it might be wrong just because it’s an accident. Someone just got the reporting wrong. There’s no intention behind it; it’s just inaccurate information. That’s misinformation.

Disinformation is information that is wrong on purpose, typically a political purpose. There’s intention behind that lack of truth; it’s usually trying to accomplish something.

Malinformation (from the French mal, meaning bad) is bad information. It could be bad by accident or bad on purpose. So disinformation and misinformation are distinguished by intention.

Could you talk a bit about how disinformation came to be used as a tactical weapon by countries? 

So disinformation and information on the battlefield are as old as human conflict itself. We can go back and look at examples from antiquity, where there was a clear understanding that information is a central part of operations on the battlefield. Sun Tzu – the great Chinese military theorist, writing however many centuries before Christ – was acutely aware of the power of information on the battlefield. It is something that we have lived with across human history. A great colleague of mine who works on these sorts of issues says that the first propagandist was the snake with Adam and Eve. He used information to tempt Eve into humanity’s first sin. So, information as a weapon has been around as long as human beings have been around.

In the Cold War era, you saw sophisticated structures begin to develop inside governments. There were antecedents – initial examples, even in the First World War and the Second World War – where information offices sprang up on all sides of the conflict, intended to make sure that the story of why society was fighting a global struggle was explained to the population that would bear the weight of that burden. As you know, World War Two ended with atomic bombs being dropped on Hiroshima and Nagasaki. The national strategists at the time, both in the United States and in the Soviet Union, looked at that dynamic and said, well, wait a minute. If general war between the superpowers risks the destruction of life as we know it, then we have to figure out a way to struggle with one another, to fight each other, to protect our own national interests as we understand them, short of risking general war between the superpowers. What the strategists of the United States came up with was what they called “cold war,” lowercase “c,” lowercase “w”: a set of tools short of the general application of armed force to bend your adversaries’ will to your own.

So there’s an economic piece to it. There’s a diplomatic piece to it. But a huge piece of it was information. How could you use information of any kind – not necessarily disinformation, but information – to advance the interest of the West, to protect the West from the ideological ambitions of the Soviet Union? The Soviets had a similar approach: they were going to use information to undermine the West, to protect what they thought were Soviet interests. So all of those pieces evolved in the context of: if there’s a general war between the United States and its allies and the Soviet Union and its allies, it’s going to lead to a nuclear war that will destroy life as we know it. So they’re looking for a way to struggle, to fight, to combat one another short of risking general war, and information becomes a big, important piece of that.

Cartoon of then-Soviet Premier Nikita Khrushchev and then-U.S. President John F. Kennedy demonstrating the concept of mutually assured destruction.

Let’s fast forward to the 21st century. What are some examples of modern disinformation campaigns?

So there are lots of them. The one that I am most familiar with is the campaign that Russia launched in about 2014. At about the time that they invaded Crimea, there was a determination made by Russian President Vladimir Putin and the people around him that they were going to take a much more confrontational position relative to the West than they had in the 20 years since the end of the Cold War. The current head of the Russian General Staff, Valery Gerasimov, who is the top-ranking uniformed military officer in Russia, published an article in about 2013 looking at the Arab Spring and the role that social media networks played in the rise of civil disturbances, social revolutions, and political upheaval in the Middle East from about 2011 to 2012.

Russian President Vladimir Putin and General Valery Gerasimov at a military academy reception. Source: AFP/Getty Images

He described the use of social media networks as a possible avenue for controlling state outcomes. So basically, he said, “Look, there’s this tool that is in everybody’s hand on their phone. And it’s a pathway for us to use information, to target Western audiences in a way that we think could have strategic benefits for Russia.” So it began with some very sloppy campaigns around things like whether or not Americans should be vaccinated. They looked for hot-button issues and tried to exploit them, tried to ramp up the attention on both sides. There’s a long, long history, going back to the Soviet era, of Russian sources looking to exploit America’s racial divisions, to sort of tear us apart internally. We’ve seen that grow in levels of sophistication since 2014. They’ve been playing that game again. There was a whole universe of social media accounts on Facebook, Twitter, and now TikTok and other platforms that were putting divisive messages into the American body politic.

Around 2016, they took a clear and deliberate hand to support the candidacy of Donald Trump and to undermine the candidacy of Hillary Clinton, as a way of shaping what America’s future foreign policy was going to be. Now, let me be clear. I don’t know if they thought that they were going to help Donald Trump win. They wanted to complicate Hillary Clinton’s life. They wanted to create some churn in American politics. But I don’t have a smoking-gun piece of documentary evidence that says, “They were trying to support Donald Trump.” But there’s a lot of analysis and a lot of expert opinion that says that the benefit that Trump gained over those years is pretty clear. And that’s not just my opinion. That’s also the opinion of the United States Senate Select Committee on Intelligence, which investigated this under a Republican chairman, Richard Burr of North Carolina, and concluded largely the same.

So, how has foreign disinformation affected political discourse in the US?

So the term that the US intelligence community uses now is “foreign malign influence.” It’s foreign and malign, meaning not in our interest. They’re trying to shape the way we perceive things or how we’re going to act. They’re trying to affect our policy. So foreign malign influence has sought to coarsen political discourse in the United States. One of the important things to understand is that there’s a dialogue between what they do and what we’re doing ourselves.

Our foreign adversaries, whether it’s Russia or China or Iran, which are the three biggest players, are watching the tenor and the tone of the conversation that we’re having. They’re looking for places to make inserts, places where they can raise the temperature. An example is what the Russians did in 2016 outside the Islamic Center in Houston, Texas. They sponsored both pro-immigration and anti-immigration rallies on opposite sides of the same street. They organized them on social media. People showed up. If you looked at it, it looked like a civil war: one side arguing one thing, the other side arguing the other, yelling at each other across the traffic, the two sides menacing each other. Russia is trying to amplify the divisiveness of American politics because it’s a way of tearing us apart. The theory goes that Russian leaders will be more likely to be able to advance Russia’s interests, both domestically and internationally, if America is divided internally.

The aforementioned incident in 2016; Source: Houston Chronicle

Speaking of divisiveness, there has been a lot of division in public opinion about the war in Ukraine and the war in Gaza. How has disinformation played a role in forming public opinion about those wars?

Well, one of the most frequent themes in Russian disinformation campaigns right now is the war in Ukraine. Not just in the United States, but also in Europe and other parts of the world, these themes include amplifying the cost of the war, the risk of expanding the conflict, and the things that societies are not spending money on versus the things they are. This was a real focal point for Russian efforts in the lead-up to the European parliamentary elections earlier this summer.

It has been a continual point of emphasis in information campaigns targeting Germany, Poland, France, and the UK, and in telling small European countries like Moldova that if they continue to support Ukraine, they could wind up in a conflict themselves. One of the things that the Russians did in France is that they placed two caskets under the Eiffel Tower earlier this summer, with social media posts claiming that these were the bodies of French soldiers who died fighting in Ukraine. Putin has made very specific nuclear threats against France, saying he would target France because it was supporting Ukraine. All of these things are intended to change French behavior, because French support is so antithetical to what Russia is trying to achieve in Ukraine. Even around the war in Gaza, French authorities arrested two Russian-speaking people who had been paid a fairly modest amount of money to draw Stars of David, the national symbol of Israel, on buildings across Paris. It was intended to suggest that there is antisemitism in France. It was Russian-speaking actors who had been paid by some third party to do this after the war in Gaza began. There’s a desire to stir up division, animosity, tension, and anxiety in all of these target countries, including the United States.

More than 250 Stars of David were painted on buildings across the French capital in November 2023; Source: AFP/Getty Images

It’s 89 days until election day, and the social media landscape is a hotbed of disinformation. How should the US government take action against foreign disinformation?

In 2016, I think that the Obama administration made a horrible mistake in not alerting the public to what they knew was happening, because the intelligence community had a good bead on what Russia was doing. I think the Obama administration dismissed it as just the sort of normal stuff that Russia always does. I just don’t think they had an appreciation for the impact that it could have. In 2017, the French had their presidential election. When the French authorities identified that Russia was doing this, they immediately told the French people: “Russia is active in our information space. They are trying to push particular viewpoints and particular agendas. Just be aware of it. Vote however you’re going to vote, but be aware that there’s a bad actor in our information ecosystem.” It largely muted the effectiveness in France of the kind of Russian campaign that had been so effective in the United States.

What’s evolved from that is a technique called pre-bunking. Before the information that needs to be debunked gets out there, you alert people that Russia is about to launch a campaign saying, for example, that whoever wins the election, the United States is going to go to war in Ukraine. You forewarn the American public to be sensitized to the fact that that narrative is going to be out there and that it’s disinformation. Then you hope that civic-mindedness and critical thinking get enough of a boost to be able to resist it. That is largely the approach that we’re seeing the intelligence community take.

Just last week, with 100 days to go before the election, the Department of Justice, the FBI, the CIA, and the Office of the Director of National Intelligence issued a summary of what we see happening, particularly focused on Russia, China, and Iran. In the case of China, they don’t seem all that interested in the presidential election, but they are using their information campaigns to oppose congressional candidates who they think would be bad for US-China relations from their perspective. They’re supporting candidates in the information space who they think would be more amenable to China’s interests in American policy. 

So how can ordinary citizens spot disinformation online?

You want to think in terms of being skeptical. Don’t accept anything at face value. Ask yourself some basic questions: does this particular claim that I’ve seen on social media pass the “sniff” test? Is it credible? Could it possibly be true? If this is an outrageous claim, and the only person making that claim or offering evidence for it is some random person you happen upon on social media, and there’s not a single major Pulitzer Prize-winning news outlet that’s also got that same story or is confirming it or reporting on some aspect of it, be highly skeptical that Joe Blow from Kokomo knows something that all of the White House press corps would not know. So be skeptical.

Question whether or not it’s even in the realm of the plausible. Ask the question: how does this person know what they claim to know? If they have some sort of insight about decisions being made inside a campaign, inside the White House, inside the Pentagon, are they actually in a position to know? Can you confirm that they were at the meeting where that decision was made? If the answer is no, I can’t confirm that, then be skeptical. And then finally, take a look at the other sources that they’re citing.

So one of the things that the Russians have done this year, both in Europe and the United States, is create fake news sites. These websites look like established news sources, but they’re not; they put up facades. This campaign is called “Doppelganger.” The facade looks like The Guardian newspaper, for example, or it looks like your local newspaper. They just have something a little bit wrong in the URL. So if you’re not paying attention, you might not realize that it is not the site that you are used to visiting. And then they have these bogus news stories. Take a look at the sources that they cite. Can you find them repeated anyplace else?

If it’s something more serious, take a look at the footnotes. Google some of the authorities and the experts. Is this really what they’ve said, or is this somebody making stuff up? Take the time to dig deeper, ask yourself some critical-thinking questions, and make an informed judgment about whether or not the claim that you’re seeing is worth sharing. Ultimately, in this age of social media, we’re not all just consumers of information; we’re all purveyors of information, too.

Every time we like, share, retweet, or repost, we’re sending that information out to our networks. So we should consider ourselves, as individuals, stewards of that information and keepers of the information environment. We have to practice good information hygiene and encourage it in our networks as well.

