Vilify and Amplify: Disinformation and Cyber Warfare
23 July, 2017

Did Russia interfere with the U.S. and French elections? All evidence points to yes, says Ben Nimmo, Information Defense Fellow at the Atlantic Council's Digital Forensic Research Lab.

There was, of course, direct hacking. But Russia also spread fake news and disinformation. This has raised serious questions about who bears the responsibility for protecting society from disinformation. While many commentators were quick to point at media platforms and government agencies, Nimmo suggests considering who has more power in the disinformation war: the media platform, or the individual with nine-and-a-half thousand accounts on it?

Nimmo argues that it is the individual who has the power to amplify and spread disinformation. These individuals — as well as political parties — set up propaganda machines that often use “one technique which you see time and again. I think of it as 'vilify and amplify.'” It is the job of the media and individuals to be more analytical about the news they spread or consume.

The media and civil society are at the forefront of raising public awareness about cyber security, Nimmo says. While institutions such as NATO and the EU have allocated resources to cyber defense, they are not responsible for the cyber security of their member states, nor are national governments responsible for the internet security of their populations.

According to Nimmo, it is ultimately up to the individual internet user to be aware and conscientious of his or her cyber security.

Cyber security has been a major issue in Ukraine recently. On June 28, a ransomware attack hit Ukrainian businesses, banks, and state institutions. In an interview with digital security expert Vadim Losev, Hromadske learned that many countries are equally vulnerable to this type of cyber attack because governments do not invest enough resources in cyber defense. While visiting Ukraine last week, NATO Secretary General Jens Stoltenberg pledged to aid Ukraine in its cyber defenses.

Hromadske sat down with Ben Nimmo, an expert in social media and digital forensic research, to discuss disinformation and individual responsibility in combating its spread.

Today there is much more discussion of disinformation; at the very least, the analysts are aware, and security people in the EU, the U.S., and NATO countries are much more aware. But what real actions are being taken? I am asking about strategic communication in particular, because there is more and more news about hacks, and yet the response is not very clear, apart from a few cases.

Ben Nimmo: If you think about government responses, the government responses have been limited pretty much everywhere. Actually, there is a very good reason for that. Which is, when you think about the way a democracy works, you have to have a strong and independent press that is capable of holding power to account. You have to have journalists in every country who can say, “actually minister, what you are saying is not true.” And if you think about that in the context of fake news, the wrong response would be to say, “right, we’ll give the government lots of power to control the news.” Because somebody somewhere is going to say, “I’ve got the power to control the news now; well, I can rewrite that headline about me the other day.” So, there has to be a limitation on what governments, and lawmakers, and international organizations do about the news, because we need the news to be free. We need the news to be independent, and that means independent of political control. You have seen in certain areas, for example, the EU has set up what is called the East StratCom team, which is both explaining the EU’s policy a bit better, which is a good thing to do, and exposing some of the fake news which goes around. Those are appropriate things for the EU to do and what it’s been asked to do. But you don’t want to have a situation where the EU is controlling the media in EU member states; that would be a nightmare. And the same with any national government: you wouldn’t want a particular national government to have the right to censor what’s going on in the news. What you have seen, and what’s most promising, is that the media themselves have woken up to this. They’ve woken up both to the fact that it’s a problem and to the fact that they are being targeted. And at various times, various media have broadcast material which turned out to be fake news.
Now, for a real journalist at a real news organization, it damages your reputation to be found not to have done your job properly and to have broadcast fake news. And there’s a lot more writing in the media, in the real media, in the mainstream media nowadays about fake news, disinformation, propaganda, how it works and how it fits together. And that is, therefore, spreading the understanding, spreading the concept of what the problem is.

And what could the military do? We also speak about cyber warfare and information warfare, and that is not something you can simply leave to the media, which just have to become better and self-governed. So how is that developing? I mean, I have been to a number of events where defence people were discussing fake news, but to what end?

Ben Nimmo: Again, the military occupy a very specific niche in the whole spectrum. In terms of disinformation, it is appropriate for the military to respond to individual stories about them. For example, if there’s a fake story about what American soldiers did in Poland, then the U.S. Army will say, “well actually, that’s a fake, here’s what really happened.” So, there’s a specific response. In terms of cyber, then you have much more attention paid by the military to cyber security, making sure people can’t hack networks, making sure people can’t hack internal emails. And that’s very much the military protecting itself. Again, in a democracy, you wouldn’t want the army to be policing everybody’s email accounts, because there are all sorts of conflicts of interest there. So, what we are seeing from the military is, again, an awareness of how they can be targeted, an awareness of how they are being used. But what we are not seeing, and I don’t think we would want to see, would be the military being asked to defend civilian email accounts, for example. That is not the job of the army. Would you want a soldier reading your emails? You wouldn’t. The problem is that, certainly in Russia, you have got an entire structure which includes the government and the security services and the military, all of which is working on offensive operations. They are working to attack systems in the West, and that’s both cyber systems, email systems, hacking media, disinformation: it is a single package.

And what is the role of the big media and data companies, such as Facebook and Google, in this game? Because they themselves are more powerful than some governments.

Ben Nimmo: They are, but again, you need to ask the question: is it Twitter that’s more powerful, or is it a person who has 100,000 followers on Twitter who is more powerful? Is it the user or is it the platform? The main way in which platforms like Facebook and Twitter get abused is that people create a network of fake accounts. For example, I recently discovered a network of fake accounts on Twitter which had at least nine and a half thousand accounts all running the same thing. So that’s one person who can run nine and a half thousand different accounts; anything they tweet can get amplified nine and a half thousand times just like that. That’s incredibly dangerous, and that is the sort of system that platforms like Facebook and Twitter absolutely need to crack down on.

So, as somebody who researches fake news, bots, and cyber warfare, what are the most popular unconventional ways of using information? We can speak a lot about Russia today in particular, but what are the tools that you’ve found?

Ben Nimmo: There’s one technique which you see time and again. I think of it as vilify and amplify. Let’s say you are a foreign government and you control everything from the foreign ministry to broadcasters, to internet trolls, to internet bots. You find one person who says the thing you like, and then you get all the different parts of your machine to amplify that message. And you point to it and say, “look, it’s not us saying it. All we are doing is reporting it.” But the effect is to take one person’s voice and to amplify it a hundred thousand times. And by doing that, a) you are encouraging people who believe the same thing to speak louder, and b) you are making that person artificially loud.

What, in the end, is the real evidence that Russia is that strong in cyber warfare?

Ben Nimmo: One of the problems we are seeing is that a lot of people are not being very subtle in their analysis. And it has got to the stage where any time anybody sees, for example, the operation of a botnet on Twitter, some journalist will phone and say, “was it a Russian one?” It might have been; it might equally have been a French one, an American one, or a Canadian one. You always need to be very precise in your definitions. What we do know about the Russia operation is that a number of credible expert groups in the cyber industry have said independently that individual hacks have been traced back to Russian hackers. And in fact, the other day, we had President Putin saying, “maybe it was patriotic hackers,” in the same way that a couple of years back, President Putin said, “well, maybe the Russian soldiers in Ukraine are just patriots there on holiday.” C’mon, it’s not a serious comment. It’s all about confusing people. So, we have even President Putin admitting, in his own very twisted way, that maybe Russian hackers were actually involved in hacking the U.S. election and the French election. So, you’re never going to get to the stage, well, I don’t think we’ll ever get to the stage, where Putin is standing up and saying, “Yes, I ordered the hackers to actually target Hillary Clinton.” I mean, he did in the end say that he ordered the annexation of Crimea, a year later. But it is very unlikely that he would admit that. So, failing that, you have to look at the balance of the evidence, the balance of probability. And the balance of the evidence is that there are multiple occasions on which the Russian propaganda machine, the Kremlin propaganda machine, targeted Hillary Clinton. They were actively running stories that would make Hillary Clinton look bad. And you compare that pattern of behaviour with what the hackers were doing, and you compare that with what the experts have said about where the hacks came from, and it all points in the same direction, which is to the Kremlin.

/Interview by Nataliya Gumenyuk

/Text by Chen Ou Yang