Psy-Ops, Meet Cyber-Ops: U.S. Takes on Russian Trolls

A Russian troll sits down at his desktop and logs into one of the social media accounts he uses to impersonate and radicalize Americans. Suddenly, a direct message appears:

“Hello, Ivan. This is U.S. Cyber Command. We can see your mouse from here.”

The threat is implicit, not explicit, but it might be enough for Ivan to start searching for job opportunities that do not come with the peril of U.S. indictments or criminal complaints.

It reads like a scene from the Wachowskis’ Matrix films. But this, or something like it, is the concept that U.S. Cyber Command reportedly is using to deter Russian operatives from spreading disinformation intended to interfere in U.S. elections.

According to the New York Times’ Julian Barnes, Cyber Command is finding ways to signal to individual Russian operatives that the U.S. knows who they are and what they are doing. The purpose is to deter the sort of information operations that Russians launched against the 2016 U.S. presidential election, without triggering retaliation against critical infrastructure.

It is a combination of cyber and psychological operations that could be called “psyber-ops,” using the skills of cyber intrusion to create a psychological effect. Just as some old-style psy-ops targeted the opponent’s frontline fighters, trying to sap their will to fight, this psyber-op looks designed to intimidate Russian trolls.

Cyber Command would not give precise details on how the campaign works, according to the Times report, making it difficult to judge the potential effectiveness of the approach. But a look at the effect of previous exposure of trolls indicates that it might at least have short-term impact.

The cornerstone of the operation appears to be attribution, singling out Russian individuals and telling them that they have been identified.

Connecting to Real Life

That connection between online activities and real life is one the trolls are known to fear. According to an indictment unsealed in February by Special Counsel Robert Mueller, one operative at the Internet Research Agency in St. Petersburg emailed a relative in September 2017: “We had a slight crisis here at work: the FBI busted our activity (not a joke). So, I got preoccupied with covering tracks together with the colleagues.”

After September 2017, the trolls kept on posting, but according to the archive of troll posts that Twitter published recently, they did so at a much lower rate, and with much more use of automation software, suggesting they were trying hard to hide. So the FBI probe and the simultaneous suspension of thousands of troll accounts in September 2017 were clearly “not a joke” for the troll operators. While they did not stop the trolls, they demonstrably slowed their activity. The administrators of a group of Facebook accounts exposed in late July, most likely run from the troll farm, also tried to cover their tracks in various ways, including hiding among real American communities.

The link to real-world consequences is especially important because, according to most accounts, the Internet Research Agency’s operatives joined up for the money, not out of patriotic fervor.

“It was business, and nothing personal,” according to Russian independent journalist Yevgenia Kotlyar, who investigated the Internet Research Agency. An anonymous former troll interviewed by Russian newspaper Bumaga in April 2017 made similar comments: “I was tempted by easy work and good money. I resigned myself to working there and just started enjoying the fact that I was being paid well for doing very little.”

Threatening to increase the costs of doing business, literally and otherwise, for individual troll operators through legal proceedings, asset freezes, or travel bans might well change their calculus.

Much will depend on the accuracy of the attribution. A personal message from Cyber Command delivered to the right addressee might be shocking; a message to the wrong addressee risks undermining the entire effort.

Internal Troll Farm Communications

The evidence suggests that at least some of the attributions will be accurate. In addition to Mueller’s February indictment quoting emails sent by a troll employee to a relative, last week’s criminal complaint quoted internal troll farm communications. This suggests that the troll farm itself has been penetrated or hacked.

Cyber Command is not the only actor to have cracked down on Russia’s operations. Starting in September 2017, Twitter, Facebook, Reddit, Tumblr, and the Facebook-owned Instagram all suspended hundreds of trolling accounts. The farm replaced the accounts but struggled to rebuild audiences before its replacement accounts, too, were suspended this summer.

In July, the U.S. Department of Justice indicted hackers from the Russian military intelligence service, the GRU, for their role in the hacking and leaking of Democratic emails during the 2016 election. In August, the Department of Homeland Security ran its first national cyber exercise aimed at protecting elections against online interference, including “news and social media manipulation.”

Exposures, indictments, and account closures are all real impacts, but they are likely to be temporary. Exposed agents can be replaced. So can trolls intimidated by the psyber-ops, and so can accounts that have been suspended. Even their audiences can be rebuilt, although this takes time.

As far back as April 2017, the former troll interviewed by Bumaga said that the troll farm was already changing its hiring policies, firing some staff and replacing others with more “patriotic” workers. The troll farm reportedly shifted its focus from running fake accounts to running “patriotic” websites pushing pro-Kremlin interpretations of the news. Earlier this year, the core of that network, known as the Federal News Agency or FAN (from its Russian acronym), launched an English-language service targeting America. Called “USA Really,” it advertises for English-speaking contributors. All this suggests the troll farm is an adaptive adversary that is more likely to change its tactics than stop its attacks.

That said, short-term disruption of hostile activity is a good start, especially if it is timed for just before an election. Cyber Command’s counter-trolling, if based on accurate attribution, would fit into that category.

Longer-term disruption would require a focus on the Russians giving the orders, rather than those carrying them out. Cyber Command, the U.S. government, tech companies and the research community can all shrink the space in which Russian troll farms can operate, but only the Russian leadership can order them to stop.

For the U.S. to put effective pressure on Russia’s leaders to do so would require real leverage, real political will from all levels of government, and a clear assessment of the likely costs.

Cyber Command can help disrupt the threat to the U.S. electoral system. It will take much more to defeat it.

IMAGE: With examples of Russian-created Facebook pages behind him, Sen. Patrick Leahy (D-VT) questions witnesses during a Senate Judiciary Subcommittee on Crime and Terrorism hearing titled ‘Extremist Content and Russian Disinformation Online’ on Capitol Hill, October 31, 2017 in Washington, DC. (Drew Angerer/Getty Images)
About the Author(s)

Ben Nimmo

Studies online disinformation and influence operations for the Atlantic Council's Digital Forensic Research Lab (DFRLab). Follow him on Twitter (@Benimmo)