Sun Tzu said, “The supreme art of war is to subdue the enemy without fighting.” Today, this supreme art is taking on unprecedented forms in the cyber environment. Wars are fought and the hearts and minds of populations captured in a landscape with no borders using armies of bots, cyborgs, trolls and sock puppets. Army leaders of the 21st century must be aware of the changing nature of information warfare and its impact on soldiers, operations, the operating environment, and the nation and society we defend.
Today, the borderless internet is used by state and nonstate actors to manipulate information and societies in ways that were unheard of 25 years ago. Adversaries can rapidly conduct information maneuvers with little cost at unprecedented scales to achieve far-reaching consequences across the internet by exploiting features of social media platforms and the way humans naturally understand what they read and hear.
These cyber-mediated threats to open and democratic societies have led to an emerging discipline known as social cybersecurity. Social cybersecurity is an emerging scientific area that uses computational social science techniques to understand and secure a society and the culture that binds it from malicious social cyber manipulation over the internet. It is focused on keeping the internet an environment that supports democracy rather than an environment that enables the destruction of democracy.
Here, we highlight some of the emerging threats and the frameworks and methods being developed to thwart them.
Russia created the word dezinformatsiya in the 1920s and matured its use of disinformation throughout the Cold War with operations often called “active measures.” According to KGB Maj. Gen. Oleg Kalugin, active measures were developed to “sow discord among allies, to weaken the United States in the eyes of the people of Europe, Asia, Africa, Latin America, and thus to prepare ground in case the war really occurs.”
Russia has now adapted these capabilities for the internet age and turned them on the West. The effects are tangible. In 2014, now-retired U.S. Air Force Gen. Philip Breedlove, then commander of the U.S. European Command and Supreme Allied Commander Europe, said, “Russia is waging the most amazing information warfare blitzkrieg we have ever seen in the history of information warfare.”
Rand Waltzman, former program manager at the Defense Advanced Research Projects Agency, asserted in Time magazine in October 2015 that the U.S. is losing this social cyberwar. “The use of social media and the Internet is rapidly becoming a powerful weapon for information warfare and changing the nature of conflict worldwide,” he wrote. “Because of misaligned U.S. policies and laws, we continue to largely rely on only conventional warfare techniques, which puts us at a severe disadvantage. We are losing our military and political advantage and ability to compete.”
Although technical, social cybersecurity is different from traditional cybersecurity. Cybersecurity involves hacking networks and information systems. Social cybersecurity involves “hacking” other humans. The goal is to manipulate the beliefs and social connections in a target culture or group. At times, the desired end state is simply agitation and distrust. Agitation and constant doubting cause a democratic society to become a slower, less decisive and less powerful nation-state. At other times, it is to promote specific leaders or denigrate others.
Open democratic societies are by their nature more susceptible to external manipulation through the internet, at least in the short term. Allowing anyone to speak means that in cyberspace, even the views of bots and trolls controlled by adversarial forces are heard. Americans must be careful not to compromise our national values in countering these threats, e.g., by banning all seemingly subversive actors from the internet. Those same values make open societies more resilient in the long term, as the Cold War demonstrated.
Conversations about disinformation are often constrained to the politically charged term “fake news.” This is unfortunate. The majority of information campaigns we have observed in Europe, Asia, and North and South America have little to do with fake versus fact. While false claims and photoshopped images have been used, they are not the primary focus. Disinformation operations are conducted using information maneuvers aimed at manipulating both narrative and networks, not necessarily fake versus fact. But what are these maneuvers?
Russian information maneuvers have been described as the four D’s—dismiss, distort, dismay and distract. Some campaigns take these forms. In cyberspace, however, you catch more flies with honey than with vinegar, so many information maneuvers shape the narrative in positive ways using the four E’s—engage, explain, excite and enhance. These maneuvers affect what is being said—the narrative.
Some information maneuvers move beyond the narrative. Using bots and other denizens of cyberspace, maneuvers can be conducted to make and break leaders, build or destroy groups, and create echo chambers prone to reacting only emotionally.
Just as there are positive narrative maneuvers, there are also the “four B” positive social network maneuvers—back a leader, build a group, bridge two groups and boost group membership. There are also the “four N” negative social network maneuvers—neutralize a leader, nuke a group, narrow a group and neglect a group to drive down membership. By combining these maneuvers, complex information campaigns can be waged.
This “BEND” approach is described further in a March-April article we wrote headlined “Social Cybersecurity: An Emerging National Security Requirement.” The key point is that in cyberspace, the art of war is conducted by building and attacking the narrative, and by building and attacking communities and the social networks among their members. Network manipulation is just as critical as narrative manipulation in conducting cyberwars. The art of war in social cyberspace is about BENDing the environment.
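The BEND framework above pairs four positive and four negative maneuvers for each of the two targets, the narrative and the network. A minimal sketch of that taxonomy as a lookup table follows; the short descriptions in parentheses are our own paraphrases of the maneuvers described in this article, not official doctrine language.

```python
from enum import Enum

class Target(Enum):
    NARRATIVE = "narrative"
    NETWORK = "network"

# The 16 BEND maneuvers: B's and E's are positive, N's and D's are negative.
# Descriptions are illustrative paraphrases, not doctrinal definitions.
BEND_MANEUVERS = {
    # Positive network maneuvers (the four B's)
    "back":       (Target.NETWORK,   "promote a leader"),
    "build":      (Target.NETWORK,   "create a group"),
    "bridge":     (Target.NETWORK,   "connect two groups"),
    "boost":      (Target.NETWORK,   "grow group membership"),
    # Positive narrative maneuvers (the four E's)
    "engage":     (Target.NARRATIVE, "draw the audience into the topic"),
    "explain":    (Target.NARRATIVE, "elaborate on the message"),
    "excite":     (Target.NARRATIVE, "attach positive emotion"),
    "enhance":    (Target.NARRATIVE, "amplify the message"),
    # Negative network maneuvers (the four N's)
    "neutralize": (Target.NETWORK,   "discredit a leader"),
    "nuke":       (Target.NETWORK,   "destroy a group"),
    "narrow":     (Target.NETWORK,   "isolate a group"),
    "neglect":    (Target.NETWORK,   "drive down group membership"),
    # Negative narrative maneuvers (the four D's)
    "dismiss":    (Target.NARRATIVE, "wave the message away"),
    "distort":    (Target.NARRATIVE, "twist the message"),
    "dismay":     (Target.NARRATIVE, "attach fear or discouragement"),
    "distract":   (Target.NARRATIVE, "change the subject"),
}

def maneuvers_targeting(target):
    """Return the maneuver names that act on a given target."""
    return sorted(name for name, (t, _) in BEND_MANEUVERS.items() if t is target)
```

Organizing the maneuvers this way makes the article's central point concrete: half of the framework operates on the social network itself, not on the content of the message.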
Bots, Cyborgs, Trolls and Sock Puppets
The forces conducting these information maneuvers include bots, cyborgs, trolls and sock puppets. Bots are automated accounts that allow a computer to carry out prescribed actions on social media (e.g., tweet, reply, retweet or follow). These bots, or “robots,” can act faster and in greater numbers than people. However, the messages they send are often less nuanced and show less adaptation than those from humans. Bots can be positive, neutral or malicious. Positive bots include personal assistants and natural disaster warning systems. Neutral bots send spam ranging from messages about commercial products to adult content. Malicious bots are involved in propaganda, intimidation and network infiltration/manipulation.
Cyborgs are the new centaurs—part human and part bot. For a cyborg, the human operator conducts nuanced dialogue and the computer conducts background operations at scale, all on the same account.
Trolls are the ugly creatures under the bridge. They are humans often masquerading under a false persona who use a combination of abusive language and disruptive behavior to achieve their goals.
Sock puppets are fake personas often attached to both bot and troll accounts.
In an information campaign, these bots, cyborgs, trolls and sock puppets are force multipliers. Bot armies are carefully cultivated and deployed to manipulate the narrative or the network. For example, one bot operation involved a bot army intimidating a freelance journalist in Yemen. In this case, many nondescript accounts, at times with disturbing profile pictures, suddenly began following the journalist’s Twitter account, which was her primary platform for bringing news of the conflict in Yemen to the world.
The bot army doubled her followers in a few short months, hoping to push her off Twitter once her account became associated with odd behavior and images. Actions like this are easy to automate with a few lines of code.
Researchers have developed machine-learning algorithms to detect bots. We have developed an algorithm called “bot-hunter” that can easily scale to run detection on large streams of Twitter data. As an example, in the 2018 Swedish national elections, 30% to 35% of accounts were tagged as likely bots by bot-hunter. These efforts to find and create bots have led to an arms race in which detection algorithms must continually adapt to increasingly sophisticated bots.
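Bot detection generally works by scoring telltale account features. The sketch below illustrates the idea with a simple rule-based scorer; bot-hunter itself uses supervised machine learning over many more features, and the thresholds and feature names here are illustrative assumptions only.

```python
# Illustrative rule-based bot scoring. Real detectors (like bot-hunter)
# are trained classifiers; these thresholds are assumptions for the sketch.

def bot_score(account):
    """Return a 0.0-1.0 suspicion score from a few account features."""
    score = 0.0
    # Extremely high posting rates are hard for a human to sustain.
    if account.get("tweets_per_day", 0) > 100:
        score += 0.3
    # Following far more accounts than follow back suggests automation.
    followers = account.get("followers", 0)
    following = account.get("following", 0)
    if following > 10 * max(followers, 1):
        score += 0.3
    # Default profile images and very young accounts are weak signals.
    if account.get("default_profile_image", False):
        score += 0.2
    if account.get("account_age_days", 3650) < 30:
        score += 0.2
    return min(score, 1.0)

suspicious = {
    "tweets_per_day": 400,
    "followers": 3,
    "following": 5000,
    "default_profile_image": True,
    "account_age_days": 7,
}
print(bot_score(suspicious))  # high score: likely bot
```

The arms race mentioned above plays out exactly here: as soon as a signal like posting rate becomes a known detection feature, bot operators tune their automation to stay below the threshold, forcing detectors to retrain on new features.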
Impact of Memes
Bots, cyborgs, trolls and sock puppets are not the only social cyberwar weapons. Memes are another. Many of our midgrade and senior leaders, while familiar with the concept of an internet meme, underestimate the power of these humorous and seemingly innocent artifacts of the internet age to influence populations.
Memes leverage humor and satire to connect a carefully crafted message with a target subculture. They are often designed to target and reinforce existing biases in the target audience. Memes enable a message to stick when other methods fail. Multiple actors, particularly the Russians, are deploying memes with strategic effect.
Memes, as originally envisioned by British evolutionary biologist Richard Dawkins, evolve as they are transmitted across the digital landscape. This online evolution enables specially crafted Russian memes to jump from culture to culture, evolving and spreading, while maintaining the original Russian narrative and messaging. In some cases, simple but creative Russian memes achieved greater reach (measured by unique internet locations) than elaborate and relatively expensive NATO public affairs efforts.
Due to the power of memes and their increasing use by nation-state actors, we developed “meme-hunter” to aid in finding and studying these cultural “genes.” Meme-hunter leverages state-of-the-art deep learning in multiple modalities to detect memes in large social media streams. We have used this to extract memes from social media image streams of up to 5 million images. Using meme-hunter, we have identified and extracted from image streams hundreds of memes associated with known Russian propaganda outlets. We’ve found that memes summarize information warfare lines of effort.
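The multimodal idea behind a tool like meme-hunter can be sketched as late fusion: score an image with one model, its text with another, and combine the scores. The stub models, marker words and weights below are illustrative assumptions; meme-hunter's actual components are trained deep learning networks.

```python
# Late-fusion sketch of multimodal meme detection.
# Both "models" are stubs standing in for trained deep learning networks.

def image_model(image_features):
    # Stub: a real system would run a convolutional network.
    # Here, images with large areas of overlaid text score high.
    return image_features.get("overlay_text_area", 0.0)

def text_model(caption):
    # Stub: a real system would run a trained text classifier.
    # These marker phrases are illustrative assumptions.
    meme_markers = {"when", "me:", "nobody:", "be like"}
    text = caption.lower()
    hits = sum(1 for marker in meme_markers if marker in text)
    return min(hits / 2.0, 1.0)

def is_meme(image_features, caption, threshold=0.5):
    """Late fusion: average the per-modality scores, then threshold."""
    fused = 0.5 * image_model(image_features) + 0.5 * text_model(caption)
    return fused >= threshold

print(is_meme({"overlay_text_area": 0.8}, "me: when the wifi drops"))
```

Fusing modalities matters because neither signal alone is reliable: overlay text appears on infographics, and meme phrasing appears in plain tweets, but the combination is far more distinctive.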
Develop Policies, Teams
In light of these emerging cyberspace threats, Army leaders must develop policy that enables strategic and operational initiative while remaining within national values and the authorities granted to DoD. The Army must also develop special teams with the skills and authorities to access requisite data and application programming interfaces. These APIs enable both defensive action (pulling data to identify threat lines of effort) and offensive action (pushing timely responses and counternarratives). Arguably, intelligence teams (particularly open-source intelligence teams) require API access with pull authorities, while information operations teams require API access with both pull and push authorities.
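The pull/push distinction maps directly onto HTTP verbs, which is why the two require different authorities. A minimal sketch follows; the endpoint URLs and token are hypothetical placeholders, not a real platform's API, and the requests are constructed but never sent.

```python
# Sketch of pull (read) vs. push (write) API access.
# BASE is a hypothetical placeholder, not a real platform endpoint.
import json
import urllib.parse
import urllib.request

BASE = "https://api.example-platform.com/v1"  # hypothetical

def pull_request(query, token):
    """Defensive: pull data matching a query (read-only, GET)."""
    url = f"{BASE}/search?q={urllib.parse.quote(query)}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}, method="GET"
    )

def push_request(message, token):
    """Offensive: push a timely counternarrative post (write, POST)."""
    body = json.dumps({"text": message}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE}/posts",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = pull_request("election disinformation", token="PLACEHOLDER")
print(req.get_method(), req.full_url)
```

An open-source intelligence team would need only the GET-style credential; granting an account the POST-style credential is a separate authority decision, which is the policy point above.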
In today’s ongoing social cyberwar, the Army and the nation are arguably in reaction mode. Finding a solution is difficult since the information warfare problem is in a sort of no man’s land between federal agencies, the government, private industry, political parties and academic disciplines. We must quickly regain the initiative. Our adversaries have adopted Silicon Valley’s motto of “move fast and break things.” The Army must assist in searching for a bold and creative response that restores operational and strategic initiative while staying true to our national values.
The force must be educated about the risks of the open internet. Adversaries craft messages designed to exploit the existing biases of subgroups. Left unguarded, these messages will drive wedges into every fissure possible: between racial and ethnic groups, political groups, religions, and even between the military and the civilians who lead it. As a country, we must guard these existing fissures against external manipulation.
The threat is already manipulating narratives and networks using the BEND forms of maneuver … and is winning. We need to understand these forms of maneuver and develop appropriate countermeasures so we can safeguard the long-term benefits of our democratic values. The Army must understand and enable social cybersecurity.