Get Serious About the Science of Influence | Proceedings - June 2024 Vol. 150/6/1,456
https://www.usni.org/magazines/proceedings/2024/june/get-serious-about-science-influence
Two Information Warfare Types
Broadly, there are two information warfare types: command-and-control (C2) manipulation and psychological influence. The former refers to improving and defending one’s C2 while disrupting an adversary’s—depriving it of information or feeding it false or misleading data. This is typically accomplished through electronic signature control—hiding one’s signature, spoofing it, and finding and fixing enemy signatures. It is a back-and-forth of signature detection, jamming, antijamming, counter-antijamming, and so on. Cyber tools can defend and attack C2 as well, even through something as simple as causing a power outage. Importantly, this type of information warfare is used primarily at the tactical level and requires no special insight into a target’s psychology or cognitions. It may be informed by knowledge of an adversary’s institutional decision-making processes; tactics, techniques, and procedures (TTPs); and kill chains—those processes that take place among individuals and across units as opposed to existing inside a single person’s head.
The other type of information warfare—psychological influence—is used primarily at the strategic and operational or campaign levels and, unlike C2 manipulation, does rely on information about the psychology and cognitions of individuals or groups (e.g., local populations or military leaders). This type of information warfare is very much in vogue.
China, Russia, Iran, and North Korea conduct information warfare externally and internally—against their own populations. In both directions, they promote truths and falsehoods that benefit them and restrict those that could harm them. This is accomplished through heavy internet controls and restrictions on speech and the press. Newspapers and websites are shut down or nationalized. Russia’s detention of Wall Street Journal reporter Evan Gershkovich and China’s arrest and detention of Apple Daily newspaper owner Jimmy Lai are two examples. Beijing seized and shuttered Apple Daily, one of the last independent media publications in Hong Kong.9 Excluding journalists, China jails more writers than any other nation, according to PEN America’s 2023 Freedom to Write Index. It is followed by Saudi Arabia; Iran and Vietnam (tied for third place); Israel (including the occupied territories); and Russia and Belarus (tied for sixth place).
These tactics are pursued internationally as well, though with less success. To influence international narratives, authoritarian states turn to clandestine influence, subversion, and sabotage. Beyond social media, they use international law, money, and politics to shape narratives—just as they use the same methods to mold the rules-based order to their advantage.
In competition, the manipulation of news through social media, state-owned media, and financially compromised foreign media appears mostly ineffective.10 While these operations are large in size and number, their returns are few. Most followers of Chinese-run Facebook campaigns turn out to be Chinese-owned or -purchased bots.11 Alarmists warn that AI-generated news and deepfakes will soon be indistinguishable from real news, and that populations will be easily fooled or come to doubt that truth can be discerned at all—though it seems at least as likely that consumers will become more skeptical and discerning. Deepfake detectors are already available. Indeed, U.S. adversaries have invested so heavily in these ineffective influence methods that it bears considering whether we should be spending tax dollars to counter them.
Two cases often claimed as evidence of disinformation’s harmful effects are the 6 January 2021 attack on the U.S. Capitol and COVID-19 vaccine hesitancy. Those examples have at least face validity, and though the evidence and arguments for them are beyond the scope of this essay, they raise questions that demonstrate the complexity of determining the conditions under which influence operations may work and the distinct causal contributions of multiple independent variables. For example, if disinformation about the 2020 presidential election causally contributed to the 6 January attack, to what degree was the effect attributable to the disinformation having a domestic, rather than foreign, source? Similarly, in assessing the effects of COVID-19 mis- and disinformation, to what degree does it matter that most interactions with misinformation happen when people seek out views with which they already agree?
None of this speaks to the potential consequences of combined attacks pairing disinformation with kinetic effects. Even if disinformation’s value is short-lived, it could be of outsized benefit in a crisis or the chaotic opening hours of a conflict. Today, a deepfake video of the U.S. President saying a ransomware attack has compromised coastal electric grids might do little harm before it is quickly debunked. The same video released simultaneously with cyberattacks causing blackouts at the start of an actual conflict could cause substantial confusion and disarray—even delaying a defensive military response.
Russia’s successful use of influence activities in Crimea—home to a large population of Russian Black Sea Fleet sailors and veterans already inclined to agree with Russian narratives—and the consensus among researchers that these are the circumstances under which influence operations can have an effect suggest that some subpopulations in the United States may be susceptible to specific instances of foreign malign influence. “Russia is our friend,” for example, was among the slogans white supremacists chanted at Charlottesville.18 But while there will always be some U.S. citizens and service members predisposed to believe adversarial messaging, mis- and disinformation interventions, such as media literacy education, are unlikely to benefit them. Theirs is a problem of values—not of defending against malign information.
Studies skeptical of the efficacy of mis- and disinformation operations do not show that such operations have no effects. They show, rather, that we do not yet know whether they are, or can be, effective—and, if they can be, under what circumstances. Answering those questions requires rigorous, controlled studies that both detect effects and measure effect sizes.
To date, the few effects studies that exist suggest that attempts to leverage psychological influence against foreign audiences have limited effectiveness and depend largely on an audience’s prior attitudes toward the messenger and the message. Ukraine, for example, has successfully lobbied for international aid and support, but that success is limited to countries that already favored Ukraine—or at least opposed Russia.