Adversaries in cyberspace have come to realize the outsized impact that relatively cheap disinformation campaigns can have, according to Renee DiResta of the Stanford Internet Observatory, and companies and policymakers in the coming months must be on guard for “hack and leak” operations, cyber attacks on voting systems, infiltration of groups of U.S. citizens, and amplification of messages that divide Americans.
DiResta is urging policymakers, security pros, companies and the public to be aware of how social media infiltration by state actors intersects with the nation’s information systems, while explaining key differences in disinformation campaigns orchestrated by China and Russia.
She spoke Thursday on “Hacking Public Opinion,” on the final day of the all-digital Black Hat USA 2020. Cyber researcher Matt Blaze delivered the keynote on Wednesday, discussing election security challenges including securing software.
Renee DiResta, Research Manager, Stanford Internet Observatory
DiResta offered a detailed examination of the nexus between information systems and a social media-fueled disinformation threat, as well as a closer look at the very different campaigns being mounted by the Chinese and Russian governments.
The consolidation of audience, personalized targeting and “gameable algorithms” have created the environment for actors to launch low-cost, highly effective disinformation campaigns on platforms such as Facebook and Twitter, DiResta said.
Twitter on Thursday announced a new policy on labeling government accounts and state-affiliated media accounts.
Twitter said in a blog post: “Our focus is on senior officials and entities that are the voice of the nation state abroad, specifically the account categories listed above. Labels will only be applied to accounts from the countries represented in the five permanent members of the UN Security Council: China, France, Russian Federation, the United Kingdom, and the United States. For transparency and practicality, we are starting with a limited and clearly-defined group of countries before expanding to a wider range of countries in the future.”
DiResta said disinformation campaigns take four forms: distraction, persuasion, entrenchment – “the creation of social-media groups dedicated to an identity” – and division.
“The model works because the message resonates with groups of people,” she said, leading to a “critical mass” on social media that’s ultimately picked up and amplified by the mainstream media. “At Stanford we’re looking at points in the kill chain where we can stop it,” she said, noting the stage where actors create a persona, as an example.
Facebook is banned in China, DiResta pointed out, but the Chinese government and aligned entities have created “massive Facebook pages” that spread the Chinese Communist Party’s message to an outside audience.
She said China “began a full-court press” in January to change the global narrative on COVID-19 using all of “the overt and covert means” at its disposal to distract from and discredit criticism of its response to the outbreak.
“But they didn’t do a very good job” of getting real people on social media to pick up and share material, DiResta said, calling China’s “sloppy social media activity” “terrible” at driving engagement.
However, DiResta said, “They want to get better” and are adding to their skill sets. “I expect them to improve. We need to study the content and understand the TTPs [tactics, techniques and procedures], so we know when an attack is coming.”
At the same time, she said, “we should recognize that their COVID operation was largely a failure.”
Russia may be spending less, achieving more
Russia, on the other hand, spends a fraction of what China spends on such operations but seems to be getting much more bang for the buck, DiResta said.
That’s attributable to small things such as creating snappy videos that go viral, as well as to the underlying purposes of the two countries’ efforts. China is trying to create a positive view of its country and government, she said, whereas “Russia doesn’t do that.”
The Russian government is interested in identifying groups in the United States and the issues that divide them, she said. Plus, “Russia also goes out to hack,” she said.
The Russian military intelligence agency GRU will hack a public official, as it did in targeting officials in Hillary Clinton’s 2016 campaign, DiResta said, and will “transmit the collateral” to “fake personas” created by entities such as the Internet Research Agency, which turns the content into “memes” on social media. That in turn gets picked up by Russian media and then seeps into Western media.
“That’s how [adversaries] use the system most effectively,” DiResta said. Stories trend on social media and ultimately exploit divisions in society “using vulnerabilities in the information ecosystem,” she said.
“The answer isn’t purely technical or limited to social media,” DiResta said. She called for steps such as more red-teaming to test systems and greater attention to how information operations can affect such systems. – Charlie Mitchell (email@example.com)