Every week, it seems, we learn more about the sophisticated Russian digital subversion campaign that targeted the 2016 elections. With so many variables playing into the outcome, we’ll never know how much difference it actually made. Regardless, we can assume that we’ll see similar tactics again, whether from domestic or foreign actors.
Overview of the Russian Subversion Campaign
The Russian campaign included many interlinked elements:
Twitter profiles that linked to false, slanted or inflammatory articles hosted on Russia-connected websites or on third-party sites (such as conspiracy-themed sites).
Twitter bots that would automatically retweet and echo these profiles’ posts on a vast scale, giving them the appearance of popularity and prompting Twitter to distribute them more widely.
Facebook pages that appealed to specific groups in the electorate, including pages that organized local events around hot-button topics like immigration. Russia-backed Facebook pages even organized dueling rallies outside an Islamic Center in Texas in the spring of 2016. The cost to promote them? $200.
Facebook advertising, including posts promoted geographically to people living in states crucial to the 2016 presidential election (Facebook has identified at least $100,000 in spending, but many outside observers believe the true number will prove to be much higher).
Outright hacking, targeting computers associated with DNC officials and the email account of Clinton campaign chairman John Podesta, followed by the public release of selected documents. Russian actors also targeted state-level election systems, though we do not yet know whether they changed any votes.
Russian Disinformation Tactics
When creating their 2016 campaign, Russian actors could draw on a disinformation playbook that long preceded the existence of the internet and social media. At a strategic level, they:
Played to people’s existing prejudices. Much of their work involved stories around themes like Black Lives Matter, which they could distort to inflame prejudices against black people and social justice advocates. Likewise, they pushed stories in black communities highlighting Hillary Clinton’s comments about “super predators” from the 1990s, clearly intending to suppress Democratic votes. In each case, they fed and reinforced their audience’s existing suspicions, acting on people’s emotions rather than on reason to take advantage of divisions in our society. Other content appealed to prejudice against immigrants.
Created fake content that was primed to be shared. False information has a big advantage over the truth, since it can be crafted to be “sexier” than the real thing: the more outrageous a story is, the more likely people are to share it. The content Facebook has presented to congressional investigators is nothing if not inflammatory, and Twitter users appear to have shared more false “news” stories than true ones in 2016. Several fake stories that briefly became media sensations during election season (such as Clinton’s “serious illness”) got a major boost from Russian pages and botnets, showing the power of sensationalized content to jump from social media to broadcast media.
Promoted their content to people likely to respond to it, such as white working-class voters in Rust Belt suburbs and exurbs or black voters in Rust Belt cities.
Facebook provided fertile ground for these efforts, in part because many people pay less attention to the source of information on social media than they might in other channels. The New York Times and a Russian-run page are functionally equal in their presentation on Facebook; only the byline distinguishes them. As for the results, reporter Alec MacGillis observed: “Can’t overstate how many of the swing-state voters I spoke with who had the wildest anti-Hillary notions relied on Facebook for their news.”
Lessons for Advocacy and Campaign Communicators
Political campaigns and advocacy groups must plan for the possibility that they may be the next target. Botnets are for hire, and we may see campaigns or third-party groups dropping a few dollars to help boost negative stories about their opponents. Likewise, much of the Russian 2016 campaign apparently aimed to sow distrust in the electoral system broadly, and campaigns at all levels could be next on their target list if Putin’s troops try to pull off a sequel.
Social media is now a major news source, where information spreads person-to-person. Wild stories can spread virally overnight, and disinformation can muddy the waters enough that voters lose sight of what’s true.
Careful targeting helps content achieve its goals. The right content will only be truly effective if it reaches people who are likely to engage with it, and paid promotion can make sure it gets in front of the right eyeballs.
Amplifiers are vital. From Twitter bots to your crazy uncle who’ll post anything conspiracy-related he sees, content amplifiers are incredibly important for the social distribution of content. As people tune out many voices in our crowded information environment, stories posted by friends and family can still cut through the clutter. This dynamic will reward campaigns that can mobilize supporters to become active social media ambassadors.
Create content that people want to share. People share only what they choose to, so campaigns need to create stories that spark them to pass it along.
Content should pack an emotional punch when possible. Stories that speak to people’s hearts will usually spread farther and faster via social media than cerebral ones.
Content should speak to people’s beliefs about the world. We’re more likely to accept information that fits with what we already accept as being true, and campaign communicators should frame messages to mesh with voters’ deeply held values and assumptions.
Hacking is real, and it’s everywhere. Campaigns must implement basic security measures. For instance, get rid of those general passwords shared by many volunteers immediately (the sketch below shows one way to check whether a password has already leaked), and stop working on some coffee shop’s wifi without running the traffic through a Virtual Private Network (robust systems can cost a few dollars per month for individual users). Digital consulting shops should consider teaming up with security firms to protect their clients in bulk.
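For readers who want to go one step further, here is a minimal sketch in Python that checks whether a password has already appeared in known data breaches, using the free Pwned Passwords “range” API. The API works on hash prefixes, so the password itself never leaves your machine; the function name and the sample password below are illustrative, not anything drawn from a real campaign.

```python
# Minimal sketch: check a password against the Pwned Passwords breach corpus.
# Uses the k-anonymity "range" endpoint, so only the first five characters
# of the password's SHA-1 hash are ever sent over the network.
import hashlib
import urllib.request

def password_breach_count(password: str) -> int:
    """Return how many times this password appears in known breaches (0 = not found)."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "password-check-sketch"},  # identify the client politely
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read().decode("utf-8")
    # The response holds one "HASH_SUFFIX:COUNT" pair per line for every
    # breached password whose SHA-1 hash shares our five-character prefix.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    # A shared "general password" like this has almost certainly leaked.
    print(password_breach_count("campaign2016"))
```

If the count comes back above zero, retire that password on the spot; unique, randomly generated credentials for each staffer or volunteer, ideally managed through a password manager, are the safer default.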
We’ll never know how much difference the Russians made in the 2016 election, but we’re getting a good idea of what they meant to do — undermine our democracy. The next election-hackers may have more modest goals, but they’ll have access to the same arsenal of digital disinformation tools. Don’t let your campaign be a helpless target.
Colin Delany is founder and editor of the award-winning Epolitics.com, a twenty-year veteran of online politics and a perpetual skeptic. See something interesting? Send him a pitch at cpd@epolitics.com.