The technology to create convincing fake video clips is moving fast. The tools are young and the results still relatively easy to spot, but that won't last. Soon enough, artificial intelligence (AI) and pattern recognition software will come together in systems able to create realistic moving images that fool our eyes, ears and minds.
In fact, I suspect that a fake audio clip may change the dynamics of a political race before fake video. Imagine the fictional equivalent of a Mitt Romney “47 percent” clip, or the recording of Barack Obama's "clinging to guns or religion" conversation. Low production quality need not be an obstacle for someone manufacturing audio, since a scratchy recording may actually sound more authentic than something slick and produced.
Of course, falsified video isn't entirely new: just ask ACORN or Planned Parenthood, which were both victims of selectively edited videos released by James O'Keefe's Project Veritas. Those hit pieces were limited by the availability of source material, since the Veritas editors could only work with words that someone had said. Regardless, they led to the end of ACORN.
The new forms of fake video and fake audio won't share this drawback, since their creators can dream up whatever they want and make someone say it — or at least, make them sound as though they did. This technology has the potential to be truly dangerous. Putting the wrong words in someone's mouth, believably, could start a war.
Fake video and audio thus join fake news (real fake news) as serious threats not just to individual campaigns, but to our political system as a whole. What can campaigns and consultants do? Ultimately, I suspect we'll need some kind of industry-wide, or even nationwide, system to detect outright fake content and then flag it or block it.
In practice, we won't have any such comprehensive system in action for 2018. Facebook may be instituting changes to track and reveal political advertising, for example, but a fake-and-juicy video clip could spread to millions of people organically without anyone ever paying a cent to promote it. DSPolitical's Antidote feature will help a campaign target voters who may have seen slanted or manufactured stories on particular websites — useful if you're targeted by the likes of Alex Jones. But tools like Antidote almost certainly won't work on native Facebook content spreading organically. What should campaigns do when a false story takes on a life of its own?
Prepare
Supporters are a campaign's best defense. Build your email list and social media followings with an eye toward rapid-response mobilization. Don't just treat your list like a cash machine, because you may need your supporters to step up and speak out for you in their social circles — something they may be less likely to do if you've abused their inboxes with too many fundraising asks.
Likewise, build relationships with local politicians, activists, journalists, bloggers, editorial boards and other influencers before you need them. If someone tries to smear you with a fake video, it helps to have trusted names to vouch for you, or at least to highlight your side of the story.
Monitor
Set up Google Alerts on your candidate's name and your opponents' names, and possibly on search terms related to hot-button local issues or activists. Your digital consultant may provide social media monitoring, and you can also use free tools like CrowdTangle (to monitor many individual Facebook Pages at once) and Hootsuite (to monitor search terms on Twitter) to listen for dark whispers on the social web. Realistically, local activists and politicos may well run across fake content before technology catches it — another reason to build those relationships, so that you'll have many sets of ears listening.
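Google Alerts can deliver matches to an RSS/Atom feed rather than email, which makes automated monitoring possible. As a minimal sketch (the feed snippet, candidate name and URL below are placeholders, not real data), a short script can parse such a feed and surface new mentions for a staffer to review:

```python
import xml.etree.ElementTree as ET

# Google Alerts feeds use the Atom syndication format.
ATOM = "{http://www.w3.org/2005/Atom}"

def parse_alert_entries(feed_xml):
    """Return (title, link) pairs from an Atom feed string."""
    root = ET.fromstring(feed_xml)
    entries = []
    for entry in root.findall(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="")
        link_el = entry.find(ATOM + "link")
        href = link_el.get("href") if link_el is not None else ""
        entries.append((title, href))
    return entries

# Placeholder feed; a real feed URL comes from the alert's
# "Deliver to RSS feed" option in Google Alerts settings.
SAMPLE = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Google Alert - Jane Smith</title>
  <entry>
    <title>Fake clip of Jane Smith circulates</title>
    <link href="http://example.com/story"/>
  </entry>
</feed>"""

for title, link in parse_alert_entries(SAMPLE):
    print(title, link)
```

A campaign might run something like this on a schedule and push new entries into a shared channel, so fast-moving fake content gets human eyes on it quickly.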
Respond
The great danger in responding to an attack is that the process of rebutting a smear can lodge it more firmly in people's minds. Rather than taking on fake news or fake video point by point, grab the moral high ground and attack the fact that you're being targeted unfairly. Try to get people talking about the fact that someone is after you with fake content, and enlist your supporters directly in the cause. The war is likely to be won over barstools and dinner tables, so make sure you activate your grassroots via email, social media and person-to-person outreach.
To win in the air, try "flooding the zone" with positive content about your candidate, particularly in the medium in which you're being attacked. For instance, if you're the target of a video smear on YouTube and Facebook, you might post several of your own videos of supporters talking about how great you are and why they're voting for you. Or post issue-focused videos featuring your calm and reassuring candidate talking with voters.
In the end, I suspect that we'll all simply have to become much more skeptical about what we read or see online, particularly if it jibes perfectly with what we want to believe. Publishers like Facebook and Google will develop tools to find and filter fake content, but similar technologies will surely find their way into the hands of dictators and others who want to quash content not because it's fake but because it threatens them. If you think fake video is bad, wait for the response. Welcome to a brave new post-truth world.
Colin Delany is founder and editor of the award-winning website Epolitics.com, a twenty-year veteran of online politics and a perpetual skeptic. See something interesting? Send him a pitch at cpd@epolitics.com.