Did you check out Dean Phillips’s audio AI chatbot? Kudos to you if you did, because you must have moved fast. Just a couple of days after it showed up in the press, artificial intelligence platform OpenAI blocked the app and suspended the developer’s access.
Turns out that OpenAI doesn’t allow campaigning via its technology, something that people in the political tech world have known since last year and that the company itself had reiterated earlier in January. Though the developers will likely try to replicate the bot using another AI tool, any positive press its unveiling might have received has now been buried in “Dean Phillips AI app banned” headlines. Oopsie!
Actually, I’d argue that the problems with the Phillips chatbot stretch beyond the violation of rules its backers should have known about. Several of those problems shed light on issues that often arise when non-political techies volunteer to help political campaigns.
Don’t get me wrong: tech volunteers help campaigns every cycle, often with excellent results. But a few problems tend to show up repeatedly when they do, and the Phillips AI chatbot illuminates several of them. With an eye toward getting the most value out of any tech volunteer time our campaigns may recruit, let’s start with the biggest issue of all. Why build the thing in the first place?
Don’t Solve the Wrong Problem
Conceptually, an AI that talks like a candidate could have valid applications. It might let vision-impaired people interact with the campaign, for example, if they’re unable to use a more-conventional text chatbot. Moreover, some people might feel more connected to a candidate once they’ve had a virtual conversation with him or her, although the “artificial” part of “AI” may interfere with that process.
But Phillips’s biggest problem isn’t helping people listen to information about him and his policy positions that most could read on his website anyway. It’s getting people to pay attention to him at all, a problem unlikely to be solved by an online audio interface, no matter how life-like it sounds.
When I was first drafting this column, I didn’t see a Google search ad from his campaign when I ran a query on his name. But I did see his ads on more-specialized queries like “Dean Phillips website,” though they were not appearing for me a few days later. And, he did run a whole lot of ads on Facebook and Google properties specifically targeting New Hampshire in the weeks before the Democratic primary, so it’s not as though his campaign isn’t trying to find new supporters online.
Perhaps that need for attention inspired the idea to create the chatbot? Any AI tool is catnip for reporters right now, and you can see how the tech folks backing his campaign — ironically including Sam Altman, CEO of the same OpenAI that eventually blocked the Phillips-bot — might see a relatively novel AI application as a way to get attention. But in politics, the presentation usually matters at least as much as the substance. In this case, headlines about the chatbot’s ban pushed any positive coverage far down the page. As soccer fans would say, the whole project turned into an own goal.
Make Sure They Know the Rules
The Phillips chatbot is so easily mockable, in part, because its failure was so avoidable. OpenAI has not been shy about its feelings about political campaigns using its output. A few seconds of Googling would have turned up the platform’s political ban, which brings us to an issue that comes up a lot more often than “let’s build a chatbot.”
Too often, non-political techies have never heard of the rules political campaigns have to live by. And when you don’t know that the rules exist, you don’t know when to ask the critical “can we actually do that” questions.
Disclaimers, disclosures, expenditure tracking, crucial dates when a campaign must, may or cannot take certain actions — all of these have tripped up amateur political techies in the past. Experienced campaign professionals have seen or heard enough horror stories to understand that an honest mistake can hand an opponent material for an attack ad, or draw the attention of the FEC — but most volunteers won’t have heard those stories. Anyone managing tech volunteers should set up a clear approval system for projects that smell like they have a chance of running headlong into a political or legal minefield.
Violating Facebook’s or Google’s own rules for campaigns isn’t as likely to be fatal, since setting up a YouTube channel most likely won’t spark a finance violation. But tech platforms have plenty of political rules and requirements that have tripped up or stalled many a project entirely, and campaign managers should make sure that volunteer techies know to look them up far in advance. Move fast and break things, yes, but don’t break something at the last minute that you absolutely need on Election Day.
Keep Them On Task
Finally, the Phillips chatbot illustrates an issue that comes up all around the tech world, political or not. Techies love shiny objects. Programmers are notorious for going off and building the thing they want to build, whether or not it’s precisely what you need them to build. I’ve heard of political tech volunteers having to be given what are basically make-work projects to keep them busy, because the projects the campaign actually needed didn’t interest them enough.
More than a decade ago, I heard a presentation by an Obama 2012 staffer who’d overseen the campaign’s volunteer tech office in the Bay Area. A tech-forward political operation would seem to be a natural fit for tech volunteers, and hundreds of programmers and engineers contributed in some way over the life of the project.
They didn’t actually end up building much that the campaign needed, though. In part, that was because Obama’s paid staff included a huge contingent of techies already. But the former supervisor noted the problem of keeping people on task, along with the danger that someone essential to a process might simply disappear, as volunteers often do.
So please, welcome tech volunteers onto your campaign. They may help you streamline tasks, format documents, set up ad campaigns, manage voter data and much more. But don’t let them do it on their own. Employ them with care, lest you end up with a chatbot that’s not chatting with anybody.
Colin Delany is founder and editor of the award-winning website Epolitics.com, author of “How to Use the Internet to Change the World – and Win Elections,” a veteran of more than twenty-seven years in digital politics and a perpetual skeptic. See something interesting? Send him a pitch at cpd@epolitics.com.