Polling, we’re told, is in a state of perpetual crisis. From rising candidates who “won’t be ruled by polls” to the latest hand-wringing over accuracy and effectiveness, the industry as a whole seems to be going through a crisis of confidence.
Why, then, do campaigns of all sizes and candidates at every level listen to what public and private polling has to say? Why are the top pollsters still renowned as gurus?
The collective schizophrenia around polling can seem perplexing, but the explanation is fairly simple: unlike men, all polls are not created equal.
Now, how can you, as a general consultant, campaign manager, or even finance, field, or political director, tell whether the numbers you’re getting are accurate and actionable? As a veteran of campaigns from mayor to governor to U.S. Senate, I have learned a lot about the industry, and the dos and don’ts of conducting polls.
I have had the great pleasure of working with some of the best pollsters in this industry, on both sides of the aisle. I’ve commissioned polls from Anzalone Liszt Grove Research, North Star Opinion Research, McLaughlin & Associates, Wilson Perkins Allen, Public Opinion Strategies, Southeast Research and a host of others.
I make sure to distinguish between polls I’ve commissioned and polls I’ve merely seen. If I didn’t have input on the questions, the methodology, or the sample, I’m probably not going to put much stock in a poll’s results. Conversely, if I know what the methodology and sample are, I can easily tell whether I should trust the results.
What I can say about getting accurate results in this era of cellphone-only households, a diversifying electorate, and auto-dialers that seem to reach only an older demographic is that you have to be willing to pay what might seem like astronomical amounts of money. The cost curve is not bending; costs only keep climbing.
You also have to be willing to keep your length down, your sample size up, and avoid split testing as much as possible. Too often campaigns try to shoehorn everything they can think of into a poll. That makes it too long and unwieldy and contributes to respondent fatigue, which in turn diminishes the quality of the data. You have to define your objectives up front and be willing to make hard decisions about what to test and what to leave out.
That’s where trust in your pollster is key. You have to choose the ones who will honor your input, tell you when you’re right and when you’re wrong, and be a real partner with the rest of the campaign in deciding who your voter targets are, what your message and media choices should be, and whose numbers funders can trust when you’re trying to pry donations out of them.
This isn’t to say the industry isn’t changing. In 2006, none of the polls we commissioned included separate cellphone-only calls. And benchmark polls, because of their length and sample size, have only gotten more expensive. Even so, it’s reasonable to allocate 10 percent of your total budget to a benchmark poll and get back a messaging and strategy roadmap that easily pays for itself many times over. That’s what happened in 2015, when I ran a race with a budget in the half-million-dollar range.
Now, you might be thinking: “How can you justify spending almost 10 percent of your budget on a poll when you could get an auto-dialed push-button poll with a 1,000-plus sample for a tenth of that cost?”
My answer is that when we did our poll briefing, Zac McCrary, of Anzalone Liszt Grove Research, told me that we started around 51 percent, and if we executed along the lines of the message points, and caught a few breaks, we could expect to get as much as 56.5 percent of the vote. We got 57 percent. We also upped our sample size, and called a full 30 percent of respondents on their cells.
They provided crucial input not just on what the horse-race numbers showed, but also on which messages to highlight and where we were soft in our needed coalition. Moreover, the poll gave the candidate and campaign chairman the push they needed to raise and spend the money necessary to fund a strong communications and GOTV plan. We knew what we had to do, and we did it, trusting the accuracy of our polling.
In short, polling isn’t getting less accurate. It isn’t “impossible to poll” the African-American or Hispanic communities. Real pollsters don’t “cook” polls or start with the intention of showing their candidate ahead.
The best pollsters, like Dan Judy and Jon McHenry, Glen Bolger, or John Anzalone and Jeff Liszt, know that you’re going to make decisions that affect the trajectory of the campaign, your own career and that of your candidate, and their own ability to continue getting top-tier clients. Their incentive is accuracy, not ego service.
David Mowery is the founder and president of Mowery Consulting Group.