The amount of information and content that the average consumer receives online and on our phones today dwarfs what previous generations faced, and you'd be hard-pressed to find any marketing expert who anticipates this trend abating.
In fact, an estimated $107 billion was spent on digital advertising in America alone in 2018. Political campaigns last cycle are estimated to have spent between $900 million and $1.8 billion of that. This is clearly a wide range, but the figure is tough to pin down given the way expenditures are reported. For comparison, digital spending in 2014 was estimated at about $250 million.
The challenge in the political world is how to measure what’s being done and the effectiveness of that effort. Digital consultants have their metrics such as impressions, clicks, et cetera. But are there other ways to understand the reach and effects of the money being spent?
While campaigns have been a little slow to adjust to digital, the same can be said of the political research world. Traditional survey research isn't set up to measure small changes. Back in the days of three major networks, when 1,000 gross rating points on those stations all but guaranteed reach across a country or state, a survey before and after a TV ad's run would suffice. But digital advertising works much differently, and research needs to adjust accordingly.
Understanding how humans react to information in our digital age is an ongoing effort across the globe, one that admittedly is having a tough time keeping up with changes in usage and behavior. That said, this is an incredibly exciting time in the areas of research and measurement. It's allowing us to deepen our understanding of what makes people tick and, more importantly, what makes people change their minds after being exposed to ads.
My firm, for instance, has developed a tool to track changes in opinion through online panels that are asked to log in about once a week to answer a short series of questions. The result is a longitudinal data set that allows us to get an understanding of how opinions are changing over time.
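To make the idea concrete, here is a minimal sketch (not the firm's actual tool) of how a longitudinal panel differs from one-off polling: because the same hypothetical respondents answer the same question each week, week-over-week shifts reflect real movement in opinion rather than differences in who happened to be sampled. The respondent IDs, scores, and helper function below are all illustrative assumptions.

```python
# Sketch of longitudinal panel analysis with hypothetical data.
# Each row: (respondent_id, week, support_score on a 0-10 scale).
from collections import defaultdict

responses = [
    ("r1", 1, 7), ("r2", 1, 4), ("r3", 1, 6),
    ("r1", 2, 8), ("r2", 2, 4), ("r3", 2, 7),
    ("r1", 3, 9), ("r2", 3, 5), ("r3", 3, 8),
]

def weekly_averages(rows):
    """Average support score per week across the panel."""
    by_week = defaultdict(list)
    for _, week, score in rows:
        by_week[week].append(score)
    return {week: sum(s) / len(s) for week, s in sorted(by_week.items())}

averages = weekly_averages(responses)

# Week-over-week deltas show the direction and size of opinion shifts,
# which is the kind of movement a before/after survey would miss.
weeks = sorted(averages)
changes = {w: round(averages[w] - averages[prev], 2)
           for prev, w in zip(weeks, weeks[1:])}
```

Because every respondent appears in every wave, the deltas in `changes` isolate genuine opinion movement; a real panel would of course also weight respondents and handle attrition between waves.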
Since we now live in a world where information is changing by the minute, it’s unreasonable to expect your digital ads to be the one and only influence on opinions. And it’s also unreasonable to think that ads will behave similarly with outside events changing. Indeed, external events can not only alter opinions but also how digital ads are viewed by their intended audience. Take one of the more partisan events of the past cycle: the Brett Kavanaugh Supreme Court confirmation hearings.
The graph below shows the percentage of Democratic and Republican voters who offered very strong support for their respective party in the generic ballot for the House, broken down by different time periods around the Kavanaugh hearings. For example, before the hearings began in early September, 76 percent of women who ended up voting for a Democratic candidate gave very strong support for Democrats. At the same time, only 64 percent of women who voted Republican in November gave very strong support to the Republican Party candidate.
As the hearings progressed and became more contentious, that level of strong support increased (as it did with Republican men). By the time Kavanaugh was approved by the Senate on October 5, the percent of Republican women who strongly supported a Republican candidate was up 8 percentage points.
But it wasn’t just connections with the two parties that changed; voters’ views on their likelihood to vote also shifted.
The trendline above tracks two groups from July through Election Day: those who reported voting for a Democratic candidate post-election and those who voted for Republicans. Vote likelihood levels were steady from July through late August. Once the Kavanaugh hearings began, volatility increased dramatically. Eventual Democratic voters’ enthusiasm initially went up and then dropped quickly through the final confirmation vote in early October. Future Republican voters’ enthusiasm essentially declined throughout, including an especially precipitous drop around the time Sen. Jeff Flake’s actions led to a week-long delay in the vote.
Once the vote was finished and Judge Kavanaugh became Justice Kavanaugh, Republican voters immediately reclaimed their enthusiasm, eventually returning to a similar level that they reported in the summer. It took a little longer for Democrats to find their footing, but a late surge took their chance of voting a couple points higher on Election Day than it had been throughout the summer and fall.
This example shows what’s possible to measure these days, but it also carries an important lesson. Research should not only be there to measure but also to inform. And now that this can be done almost in real time, a campaign’s digital team can become much more effective. We know that people will react differently to messaging based on their mood. Television ads are, and always have been, too expensive for most campaigns to run multiple versions or multiple ads at any given time. It’s also difficult to adjust TV quickly to outside events beyond taking down an ad or putting up a new one that has already been shot, edited, and sent to the stations. Digital is much more cost effective and much easier to adjust as events change.
In the past, campaigns have been relatively siloed: paid communications teams do their thing, and the research team tries to measure the effect. But we are now entering an age where research should not just measure but inform throughout the process.
As more and more of a campaign’s budget moves from TV to digital, the possibilities for increasing effectiveness are exciting to think about. It also means moving away from measurements like impressions and shifting the focus to where it should be: how we can change the hearts and minds of our audience.
Stefan Hankin is founder and president of Lincoln Park Strategies, a Washington D.C.-based public opinion firm. Follow him on Twitter at @LPStrategies.