How a Direct Mail Test in New Jersey Helped Sway Voters
Not all direct mail messaging is created equal.
That’s one of the key takeaways from a new study by the Republican-aligned Center for Campaign Innovation, which argues that campaigns that fail to test different creative aren’t playing it safe – they’re wasting money.
The experiment, which focused on New Jersey voters who cast ballots in 2024 but didn’t vote in the previous midterm election, tested different mail treatments across three groups of voters: one featuring a standard “get-out-the-vote” reminder, a second spotlighting an election integrity message and a third that referenced voters’ past election participation and urged them to continue their voting “streak” from 2024.
A fourth set of voters was set aside as a control group that didn’t receive any communications in the experiment.
The study found that voters who received direct mail referencing their past behavior – their voting streak – turned out at a higher rate than any other group. According to CCI’s Executive Director Eric Wilson, the experiment serves as proof that creative testing in campaigns isn’t a luxury. It’s a necessity.
Wilson spoke with Campaigns & Elections this week about the experiment and what campaigns can learn from it:
C&E: First off, why look at direct mail as opposed to, say, digital creative?
Wilson: So, a few reasons we wanted to study direct mail. One, it still is a significant line item for a lot of campaigns and outside organizations. And so, even finding small gains can really make a big difference.
I think a second thing is that, for young people, direct mail is actually one of the least crowded channels that they have. So when it comes to reaching younger voters, mail should be on the table.
But one of the practical reasons we chose it for this experiment is because with mail, we know exactly who gets a piece and who doesn’t. It allows for better segmentation of our treatment versus our control. Obviously, I don’t know if someone saw the piece or not, but we can say with certainty that one group was targeted with this creative versus another.
C&E: What’s your big takeaway from this study?
Wilson: There are two layers here. I think at the meta level, there’s just a confirmation of the need to keep testing. We saw that one creative performed better than the other creatives, and even within our own small experiment, that would translate into roughly 200 added votes. And if you scale that to bigger groups or bigger states, it can really add up.
So the first takeaway is that testing matters. It’s one of the few areas that campaigns can eke out gains in these really tight, competitive races, where a lot of people are still focusing on volume or on quantity. The actual creative – the actual message – is one of the dials that’s available to us, and we have the ability to test it now.
The second takeaway is that when it comes to getting out the vote, messages that are personalized to someone’s past behavior are more effective than ideologically driven appeals or generic get-out-the-vote messages. These are people who are infrequent voters, but often that’s because something comes up or something gets in their way. And if you add some more motivation, you may get some of them off the sidelines and turn them out.
C&E: Was there anything that surprised you in the findings?
Wilson: I was surprised at how high turnout was with our control group. You know, we specifically chose an audience that had 2024-presidential-year history, but didn’t vote in a midterm. It turns out that those people, once they start voting, vote at very high rates. Turnout among the control group was 65 percent, and some of that has to do with the fact that it was a really competitive race. A lot of money was spent on it. So that’s the thing that jumped out at me. We obviously didn’t expect that, because it makes it harder to detect smaller effects in an audience.
C&E: One thing that stood out to me was that the study found that, in some cases, the control group — those people who didn’t receive any mail — outperformed the groups that received the generic GOTV message and the election integrity message. Any idea why?
Wilson: Well, we’ve certainly seen that in other tests before – where you get a backfire effect with a piece of creative. But in this instance, we don’t want to read too much into that, just because some of the calculations you make around statistical significance probably don’t support that.
But what we look at is: directionally, do patterns hold up? And we saw consistently that one message outperformed the other messages. And that’s really where our confidence lies.
C&E: What insights can you share on which messages resonate with which voters? Are there any common themes?
Wilson: A couple of things stand out. The ideologically driven message – in our case, we were doing election integrity – isn’t necessarily motivating for infrequent voters. If they were ideological, they would already be voting. That’s not what motivates them.
It sure seems like reminders of their past behavior – being consistent with their previous actions – play into this. It’s the same psychological principle behind Wordle streaks, right? And it’s effective, especially for young voters. Young voters are infrequent because they just haven’t built that habit of voting yet.
Now, that behavioral messaging started to break down once we got to voters 76-plus [years old]. These are people who have been voting in elections for, you know, 50-plus years, and so they have pretty well-worn voting habits. But with younger voters, who don’t have a habit, these little nudges can make a big difference.
