In a digital ad environment often overrun with content, engagement is vital for campaigns. When views, clicks, and shares skyrocket, it’s often seen as a clear win for an ad effort. But a recent study led by digital strategists at Harmony Labs tells a story about the relationship between engagement and persuasion that’s far more nuanced.
The Meme Factory, as the strategists termed it, created and tested a series of videos on Facebook with the goal of seeing whether the content with the highest rates of engagement was truly the most persuasive. They found that viral ads don’t necessarily translate into persuasive success, and that more views can even mean less persuasive power in some instances.
The findings connect to larger industry-wide discussions about the role of platforms like Facebook, and how best to break out of our individualized social media echo chambers. They also ask us to consider the ways in which digital content shaped the 2016 election and how it will impact this year’s midterms.
C&E sat down with one of the strategists who played a leading role in the design and execution of the study, Nathaniel Lubin, the founder of Lubin Strategies LLC. He’s also the former director of digital marketing at Obama for America and he served as director of the Office of Digital Strategy at the White House.
C&E: Tell us a bit about The Meme Factory. What’s the overarching goal, what’s the style of content you put out, and how are you designing experiments around that?
Lubin: With that project, we raised money to do a short-term set of experiments that were intended to figure out the most efficient, easiest ways to either curate or develop content from scratch that was at the intersection of engagement and persuasion. So we were looking at material that was less polished, a little rougher and easier to produce than a lot of other ways you might do this for a final, commercial structure. We were trying to put together a method where we could learn as quickly as possible, by making those cycle times to get to an interesting result as short as possible. We did that in three sets of experiments.
C&E: How would you sum up the central takeaways of your initial study? What works and how do you know?
Lubin: What we found, unsurprisingly, is that different audiences respond to different kinds of content differently. There’s a bit of a sobering reality there, even for some of the most accomplished people who’ve invested a lot of time in trying to understand this stuff. It still requires actually doing the experiments and checking the data out to really understand that very often our guesses and intuitions are not correct. And one thing we saw was that, and this is a takeaway that’s been reported elsewhere, engagement and persuasion really are uncorrelated. Just because something is higher in one dimension really doesn’t mean that it’s higher in the other.
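The uncorrelation Lubin describes is straightforward to check once each test ad has a score on both dimensions. The sketch below uses invented numbers purely for illustration; none of the figures come from the Meme Factory study itself.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One hypothetical row per test ad:
# engagement = average seconds spent with the content
# persuasion = lift on the intention question, in percentage points
engagement = [12.0, 4.5, 8.1, 15.3, 6.7, 9.9]
persuasion = [0.2, 1.8, 0.4, -0.3, 2.1, 0.9]

r = pearson(engagement, persuasion)
print(f"correlation: {r:+.2f}")
# In this toy data the ads people watched longest produced the least
# lift, so r comes out negative: high engagement did not imply persuasion.
```

The point is only that the two metrics have to be measured and compared, not assumed to track each other.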
That’s probably unique to some extent. You might have a brand where people are engaging more deeply with the content, and that might actually mean they’re more likely to be positively influenced by it. That’s true to some degree here, in theory, but it really seems that it’s measuring a different quality, and you have to come up with smart ways to really analyze both of these different attributes in order to find things that work in both dimensions.
I think we found as a consequence of that, the overall winners in these experiments were neither the most engaging nor the most persuasive. Really, that intersection point can be somewhere else in that realm: someplace where you never would have thought it would be the most effective, unless you did a methodology like this.
C&E: Can you zero in on the metrics that you looked at to gauge effectiveness?
Lubin: On the engagement side, the metric we were most interested in was ‘time spent with the content,’ on the theory that if people don’t spend time with it, it can’t possibly do anything. We also looked at more traditional things, like share rate and the other interaction metrics. On the persuasion side, the methodology we were looking at, working with a product called Swayable and a couple other partners from various places, was trying to compare a control audience that was exposed to a neutral piece of content to a treatment one with our actual assets. Those were: having a few questions that were the ‘intention’ questions, you know, pushing people in a particular direction, understanding how they were moving. If projects were not political, then trying to understand it from an informational standpoint. It was intended to see how a neutral audience compares with that kind of treatment, and there’s a bunch of different ways you can do randomized controlled experiments. This was a good way to do it on a rapid timeline.
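The control-versus-treatment comparison Lubin outlines reduces to a simple difference in response rates: one randomized audience sees a neutral placebo, the other sees the actual ad, and both answer the same intention question afterward. This is a minimal sketch of that calculation; the function name and responses are hypothetical, not drawn from Swayable’s actual product.

```python
def persuasion_lift(control_answers, treatment_answers):
    """Difference in the share giving the favorable answer to the
    intention question, in percentage points: treatment minus control."""
    control_rate = sum(control_answers) / len(control_answers)
    treatment_rate = sum(treatment_answers) / len(treatment_answers)
    return 100 * (treatment_rate - control_rate)

# 1 = favorable answer, 0 = unfavorable; random assignment to the two
# groups is assumed, which is what makes the difference attributable
# to the ad rather than to who happened to see it.
control = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]    # 30% favorable
treatment = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]  # 60% favorable

print(f"lift: {persuasion_lift(control, treatment):+.1f} points")
```

Real tools layer sample-size and significance checks on top of this, but the core quantity being estimated is this treatment-minus-control gap.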
C&E: What role do Facebook’s business model and platform design play here, and what are some things they could do to improve on those fronts?
Lubin: It’s a complicated question. I think they’re working on a lot of stuff, and it’s a rapidly changing environment. On the security side, the information side, they’re obviously investing a lot to improve that. But I think additional investment in ways of doing evaluation which are more native to the platform and tied to whether or not people actually move, in addition to just engaging, would be interesting and useful. Platforms are not shy about that: that’s why they built some of these tools. It’s just a question of helping make them more accessible to more people.
C&E: What are some common misconceptions about what targeting on Facebook can and cannot do?
Lubin: I think the targeting capabilities are very good. That’s why they’re successful at what they do. I think there are also other types of information that can help inform campaigns when they’re done well. Not in the world of stealing data or doing things that are against the rules; just, for instance, using offline polling or other sorts of models to come up with targets that you think would be responsive to the campaign, or need to be compelled to get engaged by a campaign, that can supplement. Another thing for the Meme Factory experiment was that we deliberately chose not to use the targeting capabilities of Facebook. We saw that as a feature for that particular experiment, because we wanted to keep things relatively simple.
C&E: Facebook is often accused of insulating users from different beliefs and perspectives. Can content counteract that polarization?
Lubin: Well, I think it can be used to help or to hurt. Whether an advocacy effort does one or the other depends on who’s doing the advocating and what they’re saying. Figuring out ways to put out material that is born out of the reality of the world, trying to be humble instead of overconfident, presupposing we know the answers, that sort of thing. You know, people may not be exposed to information that would be important to them, and that is a thing the best programs are able to do. But they’re doing that from a perspective, and they have an opinion and a goal that they’re trying to achieve. All kinds of topics and issues where there might be a gap between people’s perceptions and reality, those are the places where an intervention could be helpful, and that’s the point of these kinds of things. The point of the program is to figure out a way to do that efficiently and significantly.
C&E: Are campaigns and organizations making the right investments to understand what works and what doesn’t?
Lubin: It’s a challenge, because this requires having the time and resources to invest in new kinds of approaches. That’s not the right solution for every organization. If you don’t have enough resources, you shouldn’t. I think one thing groups should do is just recognize that it’s more important to come at this from a place of understanding what you know, and what you don’t know. And this doesn’t mean you should be paralyzed about making decisions. It just means you should not be overconfident in knowing the answers in advance. And I think there’s still a lot more room for future investment in digital generally. There’s been a lot more recently, but it’s not there yet to get to the level that it should be, and that needs to change.