How should companies measure creativity?
It’s time for a new approach to idea testing.
Tyler England has been measuring marketing ideas for decades. As the Vice President and Head of Customer and Marketing Insights/Strategy at HP, he is charged with understanding every detail concerning HP’s customers and finding the insights that turn customer loyalty into sales.
When I spoke with him about how he handles this process, he described his balanced approach: “When measuring creative ideas, the debate is often around which system should weigh heavier, the more automatic and emotional system or the more rational and deliberate system.”[i]
Over the years England has spent studying creative measurement, he’s learned that judging the success of creative ideas isn’t always black and white. Even with a tremendous amount of data and modern techniques, the picture can quickly get muddy. It turns out that measurement requires as much intuitive thinking as coming up with the original ideas did.
Here’s how England explains it: “Stakeholders often want a definitive answer on a certain piece of creative. They assume that because we conduct a research project, we should have all the answers. But the true answer is that it depends. There are so many factors to weigh, and people don’t always respond to research in a rational manner. They hide their emotions. To get clearer answers, we have to nail both the System 1 and System 2 thinking of our customers. System 1 helps with engagement and memory, while System 2 helps translate it all into action and purchase. And arguably, System 1, the emotional system, is increasingly the gatekeeper to System 2.”
England understands that measurement isn’t limited to logic and rational thought. You don’t just ask a question and record the answer. Measurement is emotional as well, so we have to use other techniques, grounded in behavioral economics and human psychology, to understand people’s emotional responses.
He’s used every method available over the years, but lately he finds that the best research companies use multiple measurement methods to understand both systems.
Millward Brown and BrainJuicer are two examples of research firms that partner with HP because they have proprietary methods for measuring both the emotive and the rational systems. They base their testing on many of the same insights shared by behavioral economists.
“Their tests are designed to isolate the creative, rather than throw it directly into an in-market A/B test, which can muddy the results with other executional and targeting factors. For instance, I like BrainJuicer’s use of an emotive scale based on facial expressions to capture a System 1 response. It may not be perfect, but it forces a response a bit more toward an emotional reaction. Then they follow it up with a logical survey with questions to capture more of a rational response.”
In the past, much research ignored the emotional half of our responses. It tried to stay as rational as possible, with very direct and precise questions. The problem, as with focus groups, is that respondents put on a different face when asked to answer: they slow down and use only their logical brains to work through the research questions. But today, now that we better understand how we think, we work toward results that question both systems, the emotional and the rational.
We need more measurement and research strategies that understand and follow this balanced approach. Unfortunately, because of cost, this type of research is typically reserved for the high-cost, higher-risk anchor creative of a campaign.
But that doesn’t mean we can’t apply the learnings from those big research projects to everyday testing with the same balanced approach. Overall, England feels that research testing should focus on metrics around four universal truths in order to find the right balance for each brand.
“Working with some of the lead thinkers and agencies on this subject, combined with observing this stuff for years, there are several universal truths that need to be followed. These four are: Emotive, Convincingness, Connection to the Product, and Branding.”
Emotive helps with breakthrough and is key to retaining a memory of the experience. Convincingness is the rational side of things; it helps with differentiation and understanding the benefits. Connection to the Product means making sure the product or service is baked into the message, which drives purchase interest and ties both the emotion and the logic to the brand memory. And finally, Branding matters because it gives you a strong anchor, a location to file the memory; without it, the brand doesn’t earn the short-term and long-term memory credit that ultimately translates into loyalty.
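As an illustration only, here is a minimal sketch of how those four truths might be rolled into a single creative score. The weights, dimension names, and numbers are hypothetical assumptions, not England’s or HP’s actual model; the point, as he says, is finding the right balance for each brand.

```python
# A minimal sketch of rolling the four universal truths into one creative
# score. Weights and scores are hypothetical illustrations, not HP's model.
WEIGHTS = {
    "emotive": 0.3,             # breakthrough and memory retention
    "convincingness": 0.3,      # rational differentiation and benefits
    "product_connection": 0.2,  # product baked into the message
    "branding": 0.2,            # anchor for filing the memory
}

def creative_score(scores):
    """Weighted blend of the four dimensions, each scored 0-100."""
    return sum(WEIGHTS[dim] * scores[dim] for dim in WEIGHTS)

ad = {"emotive": 82, "convincingness": 61, "product_connection": 70, "branding": 88}
print(round(creative_score(ad), 1))  # 74.5
```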
It’s exciting to see companies put an equal amount of emphasis on both sides of the equation in order to get more human results. And when we tie these universal truths back to neuroscience and how we consume content, create memories, store memories, and recall those impressions, it gives us a more balanced approach to really understanding how effective creative ideas can be.
Agile versus waterfall.
Some may question why we spend any resources on creative measurement at all. After all, isn’t sales or the bottom line the ultimate judge of an idea’s success?
And with the advances in data-driven marketing, do all these other methods still have a purpose? Can’t we just measure our ideas through the many types of digital and customer analytics and get a real answer on what is effective and what isn’t?
The answer is: it depends. Certainly there is no debate that sales are the final word on a company’s success, at least for businesses seeking a profit. But when it comes to measurement, there are many factors to consider, and an understanding of all the methods is important. Otherwise you are locked into one strategy and one point of view on effectiveness.
For example, there are many companies that want insight on an idea long before it becomes a sales statistic. They may have long sales cycles, and by the time a sale happens, they’ve already made the gamble. So they are looking for insight early in the process. They want quick insights so they can adjust the plan.
If the recent lessons of web production have taught us anything, it’s that an agile approach succeeds where a waterfall approach struggles. With an agile approach, you try something, test it, analyze the results, discover the insights, then go back and try some more. You’re constantly iterating, tweaking, prototyping, and learning as you go. With a waterfall approach, you build the whole thing and then see how it turns out; you may find you’ve spent a ton of time and money working on the wrong features. Looking only at sales results is a waterfall approach: often it’s too late to do anything about it, and you haven’t learned much about what isn’t working.
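To make that loop concrete, here is a minimal sketch of agile test-and-learn in Python. The headline variants and their simulated conversion rates are hypothetical assumptions; a real iteration would field actual creative rather than simulate responses.

```python
import random

# A minimal sketch of the agile loop: try something, test it, analyze the
# results, then iterate. Variants and their "true" rates are made up.
VARIANTS = {"emotive_headline": 0.042, "direct_headline": 0.038, "hybrid": 0.045}

def run_test(true_rate, visitors=5000):
    """Simulate fielding one variant and observing its conversion rate."""
    conversions = sum(random.random() < true_rate for _ in range(visitors))
    return conversions / visitors

observed = {name: run_test(rate) for name, rate in VARIANTS.items()}
for name, rate in observed.items():
    print(f"{name}: observed {rate:.3%}")

winner = max(observed, key=observed.get)  # keep the winner, then iterate
print(f"Next iteration builds on: {winner}")
```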
There are also companies that use more organic, non-digital communication methods. And even though many of us claim to have analytics on non-digital media, there are still limitations. We can certainly measure sales spikes and attribute them to offline media, but calculating the lift of a campaign can quickly get fuzzy.
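To see how fuzzy that lift gets, consider a minimal sketch. With offline media there is no control group, so the lift number depends entirely on which baseline you assume; every figure below is hypothetical.

```python
# A minimal sketch of why offline "lift" gets fuzzy: the answer depends
# entirely on the baseline you assume. All numbers are hypothetical.
campaign_week_sales = 12_400

# Three plausible baselines for what sales "would have been" without the
# campaign; offline media gives you no control group to settle the question.
baselines = {
    "prior 4-week average": 11_000,
    "same week last year": 11_800,
    "trend-line forecast": 12_100,
}

for label, baseline in baselines.items():
    lift = (campaign_week_sales - baseline) / baseline
    print(f"vs {label}: lift = {lift:.1%}")
# The same sales spike reads as a 12.7%, 5.1%, or 2.5% lift depending on
# which baseline you pick.
```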
Finally, as England mentioned in the previous section, there are so many things to measure that we often muddy the water in terms of really knowing what a test told us. For a day-to-day insight, an A/B test may be ideal. But it needs to be very specific, or it may confuse more than guide. When designing a test, we need to be very careful about its scope and variables.
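As an illustration of keeping that scope tight, here is a minimal sketch that isolates a single variable and evaluates it with a standard two-proportion z-test. The counts are hypothetical; notice that even a sizable-looking difference can fall short of statistical significance.

```python
from math import sqrt

# A minimal sketch of evaluating one tightly scoped A/B test with a
# two-proportion z-test. Counts are hypothetical; changing more than one
# variable at a time would make the comparison uninterpretable.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(conv_a=180, n_a=5000, conv_b=215, n_b=5000)
print(f"z = {z:.2f}")
# |z| > 1.96 would be significant at the 5% level; here z is about 1.8,
# so a roughly 19% relative lift still isn't statistically significant.
```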
Over the years, I’ve worked with teams and vendors on a variety of testing methods. These cover everything from user testing, in-market A/B tests, multivariate tests, heat mapping, message testing, and individual interviews to eye-tracking systems that date back to the mid-1990s, when digital was emerging.
Many of these methods offer very specific benefits. For example, user testing for a website tells us whether customers can figure out the UX and whether they click the things we want them to click. It isn’t testing their opinions or feelings; it’s simply uncovering details about the user experience. Many users will express feelings about the design, but that emotional response is usually just a sign of confusion or frustration with certain design elements or flows.
And even with very linear testing like this, you can easily lead the witness, getting people to talk about or respond to things they wouldn’t have explored on their own. Or you can get a false positive. The environment you choose, the questions you ask, and the guidance you give can all affect the outcome. Many of these methods are observational science, and that doesn’t always give us insight into how a person is really thinking.
In essence, data on its own isn’t everything. It isn’t the final word, and you shouldn’t put all your faith in one type of testing. Use every method at your disposal, including your gut, to get a sense of how effective your ideas really are. And remember that reliable conclusions require systematic results from multiple experiments. The beauty of testing is getting answers to our hypotheses, but it takes work.
Data on its own isn’t all-powerful. It needs help. The software you use, the analysts, the people who take action: all these variables can change the outcome. It’s still a very human endeavor. The data-driven robots haven’t taken over the world, even though many act as if they are firmly in charge and the process has been perfected.
Data and creative ideas must work together. Data inspires ideas and ideas are proven out with data. As Ryan Pizzuto put it, “All testing starts with a bright idea, but you need to validate those ideas with data.”[ii]
I’ve had many conversations with fellow marketers who say something like, “We tested creative ideas with our operational emails and we got a better result with straightforward facts. So emotional ideas don’t work.”
Again, keep in mind that this is a very limited test and a very limited point of view. One A/B test, or even several, isn’t the ultimate truth, especially when testing something as emotionally loaded as a subject line or headline.
Your results may show that a direct headline worked best. But consider this: was the fact or stat you used itself emotionally charged? Or perhaps that audience had already built up enough loyalty through previous creative experiences with the brand that they were willing to give up a little emotional capital to get a specific job done quickly.
There are so many factors in every subjective piece of communication that a single A/B test, or even a few, can’t give you a holistic answer on something as large as the effectiveness of creative ideas.
Or perhaps that specific test is correct, and in an operational email deep in the funnel, customers aren’t interested in emotional ideas. But a boring, direct message may be totally off-brand, and that email could benefit from a bit of brand personality to keep loyalty strong.
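There is also a simple statistical reason a handful of tests can’t settle the question: detecting a modest lift takes a surprisingly large sample. Below is a minimal sketch using the standard two-proportion sample-size approximation; the 3% baseline rate and 10% relative lift are hypothetical numbers chosen only for illustration.

```python
from math import sqrt, ceil

# A minimal sketch of why one small A/B test rarely settles anything: the
# sample needed to detect a modest effect is often larger than a single
# email send. Baseline rate and target lift are hypothetical.

def required_n(p_base, lift, alpha_z=1.96, power_z=0.84):
    """Approximate per-variant sample size to detect a relative lift
    (two-sided 5% significance, 80% power)."""
    p_test = p_base * (1 + lift)
    p_bar = (p_base + p_test) / 2
    numerator = alpha_z * sqrt(2 * p_bar * (1 - p_bar)) + power_z * sqrt(
        p_base * (1 - p_base) + p_test * (1 - p_test))
    return ceil((numerator / (p_test - p_base)) ** 2)

print(required_n(p_base=0.03, lift=0.10))  # roughly 53,000 per variant
```

At a 3% baseline, reliably detecting a 10% relative lift takes roughly 53,000 recipients per variant, far more than many operational email sends ever reach.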
There are many tactics that aren’t emotional or creative but still work effectively. Bad ideas still work. Couponing works, but that doesn’t mean certain brands should use it. Couponing could ultimately destroy a brand and make it a commodity product based on cost rather than a premium brand based on an emotional appeal. So even if you test something and get a certain result, that doesn’t mean it’s the right thing to do.
I’m simply asking us to be more open-minded. Don’t take all data as the ultimate truth. We learn a little with each test. Even if we hooked every customer up to an fMRI machine, we would gain some insights, but because of the artificial environment, we might lose some truth as well. We should advocate a more holistic, balanced approach so we can learn as much as possible about subjective ideas and know what will work better.
Just because a measurement tactic sounds scientific and digital doesn’t mean it’s all-encompassing (and our culture places a premium on anything that seems hyper-rational). Remember, it’s a tool. There are many tools we can and should use. But keep a balanced approach when measuring creative ideas.
For far too long, we have based the measurement of creative ideas on a purely rational approach. But humans react in both rational and emotional ways. Let’s take advantage of what neuroscience has taught us recently and find ways to incorporate emotional testing as well.
[i] Tyler England, personal interview, June 2016.
[ii] “Strength in Numbers: Best practices in data-driven marketing,” Adobe webinar, https://seminars.adobeconnect.com/_a227210/p77ute27cf2/.