Pre-testing ads doesn’t kill creativity. Apparently.

There’s a guest post on Mumbrella titled ‘Pre-testing ads doesn’t kill creativity’, from Darren Poole, Millward Brown’s Chief Client Officer.

The post was written in response to comments from Geoff Ross that “… advertising is completely in a quagmire, multiple layers, pre-testing, post-testing. In the end, nothing good is going to survive”.

Darren sets out to articulate why this isn’t the case, and why pre-testing delivers significant benefit to the creative process.

I want to find something illuminating in what he’s saying. I’m tired of the utter predictability of the ‘pre-testing is nonsense’ response. I worry that maybe disparaging research is my obligation as a member of the advertising industry (like over-using Hipstamatic or coveting Moooi).

But try as I might I can’t find anything here that changes my view. In fact he makes a couple of comments that only reinforce it. One is this:

While none of us like the idea of someone evaluating our work (hands up who loves performance reviews?) the reality is that advertising is a most public profession and the target audience is the ultimate judge of creativity.

I think this rather succinctly captures the central issue.  Because the target audience isn’t the ultimate judge of creativity.  The target audience is the ultimate judge of persuasiveness.  It’s the failure to grasp the distinction between these two things that is at the heart of the pre-testing problem.

Clients don’t make ads to be creative.  They make them to persuade. Creativity is just the means to the persuasive end.

And the reason this is important is that people don’t really like the idea that they’re being persuaded. They actively avoid the sense of being manipulated – so much so that they will deny it ever happens.  So an effective ad has to persuade covertly, meaning that a great ad will be persuasive for reasons that people struggle to identify and explain.  It’s the unexpectedness, the under-the-radar stealth that makes a great ad persuasive. So a great ad has to be persuasive before anyone realises they’re being persuaded.

Which is why you can’t make the target audience the ultimate judge of creativity.  Not because, as is often suggested, they know nothing about advertising or creativity. But because they know something about advertising and creativity.  So when you ask them what they’d do to an execution they’ll happily tell you. They’ll give you lots of helpful suggestions based on things they’ve seen and liked before. But this can only mean (and I think you can see where I’m going with this) that their suggestions are all things they’ve seen before, making them less persuasive, and therefore less effective.  So if you identify a problem with the persuasiveness of an execution, the creative answer is not to try and persuade in a more obvious way. The creative answer is to try and persuade in a more original way. And that originality is not going to come from the target market.

I have a second issue with Darren’s argument. He gives a series of examples that highlight the improvements made to some executions that Millward Brown has pre-tested.

“I can’t give away too much detail, but some of the work we have done with CPG brands in the past 12 months has seen a soundtrack transform from something viewed as a little strange to become quirky and impactful. Another great example is where we helped characters evolve from being creepy and a bit disturbing to wonderfully eccentric. We also identified the route to help a brand develop one of the most popular ads of the summer, and we’ve helped make a brand’s consumer benefits more evident in more than one case.”

I don’t doubt that there are occasions on which pre-testing has molded a soundtrack or character into something that has subsequently worked – Darren proudly, if vaguely, lists a few. But I also don’t doubt that there are a great many more occasions on which pre-testing has squeezed the life out of something that was potentially interesting – times when it’s taken what could have been a quirky and impactful soundtrack and turned it into something familiar and innocuous, or turned a potentially eccentric and memorable character into a boring and forgettable one.  Pre-testing has consistently resulted in advertising that is more predictable and less original, in the process making it less persuasive and enjoyable. And on that basis I suppose you could argue that pre-testing has improved advertising, in the same way that you could argue that Simon Cowell has improved music.

I think the advertising industry embraces the idea that research can help us understand how the target audience might be persuaded. We’re very happy to learn more of how the audience feel about the issue at hand. (We’ll even put aside the fact that they, like you and me, are transparently incapable of describing their genuine feelings on most subjects, quite unable to rationally explain their irrational attitudes.) But if research can provide some clarity here, let’s have it.

But what we don’t embrace is what Darren’s note exposes.

The research industry’s belief that the target audience should be the judge of creativity is flawed.  The judge of persuasiveness, perhaps, but please don’t confuse the two things. And the research industry’s belief that the pre-testing process consistently highlights ways of improving ads is equally flawed.  We know pre-testing consistently highlights ways of changing them. But improving them? That’s another issue entirely.

(Two interesting, and contradictory, pieces here from WARC. The first details a TNS study showing the improvement in effectiveness delivered by pre-testing. The second shows exactly the opposite.)

One thought on “Pre-testing ads doesn’t kill creativity. Apparently.”

  1. Many thanks for this piece.

    From my point of view, even persuasiveness is almost impossible to test in a typical pre-test.
    Here in Germany, one professor I know actually quantified the number of cases where pre-testing was “right” and where it was “wrong”.
    As you probably know, there are two types of “wrong” – a pre-test rejects good work, or a pre-test approves less good work. I will look for those numbers and post them…

    I remember, though, that on a very broad scale – given the fact that agencies do present rather awkward and strategically questionable proposals very often – research simply does make sense… to kill the really stupid stuff. But it’s probably not good at judging really creative work or really “hard work”.

    You see, there’s a funny, simple way to let an idea live a little longer before being killed: present it as a “new media” idea without any ads. Then Millward Brown won’t have benchmarks for it :-)

    Thanks again
