I read a great post today called “Throw Everything you Know About Ads Out the Window”. The author describes how he ran a very simple test of two ads to see which would work better. You can see the two ads here.
The first ad was very professional looking, with good-looking graphics, nice fonts, and a green call-to-action button. The second ad was, in his words, “some shit ad I made in 5 mins in Microsoft Paint”: a hand-drawn picture of a car with the handwritten words “Need for Speed!!! Play free!!” He ran the two ads for 15K impressions each and found the low-tech ad generated a click-through rate of 0.137% versus 0.049% for the more professional-looking ad.
Whoa. That’s quite a difference.
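Before reading too much into a gap like that, it’s worth asking whether it could just be noise. A quick way to check (this is my addition, not something the original post did) is a two-proportion z-test; the click counts below are back-calculated from the quoted CTRs, so they are approximate:

```python
from statistics import NormalDist

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is the difference in CTR bigger than chance?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed
    return z, p_value

# Approximate clicks implied by the post's figures:
# 0.137% of 15,000 ~ 21 clicks; 0.049% of 15,000 ~ 7 clicks.
z, p = two_proportion_z(21, 15_000, 7, 15_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these rough counts the difference comes out statistically significant at the usual 5% level, so the gap looks real rather than lucky.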
His conclusion, that “every idea you have is worth testing no matter how crappy it is,” is a smart one in my opinion, but it’s also trickier than it might sound.
Your Test Results Tell You What Works but Not Why
So we know the second ad generated a better CTR. Now what? Here is a list of reasons the second one might have gotten more clicks:
- Novelty – we are inundated with ads and something that looks very different is interesting and click-worthy.
- Free – research shows that this is a bit of a magic word for folks, and it appears much more prominently on the second ad.
- Less Text – the second ad has much less text and is easier to read.
- Single Image – A single image might make the ad easier to process.
- Simplicity and Flow – The second ad is much simpler and flows straight from top to bottom. The “professional” ad is more complex and flows both right to left and top to bottom.
- Weird Psychology – Maybe that hand-drawn ad reminded us of the doodles we made when we were 5 years old, and a Proustian nostalgia swept over us and gosh darn it, we just had to click!!
OK, so the last one isn’t all that likely, but hey, anything’s possible. So what does this test tell us? It tells us that we can improve our CTR in one (or maybe many) of these ways. It gives us some clues about which hypotheses to test next, but without those follow-up tests, the “why” behind the increased CTR is not clear.
CTR Is Not the Same as Conversions
Another important point: there was no mention of conversions after the clicks. Looking at the possible reasons the CTR might have been higher, I could see some folks clicking just to find out what the heck this crazy ad is all about, without being serious about taking any further action. While the test proved the second ad generated more clicks, it did not prove the second ad “worked” better from a business perspective.
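To make that distinction concrete, here is a toy calculation. The conversion rates below (30% and 8%) are made-up numbers for illustration only; the original test reported nothing about conversions:

```python
impressions = 15_000

# "Professional" ad: fewer clicks, but suppose curiosity-clickers are rare,
# so a larger share of clickers actually convert.
pro_clicks = int(impressions * 0.00049)    # ~7 clicks at a 0.049% CTR
pro_conversions = pro_clicks * 0.30        # assume 30% of clickers convert

# Doodle ad: more clicks, but suppose many clicked out of sheer novelty,
# so far fewer of them convert.
doodle_clicks = int(impressions * 0.00137)  # ~20 clicks at a 0.137% CTR
doodle_conversions = doodle_clicks * 0.08   # assume only 8% convert

# Under these assumptions the doodle wins on clicks but loses on conversions.
print(pro_conversions, doodle_conversions)
```

The point isn’t the specific numbers; it’s that a CTR winner can still be a conversions loser, which is why the test alone can’t tell you which ad “worked.”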
Interesting…
To my mind, the most interesting conclusion is that successful marketing (and I agree – success is a complicated notion) is much more complicated today than it was 20 years ago. Consumers are far more sophisticated and discriminating and flat, static and ‘slick’ advertising (content AND design) just doesn’t cut it.
There are many other factors impacting consumption. Attention spans are shorter. Accessibility is greater. Sensitivities are duller. Technology is more complex.
My gut instinct tells me that authentic and engaging trumps all other marketing strategies these days. And the doodle was both.
Hi Ruth – great points and I’m totally with you on this. I think that “Ad blindness” is a big problem if your business relies on ads. We’re exposed to so much of this stuff that we’ve become great at filtering it out. I also agree that people respond to authenticity.
But at the same time I still think it’s difficult to predict when people will respond to slick vs. authentic. For example, we have people responding to very beautiful yet completely useless infographics that seem to be the opposite of authentic and helpful, yet somehow still manage to be engaging. But maybe that’s because we haven’t trained ourselves to recognize the difference between the good and bad of every type of content quite yet. I also find Apple’s marketing to be extremely slick and not particularly human or authentic, and yet they are regularly held up as an example of the best marketing in the world. But hey, when your product is that good, maybe it just doesn’t matter.
Pesky buyers. If only they were easier to figure out 🙂
April