Do You Really Need to Test Your Site to Improve Your Conversion Rate?

December 19, 2016
Aaron Polmeer

By Jacob Baadsgaard

Talk to almost any online marketer and you’d think they held a PhD in psychology. At the drop of a hat, they can tell you which button colors, typeface, contrast, spacing, line of sight, hero shots, etc., you should use to subconsciously drive a website visitor to convert.

But do marketers really have these incredible mind powers? Do they wield the awesome power of psychology to control the rest of the world?

Testing the Testers

Recently, Chris Dayley, my head of CRO, ran a little experiment at the SearchLove conference in Chicago. Chris wanted to see how well marketers could use their understanding of psychology, marketing best practices, or even gut instinct to predict which landing page design would produce the best conversion rates.

So, he presented an example A/B test from a real client of ours. There were four page variants and one of them had generated a 146% increase in leads. The room of marketers was given a link to a survey where they could examine each variant and submit their guesses as to which page had been the winner.

Take a look for yourself. Which one would you have picked as the winner?

Here’s how the marketers voted:

  • Original: 0%
  • V1: 32%
  • V2: 42%
  • V3: 26%

Now, only one of the variants actually produced 146% more conversions, so if we assume that V2 was the winning variant, at least 58% of these marketers guessed wrong.
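The arithmetic behind that "58% wrong" figure is simple: everyone who didn't pick the assumed winner missed. A minimal sketch, using the vote shares from the survey:

```python
# Vote shares from the survey (percentages from the post).
votes = {"Original": 0, "V1": 32, "V2": 42, "V3": 26}

# If V2 were the winner, everyone who picked anything else guessed wrong.
assumed_winner = "V2"
wrong_share = sum(share for page, share in votes.items() if page != assumed_winner)
print(f"{wrong_share}% of voters picked a losing variant")  # → 58%
```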

But that wasn’t the real trick of the survey.

While the marketers were guessing which page had won in Chris’s A/B test, he was actually running an A/B test on them!

Only half of the surveys showed the page variants in the order seen above. The other half saw a scrambled version with the original and V3 switched like this:

Here’s where things got a little crazy. In the second group of marketers, no one voted for the “original” page—even though that page received 26% of the votes in the first group!

Even more intriguingly, V2 received the most votes in both groups:

But here’s the thing: V2 wasn’t actually the top-performing page. V3 (the “Control” in group 2) was. That meant the actual champion only got 13% of the popular vote!

The question is, why? How did the vaunted best practices and gut instincts of so many marketers fail them? To answer that, let’s step back from marketing psychology and look at the psychology of marketers:

Newer is Better

The most obvious takeaway from Chris’s experiment is that the marketers assumed the highest-performing page variant couldn’t be the original. In both groups, the variant labeled “original” didn’t receive a single vote…even when it was the actual winner.

Now, there’s a dangerous assumption at play here. Everyone who puts an A/B test together would like to believe that he or she is going to shake things up and make them better. But can you assume that newer is better?
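You can’t assume it; you have to test it. In a real A/B test, the way to check whether a "newer" variant genuinely beats the original is a significance test on the two conversion rates. Here is a minimal sketch of a pooled two-proportion z-test in Python; the conversion counts are hypothetical, not the client data from Chris’s experiment:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal tail.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical numbers: the original converts 40 of 1,000 visitors,
# the new variant converts 60 of 1,000.
z, p = two_proportion_ztest(40, 1000, 60, 1000)
if p < 0.05:
    print(f"reject the null (z={z:.2f}, p={p:.3f}): the lift looks real")
else:
    print("can't reject the null: the 'improvement' may just be noise")
```

Until that p-value clears your threshold, "newer" is just a hypothesis, not a winner.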

Null Hypotheses

In scientific testing, there’s a concept called the “null …read more

Source: Kiss Metrics Blog