Nobody wants a losing A/B test…

But some of the most interesting tests are the “losers” – the ones where everything you thought you knew gets thrown into question. The assumptions you made prove to be wrong, at least for the tested website.

It’s with the losing tests that we often learn the most.*

Today’s example is no exception. In fact, it’s had my mind racing since the split-test in question ended this September. So I thought we’d share it with y’all here and see if you can make heads or tails of it…

The Test in Question:
Text Links vs Buttons on Metageek.net

Over the past few months, we’ve been running the Summer of Buttons, a mega-experiment in which we ran A/B/n tests on the buttons of 20+ startup sites. Founders and marketers on our newsletter list were invited to apply to let us assess their calls to action and present split-test recommendations, which they then ran. We did this sort of focused mega-experiment previously with The Great Value Proposition Test, and we’re sure to do it again. (So sign up if you wanna be considered next time.)

For the sites that had the traffic to run a test to completion – unfortunately quite a few sites did not – we had some great wins.

We saw 20 to 95% lifts in signups on a range of SaaS sites, and we saw a paid conversion lift of 20% on an ecommerce site. Just by changing calls to action.

We also saw 2 head-scratching losers. This is one of them.

Testing on Metageek.net
Metageek describes itself as a “software-based 2.4 GHz ISM Band spectrum analyzer using the USB bus for 802.11 (Wi-Fi) networks.” And that sort of language actually makes sense to their prospects. My eyes glaze over when I look at such a description, but it’s important to note that not all audiences are composed of Joanna Wiebes, or so my therapist tells me.

Metageek has 2 very popular downloads: Wi-Spy and inSSIDer. And this is among their most highly visited pages, the top of which looked like so during the A/B testing period:

Metageek A/B test on Copy Hackers

We worked with Taylor and Wendy at Metageek to identify the goals of this page. Naturally, the number one goal was to generate more inSSIDer downloads. But they were also interested in learning whether it might be feasible to charge for the popular free inSSIDer Home product down the road, so we took that into consideration when proposing test creative.

Here are the 4 treatments we proposed Metageek split-test:


TREATMENT B

Text Link Test - Metageek Treatment B

Research Questions:
Will replacing text links with visually engaging buttons that include messaging and indicators highlighting value and immediacy – such as a download icon – increase clicks? Will introducing those buttons with small headers facilitate an increase in clicks?


TREATMENT C

Text link test - Metageek Version C

Research Questions:
The same as for Treatment B, plus: Will reorganizing the high-traffic real estate used by inSSIDer such that the calls to action are easy to compare and contrast in a horizontal line / row facilitate decision-making and increase clicks?


TREATMENT D

Treatment D for Metageek A/B text link test

Research Questions:
Same as for Treatment B, plus: Will reorganizing the high-traffic real estate used by inSSIDer such that the calls to action occupy enough space to allow transactions to occur at this point – for the Office trial in particular – increase trial conversions?


TREATMENT E

Treatment E for Metageek a/b test of text links vs buttons

Research Questions:
Same as for Treatment D, plus: Will positioning the Home product as paid (with a trial) and implementing a lead gen form impact conversions – to the end that Metageek is better prepared to charge for Home going forward?


The resounding answer we got to each of our research questions was the same:

No.

Well, it was really more of a “Hellz no!”

Our primary goal or success metric was clicks or engagement on this page. Here’s how that looks in Google Experiments:

Test goals in Google Content Experiments

Our secondary goals were Home downloads and Office downloads. We lost on all counts. Seriously. Now, it’s question time for you…

Which of the 4 Treatments Had the Most Shamefully Awful Impact on Conversion?

Let me add that we didn’t run this test just once. Nooo. On seeing the losses the test generated the first time around, we thought, hmm, maybe something was buggy. So we re-ran it…

And the results were no better the second time.

So, if you’ve made your guess by now, let’s see if you were right. Which treatment sucked most?

Test Results

The worst? Treatment… [drumroll]… E!

Now, to be fair, we knew that Treatments D and E were reaching. They added friction by introducing forms (which is the same as assigning your visitor unpaid work). But we had no idea just how poorly they’d do. The overarching lesson from those 2 treatments: keep the UX and offerings as they are for now. Don’t try to shorten the Office download flow to 1 page, and don’t charge for the 100% free Home offering.

Here’s the breakdown of losses:

Treatment B: -6% lift

Treatment C: -14% lift

Treatment D: -20% lift

Treatment E: -80% lift
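
(In case you’re wondering how those figures work: they’re relative lifts against the Control’s click-through rate. Here’s a quick sketch of the arithmetic – the rates below are illustrative, not Metageek’s actual numbers.)

```python
def relative_lift(control_rate: float, treatment_rate: float) -> float:
    """Relative lift of a treatment over the control, as a percentage."""
    return (treatment_rate - control_rate) / control_rate * 100

# Illustrative only: a control clicking through at 5% vs a treatment at 1%
print(f"{relative_lift(0.05, 0.01):+.0f}%")  # -80%
```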

Thank goodness Metageek uses Google Content Experiments’ multi-armed bandit mode. That ensured the losing treatments were exposed to fewer people than they would have been in a standard 5-way equal-split test. So we got the insight, and Metageek only had to watch a fraction of their prospects walk away (vs seeing 80% of them vanish) to get it.
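
If you’re curious how a bandit pulls that off, here’s a minimal sketch using Thompson sampling – one common bandit approach – with made-up click counts. It’s an illustration of the idea, not Google’s actual implementation and not Metageek’s data:

```python
import random
from collections import Counter

# Hypothetical click counts per variant: (clicks, visitors). Illustrative only.
stats = {
    "Control":     (50, 1000),
    "Treatment B": (47, 1000),
    "Treatment C": (43, 1000),
    "Treatment D": (40, 1000),
    "Treatment E": (10, 1000),
}

def pick_variant():
    """Draw a plausible click-through rate for each variant from its Beta
    posterior, then serve the variant with the highest draw."""
    draws = {
        name: random.betavariate(1 + clicks, 1 + visitors - clicks)
        for name, (clicks, visitors) in stats.items()
    }
    return max(draws, key=draws.get)

# Roughly how the next 10,000 visitors would be allocated:
print(Counter(pick_variant() for _ in range(10_000)))
# A badly losing variant (like E here) sees only a sliver of the traffic.
```

The point: instead of splitting traffic 20/20/20/20/20 for the whole test, the bandit keeps re-weighting toward the variants that look like winners, so a disaster like Treatment E gets starved of visitors quickly.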

Why Did Things Go So Horribly, Horribly Wrong?

Okay, I’m being mildly dramatic. Nothing was horrible here. But things certainly did not go the way we’d expected.

So why not?

Possible Flaw A: Fear of Spam / Scam
A lot of sites rank high in the SERPs for software downloads and use large, attractive buttons the way a fisherman uses a dazzling lure to hook a trout. Visitors have been tricked into believing what they’re about to download is legit – only to be hit with pop-ups, additional conditions for downloading, and all manner of junk.

Button vs text link

Possible Flaw B: Too Much Clarity
We’ve seen TMI cause drops in conversion – and it’s possible that in some of the better-performing treatments, like Treatment B, we simply clarified content that was buried in the Control. By introducing headers above each button, we may have prompted visitors to contrast Home against Office – and find Office disappointing for not being free.

Buttons - Test Results

As Wendy of Metageek said: “It’s often a plain text link these days that gets you the clean download. We could be experiencing ‘seasoned internet user’ behavior on the download page.”

What do you think?

Our big thanks to Taylor, Wendy and the Metageek team for participating in this test – you guys rock! Keep watching this blog and others for more of the interesting results we saw in the Summer of Buttons.

~ joanna

*There are no losers in the testing world; if you’re learning, you’re winning.