The Day a Text Link Outperformed a Button

Nobody wants a losing A/B test…

But some of the most interesting tests are the “losers” – the ones where everything you thought you knew gets thrown into question. The assumptions you made prove to be, at least for the tested website, wrong.

It’s with the losing tests that we often learn the most.*

Today’s example is no exception. In fact, it’s had my mind racing since the split-test in question ended this September. So I thought we’d share it with y’all here and see if you can make heads or tails of it…

The Test in Question:
Text Links vs Buttons on Metageek.net

Over the past few months, we’ve been running the Summer of Buttons, a mega-experiment in which we ran A/B/n tests on the buttons of 20+ startup sites. Founders and marketers on our newsletter list were invited to apply to let us assess their calls to action and present split-test recommendations, which they then ran. We did this sort of focused mega-experiment previously with The Great Value Proposition Test, and we’re sure to do it again. (So sign up if you wanna be considered next time.)

For the sites that had the traffic to run a test to completion – unfortunately quite a few sites did not – we had some great wins.

We saw 20 to 95% lifts in signups on a range of SaaS sites, and we saw a paid conversion lift of 20% on an ecommerce site. Just by changing calls to action.

We also saw 2 head-scratching losers. This is one of them.

Testing on Metageek.net
Metageek describes itself as “software-based 2.4 GHz ISM Band spectrum analyzer using the USB bus for 802.11 (Wi-Fi) networks.” And that sort of language actually makes sense to their prospects. My eyes glaze over when I look at such a description, but it’s important to note that not all audiences are made up of Joanna Wiebes, or so my therapist tells me.

Metageek has 2 very popular downloads: Wi-Spy and inSSIDer. And this is among their most highly visited pages, the top of which looked like so during the A/B testing period:

Metageek A/B test on Copy Hackers

We worked with Taylor and Wendy at Metageek to identify the goals of this page. Naturally, the number one goal was to drive more inSSIDer downloads. But they were also interested in learning whether it might be feasible to charge for the popular free inSSIDer Home product down the road, so we took that into consideration when proposing test creative.

Here are the 4 treatments we proposed Metageek split-test:

TREATMENT B

Text Link Test - Metageek Treatment B

Research Questions:
Will replacing text links with visually engaging buttons that include messaging and indicators highlighting value and immediacy – such as a download icon – increase clicks? Will introducing those buttons with small headers facilitate an increase in clicks?

TREATMENT C

Text link test - Metageek Version C

Research Questions:
The same as for Treatment B, plus: Will reorganizing the high-traffic real estate used by inSSIDer such that the calls to action are easy to compare and contrast in a horizontal line / row facilitate decision-making and increase clicks?

TREATMENT D

Treatment D for Metageek A/B text link test

Research Questions:
Same as for Treatment B, plus: Will reorganizing the high-traffic real estate used by inSSIDer such that the calls to action occupy enough space to allow transactions to occur at this point – for the Office trial in particular – increase trial conversions?

TREATMENT E

Treatment E for Metageek a/b test of text links vs buttons

Research Questions:
Same as for Treatment D, plus: Will positioning the Home product as paid (with a trial) and implementing a lead gen form impact conversions, to the end that Metageek is better prepared to charge for Home going forward?

The resounding answer we got to each of our research questions was the same:

No.

Well, it was really more of a “Hellz no!”

Our primary goal or success metric was clicks or engagement on this page. Here’s how that looks in Google Experiments:

Test goals in Google Content Experiments

Our secondary goals were Home downloads and Office downloads. We lost on all counts. Seriously. Now, it’s question time for you…

Which of the 4 Treatments Had the Most Shamefully Awful Impact on Conversion?

Vote by selecting one of the 4 options here:

Let me add that we didn’t run this test just once. Nooo. On seeing the losses the test generated the first time around, we thought, hmm, maybe something was buggy. So we re-ran it…

And the results were no better the second time.

So, if you’ve voted by now, let’s see if you were right. Which treatment sucked most?

Test Results

The worst? Treatment… [drumroll]… E!

Now, to be fair, we knew that Treatments D and E were reaching. They added friction by introducing forms (which is the same as assigning your visitor unpaid work). But we had no idea just how poorly they’d do. The overarching lesson from those 2 treatments is to keep the UX and offerings as they are for now: don’t try to shorten the Office download flow to 1 page, and don’t charge for the 100% free Home offering.

Here’s the breakdown of losses:

Treatment B: -6% lift

Treatment C: -14% lift

Treatment D: -20% lift

Treatment E: -80% lift
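
Metageek’s raw visit and click counts aren’t published here, but for anyone wondering how a figure like “-6% lift” is derived, here’s a minimal sketch of the relative-lift calculation – the numbers in it are entirely hypothetical:

```python
# Minimal sketch of how a relative lift figure is computed.
# The counts below are hypothetical; Metageek's raw data isn't published in this post.

def lift(control_visits, control_clicks, treatment_visits, treatment_clicks):
    """Relative lift of the treatment's click-through rate over the control's."""
    control_rate = control_clicks / control_visits
    treatment_rate = treatment_clicks / treatment_visits
    return (treatment_rate - control_rate) / control_rate

# If the control converted 500 of 5,000 visitors (10%) and a treatment
# converted 470 of 5,000 (9.4%), the lift is (0.094 - 0.10) / 0.10 = -6%.
print(f"{lift(5000, 500, 5000, 470):+.0%}")  # -6%
```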

Thank goodness Metageek uses Google Experiments’ multi-armed bandit mode. It ensured that the losing treatments were exposed to fewer people than they would have been in a standard 5-way equal split. So we got the insight, and Metageek only had to watch a fraction of their prospects walk away (vs seeing 80% of them vanish) to get it.
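
Google Content Experiments doesn’t expose its exact allocation logic, but the multi-armed bandit idea can be sketched with Thompson sampling: each variation’s conversion rate gets a Beta posterior, each visitor is routed to whichever variation wins a random draw from those posteriors, and badly performing variations are therefore sampled less and less. The sketch below is illustrative only – the counts are hypothetical and this is not Google’s implementation:

```python
import random

# Illustrative Thompson-sampling sketch of multi-armed bandit traffic allocation.
# Each arm (page variation) keeps a Beta(conversions + 1, non-conversions + 1)
# posterior over its conversion rate; the next visitor goes to the arm with the
# highest random draw, so weak arms receive a shrinking share of traffic.
# The counts below are hypothetical, not Metageek's data.

arms = {
    "Control":     {"conversions": 120, "visits": 1000},
    "Treatment B": {"conversions": 110, "visits": 950},
    "Treatment E": {"conversions": 15,  "visits": 600},
}

def choose_arm(arms):
    best_arm, best_draw = None, -1.0
    for name, stats in arms.items():
        successes = stats["conversions"]
        failures = stats["visits"] - stats["conversions"]
        draw = random.betavariate(successes + 1, failures + 1)
        if draw > best_draw:
            best_arm, best_draw = name, draw
    return best_arm

# Where would the next 10,000 visitors go if the stats above were frozen?
allocation = {name: 0 for name in arms}
for _ in range(10_000):
    allocation[choose_arm(arms)] += 1
print(allocation)  # Treatment E ends up with only a sliver of the traffic
```

That’s the practical effect described above: a variation that’s down 80% stops soaking up visitors long before the test ends.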

Why Did Things Go So Horribly, Horribly Wrong?

Okay, I’m being mildly dramatic. Nothing was horrible here. But things certainly did not go the way we’d expected.

So why not?

Possible Flaw A: Fear of Spam / Scam
A lot of sites rank high in the SERPs for software downloads and use large, attractive buttons the way a fisherman uses a dazzling lure to hook a trout. Visitors have been tricked into believing what they’re about to download is legit – only to be attacked by pop-ups, additional conditions for downloading, or any amount of other junk.

Button vs text link

Possible Flaw B: Too Much Clarity
We’ve seen TMI cause drops in conversion – and it’s possible that on some of the better-performing treatments, like Treatment B, we simply clarified content that was buried in the Control. By introducing headers above each button, we may have prompted visitors to contrast Home against Office – and to find Office disappointing for not being free.

Buttons - Test Results

As Wendy of Metageek said: “It’s often a plain text link these days that gets you the clean download. We could be experiencing ‘seasoned internet user’ behavior on the download page.”

What do you think?

Our big thanks to Taylor, Wendy and the Metageek team for participating in this test – you guys rock! Keep watching this blog and others for more of the interesting results we saw in the Summer of Buttons.

~ joanna

*There are no losers in the testing world; if you’re learning, you’re winning.


  • vladmalik

    I’m skeptical of the statistical validity of multi-armed bandit experiments and of case studies that don’t state the raw data. How many visits and conversions were there on each variation? What constituted a winner? I see the conversion rates, but were these numbers statistically significant?

  • http://j.mp/10hymN1 Hubert Iwaniuk

    Interesting findings, thank you for sharing!

    Out of curiosity: what does GA Experiments say when you run an A/A test and give it enough traffic?

    • Joanna Wiebe

      That’s a really good Q, Hubert, but I’m not the one to ask — I’m never hands-on with Google Experiments when my clients use it (though I am with Optimizely and VWO). It’s just a little too technical for this copywriter.

      If anyone else can help Hubert, please do!

  • http://www.coreyquinn.com/ Corey Quinn

    Echoing the previous comments, the [Download Now] and [Get It Now] buttons look rather “commercial” to me when compared to the text links.

    “I expect that by clicking them, I’ll be asked to pull out my cc. Perhaps I can find a similar program elsewhere that is freeware. Back to Google.”

    Did treatments B, C, D, E result in higher exit rates vs the control?

    • Joanna Wiebe

      We didn’t measure exit rates as part of this test – just clicks – but it’d be interesting to know. A lack of click can often mean an exit, of course.

    • Jeff Sararas

      Agreed – maybe it’s old mental programming from DOS-based screens (the days of BBS!), but I expect a text link to open a download dialog. A graphical button – I’m not so sure, meaning I don’t have that same rooted expectation.

  • David Larsen

    I’d love to see the experiment run with dead-simple buttons or a download icon vs. the original links. I’m envisioning the visual simplicity of the green buttons here on copyhackers.

    Especially because you’re selling a product that deals with network security, anything that’s going overboard to convince me to download it feels fishy.

    Also, flashy guarantees aren’t.

    • http://www.toppingtwo.com/ Lance Jones

      This +1! That’s the beauty of iterative testing. Some companies would move on to testing other page elements, but there is still an opportunity to test these links/would-be buttons.

  • Taylor @ Metageek

    Thanks again for having us as part of the Summer of Buttons! Like you said, if you learn, you win… Looking forward to reading the comments here as well :)

    • Joanna Wiebe

      Thanks, Taylor! It was awesome working with you guys, and we see that the page has since been redesigned and/or is being tested, with the text links intact. Very interesting.

  • Andrea Grassi

    I have always been fascinated by how people read.

    During a Norman & Nielsen conference I learned the obvious: we prefer black text on a white background instead of the opposite.

    What I can say about this test is that my focus flew directly to the cloud icon, skipping the text on the button and ignoring the text before it.

    The link was far more clear and understandable and required less thinking.

    Be well

    • Joanna Wiebe

      Yeah, I’ve seen studies that show we prefer black on white when reading small print and white on black when reading large print (like large enough to be on a movie screen). Of course, if you ask a bunch of people who’ve been sitting in the dark for 3 days which they prefer, you might get a different answer — which is why we test to see wassup with our particular audiences. :)

  • James Barron

    I agree with the seasoned internet user being the problem, especially considering the likely target audience. Those download buttons look just like an advert from someone trying to trick you into downloading something other than inSSIDer. I bet people actually looked for the ‘real download’ links ;)

    • Joanna Wiebe

      So curious — we look at prominence as a key factor when optimizing a button, but every element on the page gives off cues that we, as marketers, can only partly understand. I would never have thought that a well-designed button would look suspect, but, alas, that is why it’s always safer to test than to rely on your own assumptions, even if you feel those assumptions are grounded in some solid shizzle (e.g., 100s of split tests run, with over a dozen on buttons in the last 3 months alone).

  • http://dmix.ca dmix

    Those button designs look kind of spammy. They are more noticeable but they look like the fake download buttons you see on filesharing sites.

    You have a technical audience for this product, I’d be wary of treating them like the average simple user. Keep it clean/simple/logical. Not flashy.

    • Joanna Wiebe

      You won’t be surprised to hear that, with a startup audience filled with developers, we’ve heard that argument before: “we don’t like fuss; we’re too logical for that; marketing doesn’t work on us”. And then we run tests. And we find that, in fact, developers and people who do highly technical work for a living respond perfectly well to fuss, irrational emotional stuff and marketing.

      So although I agree that it’s possible — and clearly was the case here — that buttons may read “don’t trust me” for certain audiences, I have yet to see compelling evidence to support this belief that technical people respond best, by nature, to “clean/simple/logical”. If so, I strongly doubt we’d see as many technical people at, say, Star Wars conventions. :) An awful lot of flashiness in Star Wars….

      • http://dmix.ca dmix

        Agreed overall about “marketing-not-working”, but that’s a fine line. That button design looks spammy and is now over-used by spammers.

        A standard, more understated Bootstrap- or Zurb Foundation-style button (with a noticeable red/green/blue color) would likely have been as effective as, or more effective than, the text link.

        I’d bet money on it :)

      • Joanna Wiebe

        You *would* or you *will* bet money on it? :)

      • http://dmix.ca dmix

        Yes, I would gamble real money on it. But the point is to help people, not to win.

        I would bet that a button styled just like the blue one on dropbox.com that says Download would convert higher than the text links (otherwise same UI, no forms or anything).

        Except I’d make it a red button, because my own tests confirm that color converts best.

        Secondly, I would tell them to stop complicating things with the Home/Office decision when it comes to downloading a trial (the customer can decide when it comes time to buy). That’s actually a whole UX issue in itself that A/B test patchwork won’t fix.

        Third, if this is their most popular page, they are missing the opportunity to educate users more (for example: wtf is Home vs Office? Give me a link to a comparison chart).

  • Diana OVW

    I agree with burchd. I also think these related factors might have had an influence: 1) The plain text buttons did not look like they were asking for a big commitment, making them more ‘clickable’. 2) The page looked more sophisticated without the big, colorful buttons. The big, colorful buttons might have made the product look less sophisticated, more generic and possibly less trustworthy (this ties in to the point about a “spam” association with this type of button). …Interesting test, thanks for sharing the results!

  • robindotadams

    Very interesting. I’m guessing (and hey, I’m no expert) that Metageek is targeted at geeks(!) and thus shiny, flashy buttons are less likely to tantalise? No doubt the geek in question is probably the ‘seasoned internet user’ too. Also wondering about the competition in the sector: if they all use ‘flat’ text links, does the shiny button imagery scare them or make them suspicious? Always more to learn, but this article gets you thinking… which is why I always come back for more :-)

  • burchd

    I think it’s ‘seasoned internet user’ behavior on the download page.

    • burchd

      “Don’t tell us what to do, we can figure it out on our own.”

  • http://conversionscientist.com Brian Massey

    It’s hard when the client falls in love with the losing design.

    Inline image 2 is broken, BTW, but the post is still excellent.

    • Joanna Wiebe

      Thanks, Brian! Fixed now. :)