
Our 13 Favorite A/B Tests of 2013 (Plus, a Giveaway)

For Copy Hackers, 2013 was the year of The Great Value Proposition Test and The Summer of Buttons, two separate mega-experiments in which we helped more than 30 startups run A/B tests on their headlines and primary calls to action.

With big wins. And some losses.

But always with this goal: to learn.

It’s the tests that make our tickers hurt – not just the ones that make our clients’ accounts grow – that we love best. Which is why we’ve decided to do our first ever roundup.

Take a walk down memory lane with us as we discuss our 13 favorite A/B tests of 2013 – only 2 of them from our blog 🙂 – and then vote for your fave at the end! Oh, and the best comment wins these 3 great biz books of 2013 (US, UK, CAN, AUS & NZ only):

Win top biz books 2013

In no particular order…

Favorite Test 1: 36% More Purchases

Tell ‘Em Why They Need Your Product

Favorited By: Lance

We talk a lot about the importance of creating a clear, compelling value proposition… and the two primary elements of a great UVP: uniqueness and desirability. But don’t forget to give your visitors context for your product — the where, when, and [especially] why people need it in the first place.

This simple test was based entirely on giving visitors additional context for an automotive repair product, the outcome of which was an amazing boost in sales.


Why We Loved This Test: It’s so simple that you can’t NOT test this on your own site.

Read more about this and other thought-provoking tests on the Marketing Experiments blog

Favorite Test 2: 10% More Sales

Big Image Beats Tiny, Squishy Content-Cram

Favorited By: Joanna

People often ask us if they should use short copy or long copy – as if you must choose one or the other. As tests like this one, courtesy of VWO, show us, the closer a prospect is to converting, the less of your pitch they typically need to read.

So, as in all negotiations, once you’ve made your point, stop talking and let the prospect convince themselves.

Best A/B tests of 2013

Why We Loved This Test: Because it sparks an interesting and – depending on whom you’re talking to – often short debate on copy vs design.

Read more about this and other tests on I Love Split-Testing (by VWO)

Favorite Test 3: 11% More Clicks

Clean Button Beats Button Featuring World’s Strangest Icon

Favorited By: Joanna

Copywriter Michael Aagaard always shares the most fascinating A/B test results – and I’m looking forward to takin’ him on in the “CROpywriting Battle” at Conversion Jam 2014 in Stockholm.

In this test he shared on the [highly recommended] Unbounce blog, Michael showed how simplifying the visual design of a call-to-action button led to an 11% lift in clicks:

Favorite A/B Test of 2013

Why We Loved This Test: Because the control is sortuv hilarious. And because the verb “get” in a button wins yet again.

Read more about this and other button tests on Unbounce

Favorite Test 4: 211% More CTA Clicks

Who Says You Can’t Buy Friends?

Favorited By: Lance

There’s a good chance you’re too close to your own product or service to immediately recognize shortcomings in your copy. HubSpot shares a terrific example of how Friendbuy used data to identify the source of a poorly converting landing page and overcame the “forest for the trees” problem.


Why We Loved This Test: The folks at Friendbuy identified an amazing opportunity for their CTA, worked their theory, simplified the copy, and emerged victorious!

Read more about this and other tests on HubSpot’s Inbound Marketing

Favorite Test 5: 621% More Clicks

Download Is A Dirty Word

Favorited By: Lance

VWO shares a cautionary tale of choosing the wrong label for an important call to action — in this case, the seemingly innocent word, “Download”. The team at Price Charting tested a couple of alternate versions of their main CTA and were surprised by how much of a difference a couple of words can make.


Why We Loved This Test: Improved relevance = impressive conversion gains. Are your CTAs relevant to your visitors’ goal?

Read more about this and other tests on I Love Split-Testing (by VWO)

Favorite Test 6: 632% More Email Signups

Team Romney Actually Wins Something

Favorited By: Joanna

Although technically run in 2011-2012, this A/B test case study wasn’t published on Optimizely’s blog until 2013 – so it qualifies for this roundup.

For Team Romney, each new email address they collected was worth nearly $8 in donations or revenue. By moving their email sign-up fields into the prime real estate of the hero banner, the Republicans saw an impressive lift in leads.

Romney A/B test 2013

Why We Loved This Test: Because even the best-performing A/B tests can’t help sell certain products.

Read about 3 Romney A/B tests here

Favorite Test 7: 18% More Scheduled Appointments

High-Converting Copy Can Still Make People Smile

Favorited By: Lance

The debate about using clear copy versus clever copy will likely never be fully resolved, but we recently shared the results of a test that challenges assumptions. The lesson here was not which variation won, but that you should be brave enough to test something you may have serious doubts about.


Why We Loved This Test: Sometimes your best customers are those with hangovers.

Read more about clever versus clear on Copy Hackers

Favorite Test 8: 235% More Events Logged

Mobile Button UI Beats Blinds UI

Favorited By: Joanna 

If mobile is already here and taking over our world, then how come we’re seeing so few A/B tests in mobile environments? This test intrigued us largely because it was one of the few mobile case studies we saw this year.

With this UI test, the team at RunKeeper was trying to increase the number of non-running events tracked. Their winning treatment led to 235% more non-running activities (e.g., yoga) being tracked by the app’s users.

Mobile A/B Test 2013

Why We Loved This Test: Because the design of the winner is so elegant, we wanted it to win.

Read more about this mobile test on Localytics

Favorite Test 9: 34% Fewer Leads

Be Careful What You Ask For

Favorited By: Lance

For companies whose primary conversion metric is generating leads, there is typically a trade-off between the quality and quantity of leads. How much can you ask visitors to provide before they stop submitting a form?

The good folks at Monetate recently shared the results of introducing a phone number field into their lead form and the impact it had on the number of leads.


Why We Loved This Test: Monetate changed one variable on their form and got a clear signal from visitors about that change.

Read more about this and other tests on Monetate’s blog

Favorite Test 10: 144% More Form Submissions

“I Declare Bankruptcy!” — Michael Scott, The Office

Favorited By: Lance

It never gets old. Visitors tend to respond well to on-site messages that reinforce off-site messages. Marketing Experiments highlighted one such example of a client who brought its PPC ad headline copy into the landing page headline. The results, while intuitive, are surprising in magnitude.


Why We Loved This Test: This uber-simple headline change resulted in 2.5x more leads. Pay attention to the congruency between your PPC ads and your landing pages.

Read more about this and other tests on the Marketing Experiments blog

Favorite Test 11: 19% More Downloads

No Such Thing As A Silver Bullet (But There Are Better Bullets)

Favorited By: Lance

We appreciate Michael Aagaard’s transparency about the things he tests on his own site, ContentVerve.com. This compelling test involved telling visitors how much time they’d have to invest in a product to extract value.


Why We Loved This Test: It offers more proof that you can meaningfully impact visitors’ decisions with seemingly small copy changes.

Read more about this and other tests from Michael at ContentVerve.com

Favorite Test 12: 14% Fewer Downloads

Love for the Humble Text Link

Favorited By: Joanna

The cool peeps at Metageek let us run this test on their much-visited download page for inSSIDer. We ran 4 variations against the text-link-dominant control… and they all lost, but the one shown here was the most disappointing ‘loser’ for me.

A/B Test Loser

Why We Loved This Test: Because it runs counter to the CRO world’s long-held belief that an easy-to-click button will always beat a text link… and it opens up a new discussion about perceptions of ‘slick design’ as a UX flag.

Read more about this text link vs button test on Copy Hackers

Favorite Test 13: 95% More Clicks

The Asinine Button Color Test

Favorited By: Joanna

People hate button color tests. I’m one of ’em. But for the purposes of this year’s Summer of Buttons, we tested everything related to buttons – including their colors. And we found, time and again, that button color is pretty damn important.

This shouldn’t come as a surprise.

What is a surprise is just what a statistically significant difference the color of a button can make. Now, the following test wasn’t a pure button color test; the copy was also tested. But if you read the explanatory post, you’ll see why we feel good about saying that the higher the contrast your button has against its page – or the more noticeable it is in its environment – the better that may be for conversion. 
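For the curious: the post doesn’t show the math behind “statistically significant,” but the standard check for a test like this is a two-proportion z-test on the conversion rates of the control and the variation. Here’s a minimal sketch in Python using made-up numbers (the visitor and click counts below are hypothetical, chosen only to roughly match a ~95% lift; they are not the actual test’s data):

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-tailed two-proportion z-test for an A/B test.

    conv_a / n_a: conversions and visitors for the control
    conv_b / n_b: conversions and visitors for the variation
    Returns (z statistic, two-tailed p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-CDF tail via erf: P(|Z| > |z|)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 1,000 visitors per variation,
# control converts 40 of them, the variation converts 78 (~95% lift).
z, p = two_proportion_z(40, 1000, 78, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With numbers like these, p lands well below the usual 0.05 threshold – which is the sense in which a big button-and-copy lift can be called significant rather than noise.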

Button color test 2013

Why We Loved This Test: Because we hate button color tests! But we loved this! It changed our minds and softened our cold, jaded hearts…

Read more about this and other buttons tests on this Copyblogger post

For 2014, we’ve got even more free CRO help in store – so sign up here if you’d like to work with us without paying a penny.

And remember to leave a comment below for your chance to win the 3 great biz books we mentioned above. Cutoff for comments is Monday, Jan 6. (Make sure your Disqus name links to your Facebook or Twitter so we can reach you if you win.)

About the author

Joanna Wiebe – Copywriter and author of "Copy Hackers"

  • Ruben

    #1 because there was only one other one on the list that showed ‘paid lift’. Without knowing how much more revenue the others generated compared to 1 and 2, at the end of the day what matters is, “does it generate enough money to pay the bills, improve the service, and/or increase your marketing reach.”

  • James Barron

    That’s quite a list you have compiled; I think I had 3 favourites (it’s the jam experiment all over again). In third place was The Asinine Button Colour Test as the level of detail was impressive and went far beyond a colour test. The Mobile Test came second as anything about testing on mobile I find very interesting and there isn’t much on the subject. Big Image Beats Tiny, Squishy Content-Cram came first as it’s such a simple change that has a large impact. Everyone can make a change like this but you still see so many websites using tiny images. Worse still are tiny images that have an enlarge option but then show you the same tiny image.

    • Joanna Wiebe

      Tiny images kill me. Tiny text kills me. Tiny buttons kill me. There’s no need to make people squint to read your stuff or click your buttons!!

  • #7 for me. Especially if the message reaches prospective customers at the right time (right after New Year’s Eve is a good opportunity)

    • Joanna Wiebe

      haha! Yeah, I’m sure seasonality – which is a nice way to put “drunken idiocy” – has a lot to do with the success of a business that fixes cracked smartphones.

  • Jen

    Thanks for sharing all of these. My absolute favorite is #7. No surprise. I’m on a mission to let the world know that copy with personality and copy that converts don’t have to be mutually exclusive. As far as I’m concerned, they’re so much better together – kind of like peanut butter and jelly or martinis and olives.
    Sure, there’s no upside in being clever just for clever’s sake (as you wrote in that other post) but depending on the product or business, an added dose of personality can make a big difference. You just have to know who you’re targeting and be smart about it.

    Oh, and always test!

    • Joanna Wiebe

      Stellar mission, Jen! As more and more tests show that a little personality goes a long way, I hope your mission gets easier – and more peeps seek your help…

  • Lizard Jam

    I’ve always believed color theory to be a major player in anything I create that is meant to grab attention. So The Asinine Button Color Test gets my vote. The orange gives it that feel of “I better hurry and jump in on this”

    • Joanna Wiebe

      Yeah, the contrasting orange really works here. Of course, all button tests seem to be about contrast more than color, right?

  • Josh Margulies

    Once again, success boils down to the FUNDAMENTALS – you’ve got to know what the A/B test “sees.”

  • The “No Such Thing As A Silver Bullet” is my favorite from the group since it’s something I’ve noticed I enjoy in the last few days. I got a Kindle Paperwhite for Christmas and a feature I love is that it tells me how much time it will take for me to finish the chapter as well as the book. This often keeps me reading to finish the chapter when I would have otherwise quit.

    That case study is helpful on its own, but it is also a good reminder to notice what I enjoy about products and services and see if I can replicate and test those elements in my own copy.

    • Joanna Wiebe

      Kindle Paperwhite tells you that? Amazing. I read on my Kindle app on iPad, and it doesn’t do that. 🙁

  • Thanks for the mind bending inspiration on how to improve our site. Now all I need is to pause time for about 3 months to get some work done 🙂

    • Joanna Wiebe

      Well what biz owner doesn’t have 3 months to spare, Corey? 😉

  • Belle

    Thank you for curing my phobia of the word “get”! I love that you’ve included conversion metrics which makes it easier to justify copy decisions to the Powers That Be!!

    • Joanna Wiebe

      “Get” is the magic marketing word! It may be frowned upon in literary circles, but marketing ain’t literary. 🙂

  • Joanna Wiebe

    John didn’t mention it to you? Maybe he’s still working on the details. How funny!! He was saying the battle will see attendees give us both challenges for improvements to copy… then we take our shots at it… then the attendees actually go and test our variations and send the results to John (and us). Crazy, right? Crazy fun. 🙂

    Totally looking forward to meeting you, too! Have a great one.

  • Conor Pendergrast from Resifle

    I really like the RunKeeper UI test. It reminds me that UI and design changes can have a very positive impact on user actions (well, duh!). Need to keep this in mind for future dev 🙂

    • Joanna Wiebe

      I also thought the RunKeeper test was cool. We don’t see a lot of published UI studies – and we don’t tend to test in-app UI around here (as opposed to ecommerce, which we test constantly) – so it’s great to be able to share the few UI test case studies we find.

  • Robert Campbell

    Thanks for once again obliterating my copywriting sensibility.

    • Joanna Wiebe

      hahaha! That’s what we’re here for, Robert. 😉

  • Colleen Saunders Brown

    The discoveries you point out are so valuable because they tie ROI to design decisions.

    • Joanna Wiebe

      Totally, right? If you’re going to get people on board with investing in proper design (which includes copy), you need to show them the money…

  • Brandon Gilmore

    Great round up of AB tests. Love the diversity of the posts and the great results you found online and through your own experiments. Keep up the great work in 2014. I particularly enjoy the subtle differences and seeing the huge impact they can have.

    • Joanna Wiebe

      Thanks, Brandon! We’ve got some fun stuff planned for 2014 — most of it subtle. 🙂
