How We Optimized the Mad Mimi Pricing Page

I recently had the pleasure – gut-wrenchingly terrifying as it was – of speaking at Copyblogger’s Authority Intensive about buttons…

One of the many case studies I discussed was for the pricing page of Mad Mimi, an awesome email marketing solution you should check out…

Because I barely scratched the surface of this particularly interesting A/B/C test in that presentation… and because pricing pages are so crucial to startup success… let’s take a look at how we optimized the Mad Mimi pricing page – so you can go forth and conquer your own…

Optimizing the Funnel Entry Point: Are You Treating Your Pricing Page Like the Huge Opportunity It Is?

Pricing pages are my absolute favorite SaaS pages to optimize. They represent the point at which a visitor is getting very, very close to making the leap from prospect to customer – or at least to trial user – which can mean growth and success and happiness for startup teams.

When I’m testing a pricing table, I start with the basics, which I’ve seen work effectively time and again:

  1. Optimize the button copy
  2. Optimize the order of the plan options
  3. Ensure 1 plan looks most desirable instantly
  4. Simplify and/or clarify the line items

Those are 4 very simple tweaks that anyone on your team can make to run a test pronto. Unlike renaming plans and unlike price testing - both of which require technical work, putting them out of reach for many marketing teams - these 4 tweaks are straightforward enough to set up and run in any A/B testing tool. (I prefer VWO, which we used for the Mad Mimi test.)

Our goal with Mad Mimi was to increase clicks on any plan. It wasn’t to bring in paid conversions, completed signups or upgrades.

Why not?

Because the 4 tweaks I mentioned above aren’t meant to do that. And your goal must match your tactic.

Rather, those 4 tweaks are simply meant to help users get through a page easily. To grease up a funnel, as it were, by reducing friction.

They do the light lifting. They pluck the low-hanging fruit.

They don’t necessarily overcome objections or neutralize anxieties; they don’t amplify value; and, in most cases, they don’t highlight incentives. And that’s okay. When you’re optimizing a funnel, it’s perfectly valid and useful – as you’ll see – to run tests that focus purely on helping your visitors’ lizard brains make better, faster decisions. Button copy and color tests may not persuade more people to choose you, but they’re likely to move more people from Home to Tour, from Tour to Pricing, and from Pricing to Create Account.

So, with that in mind, here’s the control for Mad Mimi:

CONTROL

Pricing Table Optimization - Control

Take a look at the Control. What would you test right off the bat?

Now, thinking of the 4 bullets I mentioned above, we developed the creative for Variation B, in which we:

  • Rewrote the button copy for the plans, changing it from “Sign Up!” to “Get Started Now” (not on the yellow sticky)
  • Reordered the plans from most expensive to least in order to anchor the price and leverage primacy effect (more about this in my KISSmetrics post here)
  • Removed the Gold Plan from the table and subordinated it to a line of text above the Free Plan messaging; we felt confident subordinating the Gold Plan because its price was so high at $1,049/mo that it bordered on ‘enterprisey’
  • Clarified the line items by 1) replacing the infinity symbol, which may not be immediately understandable, with the word “unlimited” and 2) adding “premier support” to the list

Those 4 tweaks resulted in the following variation:

VARIATION B

Pricing Table Optimization for Mad Mimi

After creating that variation, we felt unsatisfied with the number of changes. Having changed 4 things, how would we know which change caused the lift, if we got any lift at all?

So we developed a third variation – one that was much closer to the Control but that incorporated just 2 changes, which were also seen in Variation B:

  • We updated the button copy for the plans from “Sign Up!” to “Get Started” (not on the yellow sticky); this copy is shorter than Variation B’s “Get Started Now” because the 4-column layout here wouldn’t allow the room for an extra 4 characters per button
  • We clarified the line items by 1) replacing the infinity symbol, which may not be immediately understandable, with the word “unlimited” and 2) adding “premier support” to the list

Those 2 changes to creative resulted in the following variation:

VARIATION C

Mad Mimi Pricing Table Optimization

As I mentioned, our primary metric was click-thru rate.

Because there were 5 buttons to track here, we set up these 5 goals:

  • Clicks Basic
  • Clicks Pro
  • Clicks Silver
  • Clicks Gold
  • Clicks “sign up” (the button under Customize a plan)
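A quick technical aside before the reveal: when I talk about “reaching confidence” on goals like these, the math under the hood of most A/B significance calculators is a two-proportion z-test on the click-through rates. Here’s a minimal sketch in plain Python – the visitor and click numbers are hypothetical, not the actual Mad Mimi data:

```python
import math

def two_proportion_z(clicks_a, visitors_a, clicks_b, visitors_b):
    """Two-sided z-test for a difference in click-through rates
    between a control (a) and a variation (b)."""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/4000 clicks on Control vs 165/4000 on a variation
z, p = two_proportion_z(120, 4000, 165, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the p-value falls under your threshold (0.05 is the common default), the calculator declares the lift “significant.” That’s all “confidence” means here.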

So scroll on up the page to take a look at Control, Variation B and Variation C again…

…and think about which one you believe won for each goal.

Which should get more people to click Pro?

How about Silver?

Do you think it’s Variation B?

Or Variation C?

Or maybe Control beat both the variations?

The Rather Jaw-Dropping Results of This Pricing Table Test

Although we saw lift within the first 3 weeks for Variations B and C, we kept the treatments running in VWO a full 6 weeks to see if anything would change… but things only got better.

Now, for the drumroll….

The winner?

Both Variation B and Variation C outperformed the Control for the duration of the 6-week test. 

Only exception: the Sign Up goal. On that goal, we didn’t reach confidence. Can you guess why? It’s because we didn’t change anything there! Across Control, B and C, the “sign up” button on the yellow sticky note stayed exactly the same. So it’s a good thing there was no lift… or we’d have problems. :)

Here’s how the results for the other 4 goals shook out:

Pricing page test results

On every single goal, both Variation B and Variation C beat the hell outta the Control. Massive lift.

Variation C generally outperformed Variation B, bringing in more conversions overall.

Pricing pages side-by-side

It’s only on the Choose Gold goal that Variation B performed best – and if you want to know what the Helsinki was going on there, read this post.

Finally, for skeptics, here’s a representative graph of the activity, where you can see Variation B and C rising above the Control and staying there…

Mad Mimi Plans and Pricing a/b test graph

BTW, the only harm in running a test longer, as we did here, is opportunity cost: we may have given up prospects we could have won over had we called a winner as soon as the significance calculators told us to. But running tests longer protects you against false positives. Which is why we did – and usually do…
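For the curious, those calculators typically rest on a standard sample-size (power) calculation: how many visitors per variation you need before you can trust a lift of a given size. Here’s a rough sketch, with alpha = 0.05 and 80% power hard-coded and made-up rates – none of these numbers are Mad Mimi’s:

```python
import math

def sample_size_per_variation(base_rate, relative_lift):
    """Approximate visitors needed per variation to detect
    `relative_lift` on `base_rate` with a two-sided test at
    alpha = 0.05 and 80% power (z quantiles hard-coded)."""
    z_alpha = 1.96  # two-sided, alpha = 0.05
    z_beta = 0.84   # power = 0.80
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# e.g. to detect a 20% relative lift on a 3% click-through rate:
print(sample_size_per_variation(0.03, 0.20))
```

Notice how the required sample balloons as the lift you’re hunting for shrinks – which is exactly why underpowered tests, stopped early, produce so many false winners.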

Why Did Variation C Outperform Variation B?

Wasn’t Variation B a Sure Thing???

If you’d asked me to put money on which of the 2 variations would perform better in this test, I would have laid $100 down on Variation B.

Easily.

After all, with Variation B, I applied all of the basic 4 tweaks that most often lead to winners, which you’ll recall are:

  1. Optimize the button copy
  2. Optimize the order of the plan options (in this case, including reducing options)
  3. Ensure 1 plan looks most desirable instantly
  4. Simplify and/or clarify the line items

Yet more people chose Basic, Pro and Silver when there were 4 options and the least expensive product appeared first in the row. Interestingly, in Variation B, where we led with the most expensive visible solution (Silver), lift on that expensive option was less than half of what it was in Variation C. Why? What was going on?

We hypothesize that:

  • Mad Mimi’s visitors may be price-conscious. When we led with the $199 Silver plan, we may have inadvertently turned budget-conscious prospects off, anchoring the solution as “too expensive”.
  • Plan selection isn’t flexible for Mad Mimi prospects. Because one decides on a plan based on how large one’s list is right now, it may be harder to move people up and down among the plans - unlike with many SaaS solutions, where plans are based on the addition/subtraction of features and support levels.
  • The presence of an uber-expensive solution (Gold) may have positively impacted decision-making. Could there be something aspirational about choosing a $10 solution that’s just 3 steps away from the solution the ‘big guys’ use? Could the dramatic difference between the $1,049 solution and the $10 solution have made the $10 solution look like a no-brainer? And could the low-friction buttons then have sealed the proverbial deal?
  • The presence of a solution meant for large businesses (Gold) may have suggested a more robust email platform. Although I’ve seen loads of market research that says many small businesses want to stay small, it could be that online businesses – the type that might be considering Mad Mimi – want to grow and are more likely to wish to choose a solution that will grow with them vs one that ‘caps out’ at 50,000 contacts.
  • Silver performs better with a higher-priced option (Gold) to contrast against. We know we make decisions by comparing and contrasting options and narrowing option sets down to pairs, at best. In both variations, prospects could contrast Silver against another option: in Variation B, it was against Pro; in Variation C, it was against Gold. Perhaps Mad Mimi’s visitors felt more assured when contrasting Silver against a higher-priced plan than against a lower-priced plan.

Finally, given that Variation C, which was very similar to the Control, beat the living daylights out of it, there’s no mistaking this: Button copy matters. Line-item clarity matters. And be sure to mention support.

Any other takeaways you can think of?

If this were your test on your site, what would you do next, knowing what you now know? What should Mad Mimi do next?

~joanna

 

  • Frank C. Siraguso

    Do initial caps make a difference? Does it depend on context?
    These are the kinds of things that drive people like me (writers) bananas:

    The buttons on B say, Get Started Now
    The buttons on C say, Get started

    • Joanna Wiebe

      It might all make a difference, yes. And if I can drive just one writer bananas per day, I’ve done my job. ;)

      • Frank C. Siraguso

        I love it when you’re strict! Sooo, ini caps or no?

      • Joanna Wiebe

        Impossible to know! Must be tested. :)

  • Joanna Wiebe

    Thanks, Adolf! I’ll do my best. :)

  • Pete

    Var. B: Could the plan names have monkeyed your results a bit? Here the naming convention shifts from levels to colors (Basic, Pro, Silver, Gold). Since Var B delegated the Gold to a secondary position and unbalanced the naming convention pairings, may the shift have disoriented the visitor and created a ‘what’s behind the Silver door? Where’s the Gold door? Is there a platinum door?’ scenario (slide 22 from your presentation)? Would MadMimi be open to a test in which you remove the level names and simply identify the number of subscribers; 500, 10,000, 50,000 and 350,000? I wonder if just naming the plans what they are would minimize the anxiety.

    • Joanna Wiebe

      You referenced my prez in your comment! I officially love you, Pete. :) I think that’s a very, very interesting hypothesis. I can def write to Emma @ Mad Mimi and see if they’re okay with renaming their plans — if there aren’t tech issues, etc. Kudos (or blame :) ) goes to you!

  • Frank C. Siraguso

    “Could the dramatic difference between the $1049 solution and the $10 solution have made the $10 solution look like a no-brainer?”
    Maaayyybeee 500 contacts/$10 vs 2,500 contacts/free sounded like a better no-brainer and users ignored the $1049 plan.

    • Joanna Wiebe

      Yeah, that’s a great point, Frank. My bad for not making clicks on the Free plan a goal — totally should have. That might have shed a little light on what happened here.

  • Gidget Media

    I just went and checked out your Copyblogger talk. Love the lizards! Thanks so much for sharing.

    • Joanna Wiebe

      Thanks! Glad you like. :)

  • Ruben Aguirre

    Joanna, I’ve noticed that some of the tests you run aren’t kept on the sites. Why would MadMimi still have the old version on their website? When we find something that works, we stick with it and don’t look back (yet continue looking forward for more improvements).

    • yazinsai

      Ditto. Just noticed it here: https://madmimi.com/service_agreements/choose_plan

      Maybe they’re A/B testing and we both got the dud?

      • Joanna Wiebe

        Mad Mimi actually is running another A/B test on its pricing page right now. :) But you’re totally right – a lot of bizzes never actually update their page when a test wins. (Acuity Scheduling, I’m looking at you!!) Why that is the case is another post entirely, I’m sure – and one I’d have to research extensively because I have noooo idea what’s holding people back from hard-coding winners.

      • http://inbounddealer.com Vince Green

        One word, 3 letters – Ego.
        Have you tried gaining agreement with clients prior to the test that the changes will be made? Or might that threaten the work prior to actually doing it?

  • Joanna Wiebe

    You’d think, right? I was thinking the same thing. But then why didn’t the Control perform much better? These are not small increases in clicks. So if the presence of Gold has positive effects, you’d think Variation B (subordinated Gold) would’ve tanked. But it didn’t. Any ideas why not?

  • http://www.designhandyman.com/ DesignHandyman

    I love this, Joanna. Thanks for sharing.

    • Joanna Wiebe

      Glad you dig it! Hopefully it’ll help you on your pricing (or even catalog) page. :)

  • http://www.designforfounders.com/ Heidi Pungartnik

    I honestly expected the B version to win simply because it has only 3 options instead of 4. But I guess in this case this factor didn’t mean as much since, as you already noted, customers choose a plan based on their subscriber count.
    I always enjoy these write-ups. Give us more :)

    • Joanna Wiebe

      Reducing 4 to 3 has worked SO many times for me. I was floored. But, yeah, it makes sense why it didn’t work. :) Lance saw the same thing when he was optimizing the TurboTax pricing page, too (where you choose a plan based on your tax situation, which is what it is).