Pricing Page Optimization: Case Study Takeaways

  • Button copy matters, and it should ALWAYS be tested
  • Line-item clarity matters, and not all symbols are universally understood
  • Customer support is valued and should be featured on your pricing page

I recently had the pleasure – gut-wrenchingly terrifying as it was – of speaking at Copyblogger’s Authority Intensive about buttons…

One of the many case studies I discussed was for the pricing page of Mad Mimi, an awesome email marketing solution you should check out…

Because I barely scratched the surface of this particularly interesting A/B/C test in that presentation… and because pricing pages are so crucial to startup success… let’s take a look at how we optimized the Mad Mimi pricing page – so you can go forth and conquer your own…

Optimizing the Funnel Entry Point:

Are You Treating Your Pricing Page Like the Huge Opportunity It Is?

Pricing pages are my absolute favorite SaaS pages to optimize. They represent the point at which a visitor is getting very, very close to making the leap from prospect to customer – or at least to trial user – which can mean growth and success and happiness for startup teams.

When I’m testing a pricing table, I start with the basics, which I’ve seen work effectively time and again:

  1. Optimize the button copy
  2. Optimize the order of the plan options
  3. Ensure 1 plan looks most desirable instantly
  4. Simplify and/or clarify the line items

Those are 4 very simple tweaks that anyone on your team can make to run a test pronto. Unlike renaming plans and unlike price testing – both of which require technical work, putting them out of reach for many marketing teams – these 4 tweaks are straightforward enough to set up and run in any A/B testing tool. (I prefer VWO, which we used for the Mad Mimi test.)
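(If you’re curious what a tool like VWO is doing under the hood when it splits your traffic, the core mechanic is deterministic bucketing: hash each visitor’s ID so the same visitor always sees the same version. Here’s a minimal sketch in TypeScript – the function names and the even three-way split are my own illustration, not VWO’s actual code.)

```typescript
// Minimal sketch of deterministic A/B/C bucketing - illustrative only,
// not VWO's actual implementation. A real tool also handles cookie
// persistence, targeting rules and reporting.
type Variation = "control" | "variationB" | "variationC";

// Simple string hash (djb2) to turn a visitor ID into a stable number.
function hash(visitorId: string): number {
  let h = 5381;
  for (const ch of visitorId) {
    h = (h * 33 + ch.charCodeAt(0)) >>> 0;
  }
  return h;
}

// The same visitor ID always lands in the same bucket, so a returning
// visitor never flips between versions mid-test.
function assignVariation(visitorId: string): Variation {
  const bucket = hash(visitorId) % 3;
  return (["control", "variationB", "variationC"] as const)[bucket];
}
```

That stickiness matters: a visitor who sees the Control today and Variation B tomorrow muddies your data.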

Our goal with Mad Mimi was to increase clicks on any plan. It wasn’t to bring in paid conversions, completed signups or upgrades.

Why not?

Because the 4 tweaks I mentioned above aren’t meant to do that. And your goal must match your tactic.

Rather, those 4 tweaks are simply meant to help users get through a page easily. To grease up a funnel, as it were, by reducing friction.

They do the light lifting. They pluck the low-hanging fruit.

They don’t necessarily overcome objections or neutralize anxieties; they don’t amplify value; and they don’t highlight incentives, in most cases. And that’s okay. When you’re optimizing a funnel, it’s perfectly valid and useful – as you’ll see – to run tests that focus purely on helping your visitors’ lizard brains make better, faster decisions. Button copy and color tests may not persuade more people to choose you, but they’re likely to move more people from Home to Tour, from Tour to Pricing, and from Pricing to Create Account.

So, with that in mind, here’s the control for Mad Mimi:

CONTROL

[Image: Mad Mimi pricing table – Control]

Take a look at the Control. What would you test right off the bat?

Now, thinking of the 4 bullets I mentioned above, we developed the creative for Variation B, in which we:

  • Rewrote the button copy for the plans, changing it from “Sign Up!” to “Get Started Now” (not on the yellow sticky)
  • Reordered the plans from most expensive to least in order to anchor the price and leverage the primacy effect (more about this in my KISSmetrics post here)
  • Removed the Gold Plan from the table and subordinated it to a line of text above the Free Plan messaging; we felt confident subordinating the Gold Plan because its price was so high at $1,049/mo that it bordered on ‘enterprisey’
  • Clarified the line items by 1) replacing the infinity symbol, which may not be immediately understandable, with the word “unlimited” and 2) adding “premier support” to the list

Those 4 tweaks resulted in the following variation:

VARIATION B

[Image: Mad Mimi pricing table – Variation B]

After creating that variation, we felt unsatisfied with the number of changes. Having changed 4 things, how could we know which change caused the lift, if we got any lift at all?

So we developed a third variation – one that was much closer to the Control but that incorporated just 2 changes, which were also seen in Variation B:

  • We updated the button copy for the plans from “Sign Up!” to “Get Started” (not on the yellow sticky); this copy is shorter than Variation B’s “Get Started Now” because the 4-column layout here wouldn’t allow room for an extra 4 characters per button
  • We clarified the line items by 1) replacing the infinity symbol, which may not be immediately understandable, with the word “unlimited” and 2) adding “premier support” to the list

Those 2 changes to creative resulted in the following variation:

VARIATION C

[Image: Mad Mimi pricing table – Variation C]

As I mentioned, our primary metric was click-thru rate.

Because there were 5 buttons to track here, we set up these 5 goals:

  • Clicks Basic
  • Clicks Pro
  • Clicks Silver
  • Clicks Gold
  • Clicks “sign up” (the button under Customize a plan)
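(In VWO, you configure goals like these by pointing and clicking on each button, but conceptually, each goal is just a click listener on a specific element. A rough sketch follows – the selectors and the trackGoal function are hypothetical stand-ins, not Mad Mimi’s actual markup or VWO’s actual API.)

```typescript
// Conceptual sketch of per-plan click goals. The selectors and
// trackGoal() are hypothetical stand-ins; VWO sets these up visually
// and reports conversions through its own dashboard.
const goals: Record<string, string> = {
  "#basic-signup": "Clicks Basic",
  "#pro-signup": "Clicks Pro",
  "#silver-signup": "Clicks Silver",
  "#gold-signup": "Clicks Gold",
  "#custom-signup": "Clicks Sign Up (Customize a plan)",
};

function trackGoal(goalName: string): void {
  // In a real setup, this would call the testing tool's tracking API.
  console.log(`Goal reached: ${goalName}`);
}

// Attach one listener per button so each plan converts independently.
for (const [selector, goalName] of Object.entries(goals)) {
  document.querySelector(selector)?.addEventListener("click", () => {
    trackGoal(goalName);
  });
}
```

Tracking each button separately is what lets you see not just whether a variation won but which plans drove the win.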

So scroll on up the page to take a look at Control, Variation B and Variation C again…

…and think about which one you believe won for each goal.

Which should get more people to click Pro?

How about Silver?

Do you think it’s Variation B?

Or Variation C?

Or maybe Control beat both the variations?

The Rather Jaw-Dropping Results of This Pricing Table Test

Although we saw lift within the first 3 weeks for Variations B and C, we kept the treatments running in VWO a full 6 weeks to see if anything would change… but things only got better.

Now, for the drumroll…

The winner?

Both Variation B and Variation C outperformed the Control for the duration of the 6-week test. 

Only exception: the Sign Up goal. On that goal, we didn’t reach confidence. Can you guess why? It’s because we didn’t change anything there! Across Control, B and C, the “sign up” button on the yellow sticky note stayed exactly the same. So it’s a good thing there was no lift… or we’d have problems. 🙂

Here’s how the results for the other 4 goals shook out:

[Image: Pricing page test results]

On every single goal, both Variation B and Variation C beat the hell outta the Control. Massive lift.

Variation C outperformed Variation B on most goals, bringing in more conversions overall.

[Image: Pricing pages side by side]

It’s only on the Clicks Gold goal that Variation B performed best – and if you want to know what the Helsinki was going on there, read this post.

Finally, for skeptics, here’s a representative graph of the activity, where you can see Variation B and C rising above the Control and staying there…

[Image: Mad Mimi Plans and Pricing A/B test graph]

BTW, the only harm in running the test longer, as we did here, is that we may have given up prospects we could have won over had we called a winner when the significance calculators told us to. But it’s far better to run tests longer to guard against false positives. Which is why we did – and usually do…
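(What do those calculators compute? Most of them run a two-proportion z-test on the click-through rates of two versions. Here’s a minimal sketch with made-up numbers – it’s the standard math, not necessarily VWO’s exact internals.)

```typescript
// Minimal two-proportion z-test for click-through rates - the standard
// math behind most A/B significance calculators. All numbers below are
// made up for illustration, not Mad Mimi's actual data.
function zTest(
  clicksA: number, visitorsA: number,
  clicksB: number, visitorsB: number
): number {
  const pA = clicksA / visitorsA;
  const pB = clicksB / visitorsB;
  // Pooled rate under the null hypothesis that A and B convert equally.
  const pPool = (clicksA + clicksB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  return (pB - pA) / se;
}

// |z| > 1.96 corresponds to roughly 95% confidence (two-tailed).
const z = zTest(200, 10000, 260, 10000); // hypothetical counts
console.log(`z = ${z.toFixed(2)}, significant: ${Math.abs(z) > 1.96}`);
```

Call a winner on too few visitors and a fluke difference can clear that 1.96 bar – which is exactly the false-positive risk a longer test protects against.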

Why Did Variation C Outperform Variation B?

Wasn’t Variation B a Sure Thing???

If you’d asked me to put money on which of the 2 variations would perform better in this test, I would have laid $100 down on Variation B.

Easily.

After all, with Variation B, I applied all 4 of the basic tweaks that most often lead to winners, which you’ll recall are:

  1. Optimize the button copy
  2. Optimize the order of the plan options (in this case, including reducing options)
  3. Ensure 1 plan looks most desirable instantly
  4. Simplify and/or clarify the line items

Yet more people chose Basic, Pro and Silver when there were 4 options and when the least expensive product appeared first in the row. Interestingly, in Variation B, where we led with the most expensive solution in the table (Silver), lift on that expensive option was less than half of what Variation C achieved. Why??? What was going on??

We hypothesize that:

  • Mad Mimi’s visitors may be price-conscious. When we led with the $199 Silver plan, we may have inadvertently turned budget-conscious prospects off, anchoring the solution as “too expensive”.
  • Plan selection isn’t flexible for Mad Mimi prospects. Because one decides on a plan based on how large one’s list is right now, it may be harder to move people up and down among the plans – unlike with many SaaS solutions, where plans are based on the addition/subtraction of features and support levels.
  • The presence of an uber-expensive solution (Gold) may have positively impacted decision-making. Could there be something aspirational about choosing a $10 solution that’s just 3 steps away from the solution the ‘big guys’ use? Could the dramatic difference between the $1,049 solution and the $10 solution have made the $10 solution look like a no-brainer? And could the low-friction buttons then have sealed the proverbial deal?
  • The presence of a solution meant for large businesses (Gold) may have suggested a more robust email platform. Although I’ve seen loads of market research that says many small businesses want to stay small, it could be that online businesses – the type that might be considering Mad Mimi – want to grow and are more likely to wish to choose a solution that will grow with them vs one that ‘caps out’ at 50,000 contacts.
  • Silver performs better with a higher-priced option (Gold) to contrast against. We know we make decisions by comparing and contrasting options and narrowing option sets down to pairs, at best. In both variations, prospects could contrast Silver against another option: in Variation B, it was against Pro; in Variation C, it was against Gold. Perhaps Mad Mimi’s visitors felt more assured when contrasting Silver against a higher-priced plan than against a lower-priced plan.

Finally, given that Variation C beat the living daylights out of the Control, which it was very similar to, there’s no mistaking this: Button copy matters. Line-item clarity matters. And be sure to mention support. 

Any other takeaways you can think of?

If this were your test on your site, what would you do next, knowing what you now know? What should Mad Mimi do next?

~joanna