UPDATE: August 5, 2014

I recently received a note from the Peek team:

As you can imagine, coming up with questions that would return useful feedback on any website was not an easy task. Once we had some, we were kind of wary to change them (since Peek was working and we were getting positive feedback), but your suggestions just made a lot of sense to us. Based on your recommendations we worked with our research team to tweak the questions on Peek.

It’s just amazing to me that a successful company like UserTesting.com (no affiliation) could read our post and take action. That’s what Joanna and I love about writing these posts — the ability to influence others in a positive way. Thank you for reading, and for the follow-up, Team Peek! Best of luck with your new service.


 

If you’re reading this post, it’s very likely that you care about how people use and perceive your website. It’s also likely that you spend some portion of your time optimizing your site.

Effective optimization requires input. In other words, you need to know where to focus and what to “fix.” There are many potential sources of optimization input, but all of them should involve your visitors: email or on-page surveys, chat session transcripts, A/B tests, heat maps, session video captures, analytics, and user tests.

In my 15 years of optimizing websites, I’ve tried every single technique. Multiple times. And nearly every “CRO” tool out there (trying out a new CRO tool is like a massive sugar rush for me, so if you’re thinking of trying one out, shoot me an email – lance at this domain – and I’ll give you my unfiltered opinion).

Of all the available options, my favorite technique is still usability testing (or simply user testing). Traditionally, usability testing involved rounding up 8-12 customers or prospective customers in an on-site or rented lab and moderating a 1-hour session with each individual.

We would watch people try to complete several tasks using the software/website being tested, recording their mouse movements and computer screen – as well as their faces – and gently prompt participants for feedback when they were experiencing a strong emotion.

The metrics we’d collect in a typical study included:

  • Task success
  • Time on task
  • The number of issues encountered and their severity
  • Some type of satisfaction rating that we could tie back to their performance

Read this part if you’re a CRO keener:

Surveys, heat maps, analytics, and A/B tests are quantitative research techniques, requiring large samples to measure differences in the data. Additionally, surveys are attitudinal (i.e., people are asked for their opinion), whereas heat maps, analytics, and A/B tests are behavioral (i.e., there’s no direct feedback).

User testing is a form of qualitative research that relies on behavioral and attitudinal metrics. Bonus!

A new form of user testing has emerged over the past decade: remote, unmoderated user testing, a derivative of the traditional lab-based method. This newer approach addresses common objections to running an in-person study:

1) High cost… it’s expensive to rent a lab or purchase the equipment, although this has improved dramatically over time

2) Turnaround time… traditional lab studies require moderator travel and participant recruitment

3) Available expertise… effective moderators require training and experience

Remote testing usually involves desktop video and audio capture of participants as they follow a pre-configured script of tasks and follow-up questions that you create.

The process is asynchronous, too: you hit the “Go” button and, sometime later, you receive a video of the session. Everything outside the user test itself is typically handled by software.

With this type of testing, you’ll likely want to use a third-party service to run the study. Joanna and I love UserTesting.com.

With UserTesting.com, it’s super easy to set up a study.

You choose a web page as the starting point for users (it could be your own site, a competitor website, or even Google if you want to see how people might find your site), select the “tasks” for users to complete (using their pre-fab templates or by creating your own custom tasks), make some basic demographic selections, and launch.
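To give you a feel for the shape of such a study, here’s a hypothetical test plan expressed as data (a sketch in Python – the structure and field names are our own invention, not UserTesting.com’s actual format):

```python
# A hypothetical remote, unmoderated test plan. The structure and field
# names are illustrative only, not UserTesting.com's actual format.
test_plan = {
    "starting_url": "https://example.com",  # your site, a competitor, or Google
    "participants": 5,
    "demographics": {"age_range": (25, 45), "country": "US"},
    "tasks": [
        "Spend a few moments to get a sense for what the website offers.",
        "Find a product that interests you and add it to your cart.",
    ],
    "follow_up_questions": [
        "What stood out to you on this website?",
        "What, if anything, frustrated you during this exercise?",
    ],
}
```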

In about an hour, you’ll have a 10- to 15-minute video of someone completing tasks on your website, complete with audio soundtrack (audio can be extremely helpful, because users are encouraged to “think aloud” as they move through the tasks) and written responses to any associated follow-up questions.

Doing this with 5 users will cost you ~$250 USD (about $50 per participant), and while that may sound pricey, if you set up your test correctly, the ROI could be massive.

To demonstrate the potential value of remote user testing on a larger scale, the good folks at UserTesting.com recently developed a service that lets you get a taste of their solution, but at no cost. It’s called Peek.

You can head on over to their site to claim your free, 5-minute user testing video – for any site of your choosing. We did. So did the team at Inbound.org (the marketing resource co-founded by Rand Fishkin of Moz and Dharmesh Shah of HubSpot).

Here’s the Peek video that was shared publicly on Inbound:

[Embedded video: User testing services reviewed]

And here’s the video of our Peek review (on User Hue):

[Embedded video: Peek user testing]

After reviewing our own video and Inbound’s, it became clear that we should share our impressions of this new service, so that you know what to expect and how to make the most of the test results.

(As you read our feedback, keep in mind that we’ve been using UserTesting.com for 6 years – for our own projects, our clients, and at places like Intuit, where Joanna and I spent 5+ years in-house).

Our biggest questions/concerns about Peek are:

1) Is the site you want to test relevant to the “random” participant?

For a site to be useful, it has to have (1) utility (by offering the features you need) and (2) usability (by making it easy to find and use the desired features). If the site you’re testing isn’t relevant to a particular user, it’s not going to be very useful (to them).

As such, it’s going to be challenging to get reliable “usefulness feedback” from someone who hasn’t expressed a need for a particular type of product, service, or site.

With Peek, it’s our understanding that you’re getting one of their existing panelists to take your test – with no assurance of relevancy. That’s okay, as long as you understand it going in.

With the full-featured UserTesting.com offering, you can specify key requirements for participants (e.g., you must be interested in using software to complete your income taxes), and you have the option to recruit your own site visitors for tests. But for the free Peek version, this is going to be an issue.

2) Are the participant’s answers reliable / representative?

With any user test, the quality of your results is a function of the follow-up questions you ask. This is especially true with Peek, because the test consists only of follow-up questions, not tasks (see our next big question).

Here are the questions Peek poses to participants:

Q1: What’s your first impression of this site? What is the site for?

Copy Hackers’ take on it: These are perfectly reasonable questions to ask, provided the participant gets the right guidance… meaning there are only certain kinds of feedback you can reliably give about your first impressions, and they tend to be visual in nature.

Q2: What would you do next on the page? Describe your experience.

Copy Hackers’ take on it: Asking about future intent is a problem. People are pretty bad at predicting their future behavior. And they are only slightly better at recalling the past. It would be better if this question were turned into a task:

Please review the page and click on anything that interests you. Why did you click those elements? (It’s okay if nothing interests you, but please let us know.)

Q3: What did you like? Not like? Would you return to the site in the future?

Copy Hackers’ take on it: If you watched the Peek video for Inbound.org, you would’ve heard the following feedback:

“I like the colors”

“I don’t like the layout of this”

With only 1 participant in your test, what will you do with this type of feedback? It’s too subjective and impossible to act on.

The first two questions would’ve been better posed as:

“What stood out to you on this website? What, if anything, frustrated you during this exercise?”

The “Would you return?” part is also problematic. Normally, we’d suggest that the question be rephrased, “How likely are you to return?” but since we know the participant is not necessarily a target customer of the website, the answer would also not be reliable. As such, we’d recommend removing that final question.

3) How is task success measured? (Or is there a task?)

Based on the above questions that Peek poses to participants, this test is not a standard user test. You might call the first question a task, but it’s really more of an implied task. In reality, it’s a follow-up question to a task that is undefined (at least as far as the video shows).

Without an explicit task (e.g., “Spend a few moments to get a sense for what the website offers.”), participants appear to struggle to give useful feedback – and end up focusing on what they like or don’t like.

What we’ve noticed while watching several Peek videos is that participants tend to talk about what they’re seeing without really spending any time absorbing what’s on the page. If you’ve ever tried it, you know it’s impossible to read a newspaper while talking.

In closing…

UserTesting.com is a wonderful service, but Peek has some problems to solve before we can give it a ringing endorsement, which we’re looking forward to doing. (BTW, we’re in no way affiliated.) That’s not to say you shouldn’t run right over there to try it out. Do it! But when you’re reviewing your free video, just remember that Peek is a feature-limited taste of a much larger service that, when used correctly, could be invaluable to your CRO activities.

Here’s the bottom line for your optimization efforts: every tool in the existing CRO suite comes with its own quirks and compromises, and you need to be aware of the pitfalls so you don’t take action that harms your conversion rate instead of helping it. Naturally, much of what you find could become fodder for A/B tests…

Joanna and I would love to hear about your own experiences with user research tools (any kind). What’s worked well for you? What techniques would you pass on in the future?

Happy optimizing!

~lance

PS: Like reviews of biz services? Check out our popular review of Stripe payments (we’re not affiliates).