Are we listening to our customers’ voices?

30 April 2015

Usability testing is a funny thing. On the surface, it’s a vital key to unlocking the all-important “voice of the customer”. But how do you know it’s the customer’s voice you’re hearing?

When I built my first intranet, it never occurred to me to ask my ‘users’ what they thought of it. I just polished and published my pages and hoped for the best.

Back then, just getting people to look at the intranet was an achievement.

It wasn’t until I launched my first major corporate website that I thought to test it. I’d been reading Steve Krug’s Don’t Make Me Think and it got me thinking.

So I drummed up a bit of budget. I got a whole $500 to buy thank-you gifts for my test subjects. I think we gave them Gold Class movie tickets. And I cajoled our training team into recruiting test subjects for me from the customers they trained each day. A $50 ticket to spend an hour with me, telling me how easy (or hard) it was to complete simple tasks on our site.

Well… did I learn a lot!! The site wasn’t half as easy to use as I thought. None of my test subjects could find our Contact Us page (the third most frequent task on the site).

Embarrassing, but I certainly heard my customers’ voices. I sat right next to 10 of them and listened!

That was my first usability testing experience and the first time I sat in the facilitator’s seat.

My second time around was a bit more complicated and a lot more costly.

A year after my first tests, I went back to senior management and asked for $40k to hire a professional usability firm to run end-to-end testing on the new booking process for our training courses.

A bit of back story

When I first launched our website, customers booked training courses by:

  1. Sending an email to our training team.
  2. Getting a phone call back from the team.
  3. Discussing what they wanted and providing a credit card number.
  4. Done.

Then we ran a multi-million-dollar CRM project so that we could better manage our customers – those same training customers who had taught me so much about my site a year earlier.

With this new CRM came a new process for booking training courses, all online, “out of the box”. Now all our customers had to do was:

  1. Search for a course in our catalogue – assuming they knew what course they wanted to do.
  2. Sign up each person who wanted to do the course – you had to sign them up separately.
  3. Set up an account in our system – you know, contact person, address, phone number, name of your first-born son, etc.
  4. Then, and only then, could they see how much the training would cost, any discounts, etc.
  5. Finally they could pay for their course – if they had the right credit card – we only took Visa and MasterCard.

It was when I tried this process myself, on our staging site, that I went to our management team cap in hand.

Not my smartest move: I had no evidence that the process was flawed, only my gut instinct.

The new booking process was launched on our website with no usability testing. Six people out of every ten who tried to book a course dropped out before they completed their booking.

In one month, the training team had a drop in revenue of 40%.

Talk about voice of the customer! More like footsteps of the customer running away to find an easier way to book training courses.

I got my $40k.

When senior management watched recordings of the testing, they were happy to invest in a change to the journey.

The good, the bad, and the plain wrong

I was lucky in my first two forays into usability testing.

When I tried my hand at it, I had clear, easy-to-follow instructions from experts like Jakob Nielsen and Steve Krug, and I diligently followed their advice.

When I outsourced my testing, the company I used were true experts. They facilitated each session calmly and clearly without leading the subjects in any way. They were quick on their feet and followed the subject wherever they went, gently encouraging them to share their thoughts on each aspect of the process.

Both times, my customers’ voices came through loud and clear.

I only wish all my usability test experiences were as good.

More recently, I commissioned a different company to do some simple task analysis across the site I was managing. All we wanted to know was how easy it was to complete key tasks, mainly sales and support tasks, on our site. Not rocket science.

I won’t bore you with the details, but one incident stood out.

Towards the end of the third session (we were doing 10 in all), I realised that none of our test subjects had used our Search. Not one.

Thinking about it, I realised that not a lot of the visitors to our live site used Search either. I was curious as to why. Given that our Search box was rather large and very prominent, you’d think people would use it.

Was our navigation structure that good? Were people searching our site via Google? Hearing from our customers would be very useful right now.

I pulled the facilitator aside and asked her to add a question at the end of the next seven sessions. I even gave her the words:

“I notice you haven’t used the search at all. Could you please talk me through why you didn’t feel the need to search?”

Simple, right? Well, apparently not.

I sat through the next session. And no, the subject didn’t search. I waited at the end to see what the subject would say…

The facilitator gets to the end of the session and then:

“Could you complete this next task by using the search box please?”

What?!! That’s not what I told her to ask.

She wasn’t even listening to my voice, let alone the customers’.

Take out the middleman?

Facilitated usability testing can really fall on its face if you have the wrong facilitator. It’s so subjective.

So, what’s our alternative?

Take the facilitator out of the equation. Let the customer speak for themselves.

These days we have a whole range of options: non-facilitated usability testing applications are all over the web.

For those of you playing at home, these applications let you set up a set of tasks for subjects to complete on your site (or in your online application – if it’s publicly accessible in a browser, you can test it), add a series of questions, and then invite subjects to do the tasks, answer the questions and comment.

You can even ask your subjects to turn on their computers’ cameras and talk out loud as they complete your test.

I still have my reservations about these kinds of tests. 

They’re just a bit too cookie-cutter, a bit too artificial, for me.

Sure, they’re more objective. And there’s no facilitator to accidentally influence your subject or go off script. They’re cheaper too – easier to get budget from management.

But, to my mind, there are two downsides:

  1. Just like the site you’re testing, once you’ve set up your test and hit Go, the test is on its own. You can’t change anything and you can’t see what’s happening.

    So if a test subject gets stuck, or something goes wrong, they’re as alone as they would be on your website. There’s no one to ask for help. And you’ll only know about it after the test is over.

    I can almost hear the usability community thinking up how they can test the usability of usability tests.

  2. Even if everything goes to plan, I don’t think the feedback you get from your test subjects is as rich.

    Most of their feedback will be typed – either comments or answers to your questions. Very few people are more talkative when typing than when speaking out loud, so you’re not going to get as much feedback as you would from a facilitated test.

    “Ah,” you say, “what about the camera? Surely you can get feedback from them talking to the camera?” How many people do you know who are comfortable talking to themselves? Particularly when they’re not being prompted.

    Again, I’m not seeing the quantity or quality of feedback you can get from a facilitated test.

So how do we listen to our customers’ voices? 

It seems to me that we have two choices:

  1. Trust our facilitators to stay neutral, not influence the test subjects, and act as the voice of the customer.
  2. Take away the facilitators and turn to the online tests. Our customers’ voices will come out as sound bites and statistics. But at least we’ll hear them.

And that’s the point, really: any tool that lets you hear what your customers are saying about your site is better than none.

How else will you know if they can find your Contact Us page?