Missed Connections: I Saw You Testing My Website
Usability testing is like flossing: everyone knows it's important, but few do it as often as they should. And let's face it: building sites is a lot more exciting than testing them. This past week we ran our first round of in-house usability tests, and we want to share how we did it, what we learned, and what we need to improve.
This presentation by Meetup on Lean Usability was a big help in putting together our initial plans. It does a good job of laying out the pros and cons of several approaches and points out what you should focus on at the beginning, before you have any experience to guide you. Read it.
How we did it
- We chose to test an important form that we were in the process of redesigning for a client. Since we had differing opinions about certain aspects of the redesign, it was a prime candidate for usability testing.
- We decided to use Silverback to capture the test sessions. We liked that it records audio and video of both the user and the screen, giving us an objective record of each session that we could review later.
- We placed an ad in the "volunteers" section on craigslist. The message gave a quick summary of who we were and what we needed volunteers to do, and directed them to a Wufoo form where they provided their information. We offered $30 for about 30 minutes of their time and received almost 70 responses in less than 24 hours.
- We decided to schedule 5 sessions, one right after the other, during one afternoon. We picked 5 people whose schedules worked with ours and emailed them to let them know we had scheduled a session for them. We also emailed everyone we didn't select, thanking them for contacting us.
- Before the sessions, we wrote up scenarios to give each user. We settled on three tasks, each more complex than the last, each focusing on a key point of the client's form we were redesigning. We also prepared a release form letting testers know that we'd be recording audio and video of their session, that the recording would be viewed by thoughtbot and our client, and that it would not be used for any other purpose.
What we learned
- We sent a confirmation message the morning of the testing to all the testers, and ended up having 4 out of 5 show up.
- The scenarios/instructions are important. Our first tester was completely stumped by the first scenario, and we realized that we had not been as clear as we could have been. We quickly modified the instructions before the next session.
- Silverback has drawbacks. Each of our sessions ran 22 to 30 minutes and takes up more than a GB of hard drive space. Exporting video of a session also takes a long time: 2 to 3 hours on a MacBook with a dual-core processor and 2 GB of RAM, and still over an hour on a maxed-out MacBook Pro.
- It takes time. Recruiting testers, writing scenarios, running sessions, exporting video, sending the results -- all of these things take time. Make a realistic estimate of how much time you'll actually be investing in doing usability testing.
What to improve
- The users we recruited for this round of testing didn't need any special qualifications. As we do more testing, we'll likely encounter situations where we need users with specific knowledge related to the site they'll be testing. (Running usability tests on Hoptoad would present this problem.) It's unclear whether craigslist will work in those cases.
- While it's impossible to completely eliminate the confusion that instructions can introduce, we need to do a better job of writing clear scenarios.