A few months back I blogged about getting the teams up to speed with user testing. Since then we’ve tried a few things out and learnt a bit along the way about planning and running the tests. We’ve put a fair bit of effort into getting processes up and running, but hopefully that initial investment of time will pay off and make our next round of testing much more straightforward.
Before we started our usability testing we did quite a bit of analysis of our Google Analytics and customer feedback data, to find out what the top customer tasks are on the site (i.e. the things our customers do most frequently). We’ve based our user testing around a selection of the top 20 customer tasks.
Writing the test plan and preparing materials is important for any research, because you need to be clear about your objectives for the testing. It’s a good opportunity to think about exactly why you are doing the user testing and what you hope to find out about your users.
For example I wanted to know how our users perceive our web site on a first visit. So we asked participants to spend the first minute or two of the test looking at our home page and telling us what type of things they thought they could do on the site. This was very revealing and made us realise that our current home page really doesn’t say what it does “on the tin”, so to speak.
From our Google Analytics data we knew that around 50% of our website traffic comes via search engines. I wanted to know how findable some of our top tasks are, whether users search in Google and deep-link into the site or start at our home page and browse from there. So in the test plan I proposed that 50% of participants start the tasks from Google and 50% start from our home page.
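If you want to keep that split even while still assigning participants at random, a tiny helper does the job. This is just a sketch; the function and condition names are my own, not part of any test plan:

```python
import random

def assign_start_conditions(participants):
    """Randomly assign participants to start conditions, keeping a 50/50 split.

    Shuffles the participant list, then alternates between the two
    conditions so each gets (at most one more than) half the participants.
    """
    shuffled = participants[:]
    random.shuffle(shuffled)
    conditions = ["google_start", "home_page_start"]
    return {p: conditions[i % 2] for i, p in enumerate(shuffled)}
```

With ten participants this always yields five in each condition, while the ordering stays random.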
I also wanted to collect some demographic information and assess user satisfaction, so I decided to create a pre-test and post-test questionnaire.
As well as the plan we had to:
I’d forgotten how much admin is involved in user testing: if you plan to recruit your own participants, don’t underestimate the effort involved. You can pay a specialist company to recruit participants who match your participant profile, but obviously you pay a fee to the recruiter.
Our first round of recruitment was pretty unsuccessful. We linked to a sign-up form from the home page of our site and advertised for participants in libraries. We offered participants an incentive (some tokens for a high street store), but even that wasn’t enough to entice residents to sign up! So we resorted to recruiting participants face-to-face, one at a time, which has been far more successful.
Ideally, participants should be contacted by phone a week before and again the day before the session, so you don’t run the risk of no-shows.
We finally invested in a copy of TechSmith’s Morae for our team laptop, which enables us to record user tests. The software records real-time screen capture (e.g. mouse clicks and cursor movements), audio and video (a small video of the participant’s face is shown in the bottom right corner of the screen). You could use Silverback to do the same job (and it’s cheaper), but it’s Mac-only and we are PC-based at work.
So far we’ve done some user testing sessions and have more planned for this week. First impressions of Morae are very good. As a moderator you can facilitate the session far more easily, without worrying about having to write notes. The note taker was able to focus on logging the times when users experienced problems using the site during the tests, so we could add markers into the Morae timeline afterwards. I have since found out that we could have done this automatically using a Wii remote, so perhaps we can try that next time.
We ran the tests in the local library on a laptop with Morae Recorder installed, connected to the Council network. Using Morae’s Observer software, colleagues back at the office were able to log in and observe the tests in real time from a PC. This is a real breakthrough, and there is great potential for getting Council services to observe tests remotely, without putting off participants by being in the same room and wincing visibly or, worse still, offering to help when users struggle with their section of the site. We can take the laptop to any Council offices or premises that have a Council network connection, so effectively we now have a mobile testing lab on our laptop!
Another tool we have tried using is Loop11, a DIY remote, unmoderated user testing tool. By setting up tests in Loop11 our participants can test the site from the comfort of their own home in a more relaxed environment. We set up a number of information seeking tasks in Loop11. The interface was very easy to use and setting up the tests was straightforward.
The task instruction sits in a green bar at the top of the browser window, and the participant carries out the task on your site in the main frame below it.
We set up the test so that participants have to answer a multiple choice question about the task they have just completed and then indicate how easy or difficult they found the task with a scalar question. Participants then click a ‘task complete’ or ‘abandon task’ button to proceed to the next task.
Loop11 records the time taken to complete a task and success or failure rate. We opted for Loop11 because, realistically, it was the only software we could afford to use. You can run the same test more than once, so for us it’s an excellent way to benchmark user task performance and measure improvements we make to the web site.
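Benchmarking boils down to simple arithmetic on those per-task numbers. As a rough sketch (the row layout and field names here are my own invention, not Loop11’s actual export format), success rate and median completion time per task could be computed like this:

```python
from statistics import median

# Hypothetical results rows: (task name, completed?, time in seconds).
# In practice these would come from the testing tool's export.
results = [
    ("find library opening hours", True, 42.0),
    ("find library opening hours", False, 95.0),
    ("renew a library book", True, 61.0),
    ("renew a library book", True, 48.0),
]

def benchmark(rows):
    """Per-task success rate, plus median completion time for successful attempts."""
    stats = {}
    for task, completed, seconds in rows:
        entry = stats.setdefault(task, {"attempts": 0, "successes": 0, "times": []})
        entry["attempts"] += 1
        if completed:
            entry["successes"] += 1
            entry["times"].append(seconds)
    return {
        task: {
            "success_rate": e["successes"] / e["attempts"],
            "median_time": median(e["times"]) if e["times"] else None,
        }
        for task, e in stats.items()
    }
```

Running the same computation on results gathered before and after a redesign gives you a like-for-like comparison of task performance.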
We haven’t analysed the data from our user testing yet, so we’ve yet to discover how much time that part of the process takes. I’m also conscious that we’re relatively novice researchers with a vested interest in the site, which may bias our findings.
The final step of this research phase will be to create a usability issue log, prioritise the usability ‘problems’ and document the key findings in a presentation for Council services.
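One common way to prioritise an issue log is to weight each problem’s severity by how often participants hit it. A minimal sketch, assuming a simple severity-times-frequency score (the example issues, the 1–4 severity scale and the scoring scheme are illustrative conventions, not our actual findings or method):

```python
# Hypothetical issue log entries: severity on a 1-4 scale (4 = blocks task
# completion), frequency = proportion of participants who hit the issue.
issues = [
    {"issue": "home page doesn't explain what the site offers", "severity": 3, "frequency": 0.8},
    {"issue": "search results bury the top task", "severity": 4, "frequency": 0.5},
    {"issue": "jargon in section headings", "severity": 2, "frequency": 0.3},
]

def prioritise(issue_log):
    """Rank issues so the most severe, most frequently hit problems come first."""
    return sorted(issue_log, key=lambda i: i["severity"] * i["frequency"], reverse=True)
```

The ranked list then maps straight onto the key-findings presentation: top of the list, top of the deck.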
Then we’ll progress to the design phase, which is where it gets really fun!
Some books I referred to when planning the user testing (and as reference for document templates) included:
Last week I ran a session for the web teams on usability testing. Unfortunately I only had one laptop, and the room was too small to get people working in groups of two or three to try out moderating a usability test first-hand. So instead I gave an overview of user testing, including:
I’m particularly interested in what user testing methods we can use to effectively benchmark user tasks on our website, before and after we make improvements. I’m also interested in what testing methods might be suitable given some of the challenges we face as a local authority web team, for example: