The importance of user testing your copy

The quality of your content can make or break the user experience. Go beyond the user interface, and think about words, too.

I don’t think anyone needs reminding of how important user testing is. Without testing, your product is completely unproven, along with your assumptions about how people will use it. There are so many fast and affordable ways to test that there’s no good excuse for leaving it out.

That said, something is often excluded from user testing. Something quite important. Copy. Content. Words. Call it what you want, it’s amazing just how many designs are tested without it.

I’ve worked with loads of project teams, and I can honestly say it’s rare for designs to be tested with real content. Clients generally expect copy to be prepared after designs have already been signed off. Worse still, some treat the content as a total afterthought to the visuals.

This attitude is gradually changing. Many teams are embracing the idea that ‘content is king’. Sadly, old habits die hard. Many clients continue to expect visuals before content, so project teams throughout the industry carry on working this way.

Any designer knows that creating navigation and UI without real content presents a whole heap of challenges. Sure, it’s possible. But it’s risky business.

The main issue is that no matter how good your interface, the final content will make or break the user experience. Even if your design gets an amazing reception in user testing, it could still fail completely if the words are confusing.

An example of copy-related usability issues

The project

I’ve recently been involved in a project for a large UK supermarket chain. We were creating an employee self-service system for health and wellness.

The idea was that employees suffering from health complaints could come to the system, describe the problem, and then see what kind of support the business could give them.

Rather than showing all possible health issues in one great big list, our solution was to help employees diagnose themselves. The site presented users with a series of questions, asked one at a time. Based on the answers, we displayed the resources, advice and contacts the user needed.

What we tested

The content team worked closely with us during the design process. This was awesome, as it meant we had real content in the first version of our prototype.

Our initial version looked pretty darn slick, and we all patted ourselves on the back for a job well done.

Here’s a screenshot. Note that in the actual prototype these questions are asked one at a time.

Screenshot of the first design we took into user testing. The copy caused confusion for users.

Naturally we wanted to validate the system with real users. We took it into some of the supermarket’s stores, and asked staff to complete some tasks. What the testing revealed was eye-opening to say the least.

Users really struggled. Many weren’t able to answer the questions accurately, and several people ended up with incorrect information.

Why the first version failed

The system didn’t fail due to the UI or even the broad concept. It failed mostly because users didn’t understand the questions they were being asked.

These were usability issues with the content, not the interface.

  • We overestimated the reading level of staff. A lot of the fluffy language and fancy words we often use in our industry leaked into our designs, and this confused a lot of people. For many people working in stores, English wasn’t their first language, either.
  • We used too much jargon. Our copy was written by people with a good understanding of the healthcare industry. Our audience didn’t have that same background. We assumed people knew terms that, in hindsight, most people just don’t know.
  • We didn’t make the answers easily scannable. All of the answers were presented in a ‘yes’ or ‘no’ format. This meant the question had to be carefully read and understood before an answer could be given. Given the complexity of the questions being asked, this was tricky for a lot of the users.

How we redesigned it

As the old mantra goes: ‘fail fast and fix it’. By flagging these issues early, we could resolve them in the design stage. Thank goodness we user tested it!

It wouldn’t be fair to say we went back to the drawing board. The concept made sense; we just needed to adjust the execution. This mostly involved tweaks to the content and how it was presented.

  • We used simpler language. We stuck to plain wording throughout the process. We didn’t want it to be patronising, but there really was no need for the complicated terms we had before.
  • We provided more descriptive answers. Instead of having a ‘yes or no’ format for every question, we provided more descriptive answers. This made it easier to scan, and also meant that the answers themselves helped the user to understand the question.
  • We used illustrations with the answers. This helped people to scan through answers more easily. It also assisted users who weren’t too comfortable with reading.

Screenshot of the redesign. The copy was adjusted to be far more user friendly.

The bottom line: avoid testing designs with lorem ipsum

These issues wouldn’t have been identified (and fixed) if we hadn’t tested with real content.

Although the solutions seem like no-brainers in hindsight, we were really happy with the original version when we went into testing. Without the insight we gained here, we’d have released a problematic system that would have confused a lot of people.

Given the serious nature of the content, it could have also reflected really badly on the business.

Content first, test second.

The moral of this story is to work closely with your project team to help content take shape early in the process.

Make sure that when you do come to test, you’re using real words rather than just lorem ipsum. You never know, it might just save your product!
