Sunday, July 20, 2008

Stop writing reports

I’ve had a few questions from readers lately about standardizing reports of usability test results. Why is there no report template in the Handbook? There’s no “template” for a final report because I think you probably shouldn’t be writing reports. Or at least written reports should be minimal. Mini.mal. Though the outline should basically be what’s in the Handbook, what you put in your report depends on:
  • The test design and plan
  • What your team needs and can use
And let’s use “report” in the loosest possible way: delivering information to others. That’s it. Your report doesn’t have to be a long, prose-based, descriptive tome. (Not that there’s anything wrong with that.) And the delivery method doesn’t have to be paper.
That leaves a lot of options, from an email with a bulleted list of items, to a “top line” post on a blog or wiki that lightly covers the main trends and patterns. In the middle of the range might be a classic usability test report that describes results and findings in some detail. (I personally dislike slide decks as reports, but a lot of organizations do them.) These will all work for any type of test. For summative tests, you may want to go as far as the CIF, or Common Industry Format, established by the International Organization for Standardization (ISO).
BUT if your team has observed the sessions and attended the debriefs, you probably don’t need much of a report. They won’t read it; everything has been discussed and decided already. Whatever you deliver is simply a record of that set of decisions and agreements.

2 comments:

  1. If you're not writing reports, aren't you afraid of throwing the baby out with the bathwater?

    Collecting data helps to benchmark and track improvements over time. Tullis and Albert have done a great job in their book, Measuring the User Experience, of describing some of the measurements one can take during a typical usability test. What's your opinion about this?

    For example, I like to use the SUS to track differences between participants and to track improvements between different versions of a site.
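
    For anyone tallying SUS results by hand, here is a minimal sketch of the standard scoring arithmetic (the data and function name below are made up for illustration): odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response, and the sum is multiplied by 2.5 to give a 0–100 score per participant.

    ```python
    # Standard SUS scoring: each of the 10 items is answered on a 1-5 scale.
    # Odd items contribute (response - 1), even items contribute (5 - response);
    # the sum is multiplied by 2.5, giving a score from 0 to 100.

    def sus_score(responses):
        """Score one participant's 10 SUS responses (integers 1-5)."""
        if len(responses) != 10:
            raise ValueError("SUS has exactly 10 items")
        total = sum((r - 1) if i % 2 == 1 else (5 - r)
                    for i, r in enumerate(responses, start=1))
        return total * 2.5

    # Hypothetical data: compare mean scores across two versions of a site.
    version_1 = [sus_score(r) for r in ([4, 2, 4, 1, 5, 2, 4, 2, 5, 1],
                                        [3, 3, 4, 2, 4, 2, 3, 3, 4, 2])]
    version_2 = [sus_score(r) for r in ([5, 1, 5, 1, 5, 1, 4, 2, 5, 1],)]
    print(sum(version_1) / len(version_1), sum(version_2) / len(version_2))
    ```

    On its own this only gives per-participant scores; tracking improvement between versions comes down to comparing those distributions, along the lines Tullis and Albert describe.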

  2. You only need big reports if your team is not engaged in the test. If they're on board, working with you to develop the test design and observing the sessions, then you can have an ongoing discussion with them and do less reporting. In that discussion you're interpreting observations to make inferences about what to change in the design.

    Keep in mind that I'm talking mainly about exploratory, formative tests here, where you're not measuring a lot. In a summative test, where you're comparing data or measuring against benchmarks, of course you must tally, analyze, and interpret that data. That usually means writing some kind of report.

    Not all usability testing needs that.

    The SUS is a whole different discussion. :D
