There are 614 guidelines.
That list was not a way to check designs to see whether the team had gone in the right direction.
Are you designing or inspecting?
Guidelines and heuristics are not interchangeable, but many UXers treat them that way. It’s common to hear someone say they’re doing a heuristic evaluation against X guidelines. But it doesn’t quite work like that.
Designing is an act of creation, whether you’re doing research, drawing on graph paper, or coding CSS. Inspecting is an act of checking, of examining, often with some measure in mind.
Guidelines are statements of direction. They’re about looking to the future and what you want to incorporate in the design. Guidelines are aspirational, like these:
- Add, update, and remove content frequently.
- Provide persistent navigation controls.
- Index all intranet pages.
- Provide org charts that can be viewed onscreen as well as printed.*
Heuristics challenge a design with questions. The purpose of heuristics is to provide a way to “test” a design in the absence of data by making an inspection. Heuristics are about enforcement, like these:
- Visibility of system status: The system should always keep users informed about what is going on…
- Match between system and the real world: The system should speak the users' language…
- User control and freedom: The system should provide a clearly marked "emergency exit" to leave the unwanted state… **
Creating or diagnosing?
Heuristics are often cast as pass/fail tests. Does the UI comply or not? While you could use the usability.gov guidelines to evaluate web site designs, they were developed as tools for designing. They present things to think about as teams make decisions.
Both guidelines and heuristics are typically broad and interpretable. They’re built to apply to nearly any interface. But they come into play at different points in a design project. Guidelines are things to think about in reaching a design; they are considerations and can interact with one another in interesting ways. Heuristics are usually diagnostic and generally don’t interact.
Don’t design by guidelines alone
For example, on the intranet project, we looked at guidelines about the home page. One directive says to put the most important new information on the home page, and the next one says to include key features and company news on the home page. A third says to include tools with information that changes every day. But earlier in the list of guidelines, we see a directive to be “judicious about having a designated ‘quick links’ area.” Guidelines may feel complementary to one another or some may seem to cancel others out. Taken together, there’s a set of complex decisions to make just about the home page.
And it was too late on our intranet to pay attention to every guideline. The decisions had been made, based on stakeholder input, business requirements, and technology constraints, as well as user requirements. Though we were thoughtful and thorough in designing, anyone scoring our site against the guidelines might not give us good marks.
Don’t evaluate by heuristics alone
Likewise, when looking at heuristics such as “be consistent,” there’s a case for conducting usability tests with real users. For example, on the intranet I was working on, one group in the client company was adamant about having a limited set of page templates, with different sections of the site meeting strict requirements for color, look, and feel. But in usability testing, participants couldn’t tell where they were in the site when they moved from section to section.
Guidance versus enforcement
What are you looking for at this point in your design project? In the intranet project, we were much closer to an evaluative mode than a creation mode (though we did continue to iterate). We needed something to help us measure how far we had come. Going back to the guidelines was not the checkpoint we were looking for.
We sallied forth. The client design team decided instead to create “heuristics” from items from the user and business requirements lists generated at the beginning of the project, making a great circle and a thoughtful cycle of research, design, and evaluation.
I don’t know whether the intranet we designed meets all of the guidelines. But users tell us and show us every day that it is easier, faster, and better than the old intranet. For now, that’s enough of a heuristic.
* From "Intranet Usability: Design Guidelines from Studies with Intranet Users" by Kara Pernice Coyne, Amy Schade, and Jakob Nielsen
** From Jakob Nielsen's 10 heuristics, see http://www.useit.com/papers/heuristic/heuristic_list.html
Related:
Where do heuristics come from?
What are you asking for when you ask for heuristic evaluation?
:: :: :: :: :: :: :: :: :: ::
Note: I'm moving!
After 20 years in the San Francisco Bay Area, I'm bugging out. As of September 1, I will be operating out of my new office and home in Andover, Massachusetts. I'm excited about this move. It's big!
You can still find me at www.usabilityworks.net, email me at dana@usabilityworks.net, on Twitter as danachis, and on the phone at 415.519.1148.
Great stuff, Dana. It made me think of some related reactions I've encountered. Here are a few (maybe you have some suggestions on how to address them):
* "are heuristics and guidelines different because heuristics are more like standards (ie: yes or no matches?)"
* "ok, we get that there is a difference in purpose between guidelines and heuristics, but how do we know we have good heuristics?"
* "My org doesn't like/I'm allergic to Jakob Nielsen. How do I create my own heuristics? Where do I start?"
* "Shouldn't our heuristics represent our guidelines and standards? otherwise, what's the point of validating with abstract heuristics that are not all relevant to our design?"
* "What's a better investment, creating guidelines to design from or doing a heuristic evaluation of what we design? How much do they "cost"? How do you pick given limited resources?"