USER EXPERIENCE DESIGN: USABILITY HEURISTICS

Johnsonadeoye
2 min read · Mar 15, 2021

The relevance of usability heuristics for user interface design to the user experience across device platforms.

Firstly, what is heuristic evaluation? Heuristic evaluation is a process in which usability experts measure the usability of a user interface against a set of rules of thumb (heuristics), performing independent walkthroughs and reporting the issues they find.

Below are the 10 usability heuristics, followed by how to carry out a heuristic evaluation.

  1. Visibility of system status: Keep users informed about the system’s status appropriately and promptly.
  2. Match between system and the real world: Show information in ways users understand from how the real world operates, and in the users’ language.
  3. User control and freedom: Offer users control and let them undo errors easily.
  4. Consistency and standards: Be consistent so users aren’t confused over what different words, icons, etc. mean.
  5. Error prevention: A system should either avoid conditions where errors arise or warn users before they take risky actions (for example having dialog messages like “Are you sure you want to do this?”).
  6. Recognition rather than recall: Have visible information, instructions, etc. to let users recognize options, actions, etc. instead of forcing them to rely on memory.
  7. Flexibility and efficiency of use: Be flexible so experienced users find faster ways to attain goals.
  8. Aesthetic and minimalist design: Have no clutter, containing only relevant information for current tasks.
  9. Help users recognize, diagnose, and recover from errors: Provide plain-language help regarding errors and solutions.
  10. Help and documentation: List concise steps in lean, searchable documentation for overcoming problems.
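The ten heuristics above are typically used as a checklist that each evaluator scores a design against. As a minimal sketch (not part of the original article), they can be represented as a reusable score sheet; the `blank_checklist` helper and the example issue below are hypothetical illustrations:

```python
# The ten usability heuristics, as listed above, used as an evaluation checklist.
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def blank_checklist():
    """Return an empty score sheet: one issue list per heuristic."""
    return {name: [] for name in HEURISTICS}

# Hypothetical usage: an evaluator logs an issue under a heuristic.
sheet = blank_checklist()
sheet["Error prevention"].append("No confirmation before bulk delete")
```

Each evaluator fills in their own sheet during the walkthroughs, which keeps every reported issue tied to the heuristic it violates.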

HOW TO CARRY OUT A HEURISTIC EVALUATION

There are eight steps to conducting a heuristic evaluation:

  1. Knowing what to test and how — Whether it’s the entire product or one procedure, clearly define the parameters of what to test and the objective.
  2. Know your users and have clear definitions of the target audience’s goals and contexts.
  3. Select 3–5 evaluators, ensuring their expertise in usability and the relevant industry.
  4. Define the heuristics (around 5–10) — This will depend on the nature of the system/product/design.
  5. Brief evaluators on what to cover in a selection of tasks, suggesting a scale of severity codes (e.g., critical) to flag issues.
  6. 1st Walkthrough — Have evaluators use the product freely so they can identify elements to analyze.
  7. 2nd Walkthrough — Evaluators scrutinize individual elements according to the heuristics. They also examine how these fit into the overall design, clearly recording all issues encountered.
  8. Debrief evaluators in a session so they can collate results for analysis and suggest fixes.
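The process ends with a debrief in which the evaluators’ recorded issues are collated for analysis. As a sketch of that collation step (assuming a 0–4 severity scale, a common convention the article does not spell out, and hypothetical issue records):

```python
from collections import defaultdict

# Assumed 0-4 severity scale; the article only mentions codes like "critical".
SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor", 3: "major", 4: "critical"}

# Hypothetical records from the 2nd walkthrough:
# (evaluator, heuristic violated, issue description, severity code)
reports = [
    ("A", "Error prevention", "No confirm dialog on delete", 4),
    ("B", "Error prevention", "No confirm dialog on delete", 3),
    ("B", "Consistency and standards", "Two different icons for 'save'", 2),
]

def collate(reports):
    """Merge duplicate issues across evaluators, keeping the highest severity."""
    merged = defaultdict(lambda: {"evaluators": set(), "severity": 0})
    for evaluator, heuristic, issue, sev in reports:
        entry = merged[(heuristic, issue)]
        entry["evaluators"].add(evaluator)
        entry["severity"] = max(entry["severity"], sev)
    return dict(merged)

for (heuristic, issue), info in collate(reports).items():
    print(f"[{SEVERITY[info['severity']]}] {heuristic}: {issue} "
          f"(found by {len(info['evaluators'])} evaluator(s))")
```

Issues reported by several evaluators, or given a high severity code, are the natural candidates to fix first.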
