A Comparison of Heuristic Evaluation, Cognitive Walkthroughs and Usability Testing
Alfred Kobsa, University of California, Irvine

Heuristic Evaluation (“Expert Review”)

Normally performed by 3-5 independent HCI experts (NOT by the original UI designers, who do not see their own mistakes easily)
• Experts go through guidelines (heuristics), e.g., QUIS, David Travis’ workbook, or standards such as HHS and ISO 9241
• They rate individually, then form a consensus
• Can be performed on early designs, mockups, and prototypes
Pros:
• Cheaper than usability experiments
• Reasonably effective; rule of thumb: 1 expert finds ca. 40% of the errors, 2 experts 50%, 3 experts 60%, 5 experts 80% (illustrated in the sketch below)
Cons:
• HCI experts find mostly local problems (when domain expertise is also present, more global problems may be found as well)
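
The diminishing returns behind this rule of thumb are often modeled with the problem-discovery formula found(n) = 1 - (1 - p)^n, where p is the probability that a single evaluator detects a given problem (Nielsen and Landauer). The slide's percentages are an empirical rule of thumb, and the per-evaluator rate is not given, so the value of p in the minimal sketch below is an assumption (0.3) chosen only to show the shape of the curve.

```python
# Minimal sketch of the diminishing returns of adding evaluators, using the
# problem-discovery model found(n) = 1 - (1 - p)^n (Nielsen & Landauer).
# The per-evaluator detection rate p is an ASSUMPTION (not given on the
# slide); 0.3 is used only to illustrate the shape of the curve.

def proportion_found(n_evaluators: int, p: float = 0.3) -> float:
    """Expected share of usability problems found by n independent evaluators."""
    return 1.0 - (1.0 - p) ** n_evaluators

if __name__ == "__main__":
    for n in (1, 2, 3, 5):
        print(f"{n} evaluator(s): ~{proportion_found(n):.0%} of problems found")
```

With this assumed rate the curve flattens quickly (about 30%, 51%, 66%, 83% for 1, 2, 3, 5 evaluators), which is why 3-5 experts is the usual recommendation.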

Cognitive Walkthrough

Done by independent experts or the original UI designers (individually or in groups)
• Analyst(s) imagine performing a task and thereby walk through the design documents
• Analysts try to determine whether (types of) end users (“personas”) will be able to figure out where to go and how to do things in the design
• Can be performed on early designs, mockups, and prototypes
Pros:
• Quicker and cheaper than usability experiments
• Reveals global errors more readily than heuristic evaluation
Cons:
• The original designers do not see their own mistakes easily
• Experts with HCI expertise only may need a long time to understand the task at hand

Usability Testing

Performed by the original UI designers (or by a third party)
• with users who are representative of the target (sub-)population
• planned in teams, carried out by (parallel) sub-teams of 1-4 people
Pros:
• Reveals more usability problems than the other methods
• Finds more global problems
• Finds more unique problems (?)
• Only 5-8 users are needed to find the major problems (3-5 for every subgroup); see the sketch below
Cons:
• Time-intensive: several weeks to a few months
• Does not detect local/minor problems very well
☛ Perform heuristic evaluation and/or a cognitive walkthrough first
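
The 5-8 user figure follows from the same discovery model, coverage(n) = 1 - (1 - p)^n. The per-user detection probability is not stated on the slide; the sketch below assumes the commonly cited average of about 0.31 per user and solves the inverse question: the smallest number of test users needed to reach a target coverage of the problems.

```python
# Minimal sketch: smallest number of test users needed to reach a target
# share of usability problems, using coverage(n) = 1 - (1 - p)^n.
# The per-user detection probability p is an ASSUMPTION (commonly cited
# value of roughly 0.31); the slide itself only gives the 5-8 user rule of thumb.
import math

def users_needed(target: float, p: float = 0.31) -> int:
    """Smallest n with 1 - (1 - p)**n >= target (0 < target < 1, 0 < p < 1)."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

if __name__ == "__main__":
    for target in (0.80, 0.85, 0.95):
        print(f"coverage >= {target:.0%}: {users_needed(target)} users")
```

Under this assumption, 5 users reach about 80-85% coverage and roughly 9 are needed for 95%, which is consistent with the slide's 5-8 user rule of thumb for finding the major problems.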