CSM 18 Usability Engineering Evaluation: test the usability and functionality of an interactive system

Goals of an evaluation:
• assess the extent of the system's functionality
• assess its usability - see the 10 heuristics
• assess the effect and affect of the interface on the user
• identify any specific problems with the system or with its use

Uni. S Department of Computing - Dr Terry Hinton - 12/6/2020
Evaluation Methods for Interactive Systems
• Analytical Methods
• Experimental Methods
• Observational Methods
• Query Methods
Evaluation Methods for Interactive Systems

Analytical Methods
Predict performance based on a model, e.g. analysis of a cash dispenser based on the number of keystrokes required, the time needed to press a key, the time needed to think, and the time needed to react.

Experimental Methods
Design experiments in the laboratory, e.g. speed of recognition of key words depending on font & colour.
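The cash-dispenser analysis above can be sketched as a keystroke-level prediction. This is a minimal illustration, not the slide's actual model: the operator times and the task breakdown below are assumed values for demonstration only.

```python
# Sketch of a keystroke-level prediction for a cash-dispenser task.
# The operator times below are illustrative assumptions, not measured values.

KEYSTROKE = 0.28   # assumed time to press one key (seconds)
THINK     = 1.35   # assumed mental-preparation time before a step
REACT     = 0.40   # assumed time to react to a screen prompt

def predict_task_time(steps):
    """Each step is (keystrokes, needs_thought, needs_reaction)."""
    total = 0.0
    for keys, thinks, reacts in steps:
        total += keys * KEYSTROKE
        total += THINK if thinks else 0.0
        total += REACT if reacts else 0.0
    return total

# Hypothetical "withdraw cash" task: insert card, type a 4-digit PIN,
# select an amount, confirm.
task = [
    (1, True, True),   # insert card, wait for PIN prompt
    (4, True, False),  # type PIN
    (1, True, True),   # select amount
    (1, False, True),  # confirm
]
print(f"Predicted task time: {predict_task_time(task):.2f} s")
```

Summing per-operator times this way gives a performance prediction before any user ever touches the system, which is the point of an analytical method.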
Evaluation Methods for Interactive Systems

Observational Methods - in the field
Users:
• expert users
• typical users
• novice users

Query Methods - survey opinions and attitudes (was it easy? enjoyable?); also consider contextual issues
• Interviews
• Questionnaires
Usability defined:
usability = efficiency + effectiveness + enjoyment
We can't compute a single usability parameter. J Nielsen proposed 10 usability heuristics - see later.
heuristics - a set of rules for solving problems other than by an algorithm (Collins English Dictionary, 2nd Ed.)
Experimental methods - in the laboratory
• design an experiment for laboratory conditions
• make a testable hypothesis
• select your subjects - sample size
• select the variables - change one at a time
• statistical measures - time, speed, number of events, number of errors (recoverable, fatal)
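The steps above can be sketched for the font-recognition experiment mentioned earlier: one variable (font) is changed, recognition time is the statistical measure, and a two-sample test statistic compares the conditions. The data below are invented for illustration; only the Python standard library is used.

```python
# Sketch of analysing a simple one-variable lab experiment: recognition
# times (seconds) for the same key words shown in two fonts.
# The sample data are invented for illustration.
from statistics import mean, stdev

serif      = [0.61, 0.58, 0.72, 0.66, 0.64, 0.70]
sans_serif = [0.52, 0.55, 0.49, 0.57, 0.54, 0.51]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

t = welch_t(serif, sans_serif)
print(f"mean serif={mean(serif):.3f}s  mean sans={mean(sans_serif):.3f}s  t={t:.2f}")
```

A large positive t here would suggest the sans-serif font is recognised faster; in a real study the statistic would be checked against a significance threshold for the chosen sample size.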
Observational techniques - in the field
Observe behaviour - arbitrary activity, or set the tasks.

Task analysis
• Specify a set of tasks - gives insight into usability
• Specify a goal - gives insight into the cognitive strategy used
Record - actions, time, errors etc.
Observational techniques - in the field

Verbal Protocol - Think aloud
Protocol analysis:
• paper and pencil
• audio recording
• video recording
• computer logging
• user notebooks
Automatic protocol analysis tools
Post-event protocol - teach-back or post-task "walkthroughs"
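The "computer logging" option above can be sketched as a small session logger that records the actions, time, and errors the earlier slide says to capture. The event names and error convention are hypothetical, chosen only for this illustration.

```python
# Minimal sketch of computer logging during a set task: timestamp each
# user action, then summarise action count, errors and elapsed time.
# Event names (and the "error:" prefix convention) are hypothetical.
import time

class SessionLog:
    def __init__(self):
        self.events = []                      # (elapsed_seconds, action)
        self.start = time.monotonic()

    def record(self, action):
        self.events.append((time.monotonic() - self.start, action))

    def summary(self):
        errors = sum(1 for _, a in self.events if a.startswith("error"))
        duration = self.events[-1][0] if self.events else 0.0
        return {"actions": len(self.events), "errors": errors,
                "duration_s": round(duration, 2)}

log = SessionLog()
for action in ["open_menu", "click_save", "error:wrong_dialog",
               "click_cancel", "click_save_as"]:
    log.record(action)
print(log.summary())
```

A log like this is the raw input that automatic protocol analysis tools would then mine for patterns.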
Query techniques - Attitudinal Data

Interviews (see page 35 et seq.)
• design an interview schedule

Questionnaires
• general
• open-ended
• scalar
• multi-choice
Planning an evaluation
Factors to be considered in planning an evaluation:
• purpose - who are the stakeholders
• laboratory vs field studies
• qualitative vs quantitative measures
• information provided
• immediacy of response
• intrusiveness
• resources
Ten usability heuristics by J Nielsen

Visibility of system status
The system should keep users informed about what is going on.

Match between system and the real world
The system should speak the users' language - words, phrases and concepts familiar to the user (rather than system-oriented terms).

User control and freedom
Users often choose system functions by mistake - support undo/redo.

Consistency and standards
Follow platform conventions (users shouldn't have to wonder whether different words, situations, or actions mean the same thing).
Ten usability heuristics

Error prevention
Prevention is better than error messages.

Recognition rather than recall
Make objects, actions, and options visible (users shouldn't have to remember information).

Flexibility and efficiency of use
Accelerators (unseen by novice users) may speed up interaction for expert users; the system allows users to tailor frequent actions.

Aesthetic and minimalist design
Simplicity is beauty.
Ten usability heuristics

Help users recognise, diagnose, and recover from errors
Express error messages in plain language (no codes), indicate the problem, and suggest a solution.

Help and documentation
Ideally, the system can be used without documentation, but most often it is necessary to provide help and documentation. Such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too long. Examples are always helpful.
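Putting the ten heuristics to work in a heuristic evaluation typically means collecting problems found against each heuristic with a severity rating. The sketch below shows one way to tally such findings; the problems and the 0-4 severity scale are invented for illustration.

```python
# Sketch of tallying a heuristic evaluation: problems found are tagged
# with the heuristic they violate and a 0-4 severity score.
# The example findings and scores are invented for illustration.
from collections import defaultdict

findings = [
    # (heuristic, problem, severity 0-4)
    ("Visibility of system status", "no progress bar on upload", 3),
    ("Error prevention", "delete has no confirmation", 4),
    ("Consistency and standards", "'OK' and 'Done' used interchangeably", 2),
    ("Error prevention", "date field accepts free text", 3),
]

by_heuristic = defaultdict(list)
for heuristic, problem, severity in findings:
    by_heuristic[heuristic].append((severity, problem))

# Report each heuristic with its problem count and worst severity.
for heuristic, problems in sorted(by_heuristic.items()):
    worst = max(s for s, _ in problems)
    print(f"{heuristic}: {len(problems)} problem(s), worst severity {worst}")
```

Grouping findings this way shows at a glance which heuristics the design violates most, which helps prioritise fixes.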
Questionnaire Design (see page 35 et seq.)

A simple checklist (tick yes / no / don't know for each item):
• copy
• paste

Example of a 6-point rating scale (avoid a middle value), anchored from "Very useful" to "Of no use".
Questionnaire Design

An example of a Likert scale:
strongly agree / slightly agree / neutral / slightly disagree / strongly disagree

An example of a semantic differential scale, anchored between "difficult" and "easy":
extremely / slightly / neutral / slightly / extremely
Questionnaire Design

An example of a ranked-order question:
Place the following commands in order of usefulness using the numbers 1 to 4, 1 being the most useful:
• copy
• paste
• group
• clear
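Scalar responses such as the Likert scale above are usually scored by mapping labels to numbers and averaging per question. The questions and responses below are invented for illustration; the 1-5 mapping is one common convention, not the only one.

```python
# Sketch of scoring Likert-scale responses: map each label to a number,
# then average per question. Questions and responses are invented.
LIKERT = {"strongly agree": 5, "slightly agree": 4, "neutral": 3,
          "slightly disagree": 2, "strongly disagree": 1}

responses = {
    "The menus were easy to navigate":
        ["strongly agree", "slightly agree", "neutral", "strongly agree"],
    "Error messages were helpful":
        ["slightly disagree", "neutral", "strongly disagree", "neutral"],
}

def mean_score(labels):
    """Average numeric score for one question's responses."""
    return sum(LIKERT[label] for label in labels) / len(labels)

for question, labels in responses.items():
    print(f"{question}: {mean_score(labels):.2f}")
```

Note that averaging treats the scale as interval data, which is a common but debatable simplification; medians or response distributions are safer for small samples.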
Evaluation in the Design Phase

Participatory Design - the user is involved in the whole design life cycle.
A number of methods help convey information between user and designer:
• brainstorming
• storyboarding
• workshops
• pencil & paper exercises
• role playing
Evaluating the design
• Cognitive walkthrough
• Heuristic evaluation
• Review-based evaluation
• Model-based evaluation
Choosing an evaluation method
Ref: Dix, A., Finlay, J., Abowd, G., Beale, R. (1994)