Utah School of Computing Lecture Set 16: HCI Validation

  • Slides: 22

Utah School of Computing
Lecture Set 16: HCI Validation
Richard F. Riesenfeld, University of Utah, Fall 2009


Major Considerations - 1
• Stage of design
  - Conceptual, preliminary, or detail
• Novelty of project
  - Do we know what we are doing?
• Number of expected users
  - How important is this?
  - How amenable to change will it be?


Major Considerations - 2
• Criticality of the interface
  - Are lives at stake if there are problems?
• Cost of product
  - Allocation for testing
• Time available for testing
• Experience of designers and evaluators


Expert Review Methods - 1
• Heuristic evaluation
  - Experts critique it with respect to established criteria
• Guidelines review
  - Does it meet "spec"?
  - Can be an overwhelming list
  - Bureaucratic approach
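To make the heuristic-evaluation idea concrete, here is a toy sketch of tallying findings from several expert reports. The heuristic names echo Nielsen's well-known list, but the findings and the 0-4 severity scale are illustrative assumptions, not data from the lecture.

```python
from collections import Counter

# Hypothetical expert findings: (heuristic violated, severity 0-4).
findings = [
    ("visibility of system status", 3),
    ("error prevention", 4),
    ("consistency and standards", 2),
    ("visibility of system status", 2),
]

# How often each heuristic was flagged across all expert reports.
per_heuristic = Counter(name for name, _severity in findings)

# The single most severe finding often drives what gets fixed first.
worst_name, worst_severity = max(findings, key=lambda f: f[1])

print(per_heuristic["visibility of system status"])  # 2
print(worst_name)  # error prevention
```

Aggregating like this is one way to turn several experts' free-form critiques into a prioritized fix list.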


Expert Review Methods - 2
• Consistency inspection
  - Experts check style, function, form, etc.
• Cognitive walkthrough
  - Experts perform the role of users
  - Try to assess its success from experience


Expert Review Methods - 3
• Formal usability inspection
  - Moot court
  - Countervailing opinions
  - Can be unwieldy


Comparative Evaluations - 1
• Different experts see different issues
  - Can get caught with conflicting advice*
  - Limit the number of experts
• Get a "bird's-eye" view in the beginning
  - Throw images on a wall, etc.

* "For every Ph.D., there is an equal and opposite Ph.D."


Comparative Evaluations - 2
• Formal (statistical) methods
  - Form a hypothesis
  - Determine dependent variables
  - Identify independent variables
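The hypothesis / dependent-variable / independent-variable recipe above can be sketched as a simple record of an experiment design. Everything here (the hypothesis, variable names, and participant count) is an invented example, not from the lecture.

```python
# A minimal sketch of recording a controlled-experiment design.
experiment = {
    # What we predict and will try to falsify.
    "hypothesis": "Toolbar icons with text labels reduce task time "
                  "versus icons alone",
    # Independent variables: what the experimenter manipulates.
    "independent_variables": {
        "labeling": ["icons_only", "icons_with_text"],
    },
    # Dependent variables: what is measured as the outcome.
    "dependent_variables": [
        "task_completion_time_sec",
        "error_count",
    ],
    "participants_per_condition": 12,
}

# A between-subjects design assigns a separate participant group
# to each level of the independent variable.
conditions = experiment["independent_variables"]["labeling"]
print(len(conditions))  # 2
```

Writing the design down this explicitly forces the hypothesis and the variables to be decided before any data is collected.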


Usability Labs, etc.
• Hard to employ because of time and money constraints in product development
  - Development cycle schedule
  - Budgets
  - Corporate/cultural attitude


Controlled Experiments
• Statistical testing
  - Establish a level of statistical significance
  - "At the 95% confidence level we know…"
• Usability testing
  - Find flaws in the interface through more informal (inconclusive) methods
  - Empirical methods
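A typical statistical test in this setting compares, say, task-completion times under two interface variants. The sketch below computes Welch's t statistic with only the standard library; the timing data is hypothetical, and in practice one would use a library routine such as `scipy.stats.ttest_ind` to get an exact p-value.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples,
    e.g. task-completion times under interfaces A and B."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    # statistics.variance is the sample (n-1) variance.
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    std_err = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / std_err

# Hypothetical completion times in seconds for two interface variants.
times_a = [41.2, 39.8, 44.1, 40.3, 42.7, 38.9]
times_b = [35.4, 36.1, 33.9, 37.2, 34.8, 36.6]

t = welch_t(times_a, times_b)
# |t| well above ~2 suggests the difference is unlikely to be chance
# at roughly the 95% confidence level (the exact threshold depends
# on the degrees of freedom).
print(round(t, 2))
```

The point of the slide stands either way: a controlled experiment gives a quantified confidence statement, where usability testing gives informal, empirical insight.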


Human Subjects - 1
• Careful: "This isn't Kansas anymore!"
• Many new dimensions need attention
• Human respect and dignity
  - Voice-generated checkouts violated privacy
    § The military has NO privacy
    § Other cultures treat these matters differently


Human Subjects - 2
• Real LEGAL issues, so get it right!
  - Informed consent
  - Understand your liability
  - Get it in writing, with a copy to each party
• Government or institutional rules
  - We are not accustomed to this
  - Need cognizant approvals
    § IRBs, etc.
    § Research proposals, etc.


Observation Methods
• Have subjects "think aloud"
  - Will subjects be honest, etc.?
• Use video recording
• Field tests
  - Study the successes/failures of the interface
  - Getting access
  - Reliance on memories
    § "How is it going?" (We tend to react to the most recent experience)


Destructive Testing
• "Hey, can you break this?"
• Good for security
• Good for games
• Durability testing appropriate for some environments
  - ATM in a high-crime area
  - Military
  - Students; they can't resist a challenge


Competitive Testing - 1
• Consumers Union, Road & Track style
  - Take several into the lab and have a "shoot-out"
• Expensive
• Takes skill (like a movie review)
  - Depends on the criteria
  - Depends on good and representative judgment


Competitive Testing - 2
• Major limitations
  - Limited coverage of features
  - Depends on initial user experiences


Surveys
• Tricky business; can lead to nearly any conclusion
  - Population selection
  - Question choices
  - Size
  - Leading questions, other bias
• Negative bias - users with complaints
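The negative-bias point can be demonstrated with a small simulation: if dissatisfied users respond more readily than satisfied ones, the survey mean drops below the population mean. All numbers here (the rating distribution and the response rates) are invented for illustration.

```python
import random

random.seed(0)

# Hypothetical population: satisfaction ratings on a 1-5 scale.
population = [random.choice([3, 4, 4, 5, 5, 2, 1]) for _ in range(10000)]
true_mean = sum(population) / len(population)

# Negative bias: assume dissatisfied users (rating <= 2) respond 60%
# of the time, while satisfied users respond only 10% of the time.
responses = [r for r in population
             if random.random() < (0.6 if r <= 2 else 0.1)]
survey_mean = sum(responses) / len(responses)

# The survey mean comes out lower than the population mean because
# complainers are over-represented among respondents.
print(round(true_mean, 2), round(survey_mean, 2))
```

The same mechanism works in the other direction too: whoever self-selects into responding determines which conclusion the survey "supports."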


Online Surveys
• More issues…


Conclusions
• HCI is a new game
• Not an exact science
• Old methods not entirely applicable
• Need newer, faster, lightweight, flexible, informal, subjective, intelligent approaches


Recommendations
• Use good judgment
• Trust good judgment
  - Yours
  - Others', whom you trust
• Be open to criticism and suggestion


Interpretation
• What is being said?
• What is the real issue?
• What is the real fix?
• RSI is a problem
  - Pain
  - Was the cause the keyboard or the mouse?
  - Need different devices, or speech, or simply a better mouse and keyboard?

Utah School of Computing
END Lecture Set 16: HCI Validation
