Usability and Human Factors. Unit 4: Human Factors and Healthcare
Usability and Human Factors Unit 4: Human Factors and Healthcare Lecture b
This material (Comp 15 Unit 4) was developed by Columbia University, funded by the Department of Health and Human Services, Office of the National Coordinator for Health Information Technology under Award Number 1U24OC000003. This material was updated by The University of Texas Health Science Center at Houston under Award Number 90WT0006. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
Human Factors and Healthcare Lecture b – Learning Objectives
• Objective 1: Distinguish between human factors and human-computer interaction (HCI) as they apply to usability (Lecture a)
• Objective 2: Explain how cognitive, physical, and organizational ergonomics can be applied to human factors engineering (Lecture a)
• Objective 3: Describe how the concepts of mental workload, selective attention, and information overload affect usability (Lecture a)
• Objective 4: Describe the different dimensions of the concept of human error (Lecture b)
Human Factors and Healthcare Lecture b – Learning Objectives (Cont’d – 1)
• Objective 5: Describe a systems-centered approach to error and patient safety (Lecture b)
• Objective 6: Apply methods for measuring mental workload and information overload (Lecture c)
• Objective 7: Describe how human factors analysis can be applied to the study of medical devices (Lecture c)
Patient Safety
• A healthcare discipline that emphasizes the reporting, analysis, and prevention of medical error
• Landmark report: the Institute of Medicine’s To Err Is Human (1999)
• The full magnitude of the problem is not known
De, A. CC BY-NC-SA 4.0
Harvard Medical Practice Study
• Reviewed the medical charts of more than 30,000 patients admitted to 51 acute care hospitals in New York State in 1984:
– In 3.7% of cases, an adverse event led to prolonged admission or disability
– Adverse event: any unfavorable change in health or side effect that occurs in a patient who is receiving the treatment
– 69% of injuries were caused by errors
– 27.6% of adverse events were due to negligence
– 13.6% of adverse events led to patient deaths
• Substantial injury to patients from medical management
• Many injuries result from substandard care
Why Do Errors Happen?
• Error: the failure of a planned sequence of mental or physical activities to achieve its intended outcome, when these failures cannot be attributed to chance
• Inclination to blame somebody
– Who is responsible? Often the person closest to the failure is the one who gets blamed
• Can we isolate a single cause?
• “When human error is viewed as a cause rather than a consequence, it serves as a cloak for our ignorance” (Henriksen et al., 2008)
• Systems-centered approach:
– Latent conditions and active failures (Reason, 1997)
Active Failures
• Occur at the level of the frontline operator
– Effects are felt immediately
• In health care, active errors are committed by providers (e.g., nurses, physicians, pharmacists) who are actively responding to patient needs at the “sharp end”
Latent Conditions (Reason, 1990)
Latent conditions are enduring systemic problems that lie dormant for some time, then combine with other system problems to weaken the system’s defenses and make errors possible:
• Poor interface design
• Communication breakdowns
• Gaps in supervision
• Incorrect equipment installation
• Hidden software bugs
Latent Conditions (Cont’d – 1) (Reason, 1990)
• Fast-paced production schedules
• Unworkable procedures
• Extended work hours
• Staffing problems
• Inadequate training
• Aloof management
• Absence of a safety culture
Hindsight Bias
• Retrospective, after-the-fact analysis of human error is prone to bias
– Collected data are evaluated against the known negative outcome
– It is difficult to recreate the situational context, stress, shifting attentional demands, and competing goals
• Hindsight bias masks the dilemmas, uncertainties, and demands facing those involved
– The result is a distorted view of the factors contributing to the incident or accident
Space Shuttle Challenger Disaster
• The space shuttle broke apart shortly after liftoff in 1986, killing all seven crew members
• Cause: an O-ring seal in a solid rocket booster failed at liftoff
• Multiple faults, including unanticipated cold weather, brittle O-ring seals, communication problems between NASA and contractors, etc.
• Latent errors went unrecognized
NASA, 1986
Nuclear Disasters
(Cooling towers of Three Mile Island Unit 2 as of February 2014, 2014) (Markosian, 2016)
Deepwater Horizon Explosion
“The team did not identify any single action or inaction that caused this accident. Rather, a complex and interlinked series of mechanical failures, human judgments, engineering design, operational implementation and team interfaces came together to allow the initiation and escalation of the accident. Multiple companies, work teams and circumstances were involved over time.” – BP
(United States Navy, 2010)
Reason’s “Swiss Cheese” Model of Error
4.1 Figure: Adapted from Reason, J. (2000)
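The model’s core intuition can be illustrated numerically: an accident reaches the patient only when the “holes” in every defensive layer line up. The sketch below is a hypothetical simplification (it assumes the layers fail independently, which Reason’s model does not require — in real systems, latent conditions often make failures correlated), but it shows why adding even imperfect layers of defense sharply reduces the chance that an error penetrates all of them.

```python
# Illustrative sketch (not part of Reason's formulation): if each defensive
# layer independently lets an error through with some probability, the chance
# that an error penetrates every layer is the product of those probabilities.

def penetration_probability(layer_failure_probs):
    """Probability that an error passes through all layers, assuming the
    layers fail independently (a deliberate simplification)."""
    p = 1.0
    for q in layer_failure_probs:
        p *= q  # error must slip through this layer AND all previous ones
    return p

# One layer that misses 10% of errors lets 1 in 10 through;
# four such layers together let through only about 1 in 10,000.
print(penetration_probability([0.1]))
print(penetration_probability([0.1, 0.1, 0.1, 0.1]))
```

The caveat about independence matters: a shared latent condition (say, understaffing that degrades every check at once) raises all the failure probabilities together, which is exactly why the systems view looks beyond the frontline operator.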
Human Errors
• Slips: incorrect execution of a correct action sequence
– Errors when routine behavior is misdirected or omitted
• Mistakes: correct execution of an incorrect action sequence
– Errors in judgment, perception, inference, or interpretation
Mistakes
• Knowledge-based
– Faulty conceptual knowledge
– Incomplete knowledge
– Biases and faulty heuristics
– Incorrect selection of knowledge
– Information overload
• Rule-based
– Misapplication of good rules
– Encoding deficiencies in rules
– Action deficiencies in rules
– Dissociation between knowledge and rules
Example: Error One
• Description of the environment/case study information:
– Mr. B is a 45-year-old male being treated for dehydration secondary to nausea, vomiting, and diarrhea
– Mr. B has been in the Intensive Care Unit (ICU) for 4 days receiving intravenous fluids via an IV catheter in his right forearm
– As Mr. B stabilizes, the physician orders to start P.O. fluids (fluids by mouth) and discontinue the IV fluids
• Note: the order is to discontinue the IV fluids, not the IV
• Typically, the RN will stop the IV fluid and convert the IV to a saline lock that may be used for intermittent infusions as necessary
Example: Error One (Cont’d – 1) • Identification of the error: – The RN removed the entire IV catheter when it should have been converted to a saline lock • Classification of the error: – Slip: automatic use of a well-learned routine that overrides the current intended activity – RN intended to convert the IV to a saline lock; however, she discontinued the entire access
Example: Error Two
• Mr. Jones is assigned to a team of nurses for the day shift
• One nurse is responsible for giving medication to patients on the team
• The other nurse is responsible for all assessments and treatments
• Mr. Jones complains of pain to the treatment nurse
• Rather than delay the pain medication by waiting for the medication nurse, the treatment nurse obtains the narcotic and administers it to Mr. Jones
• The treatment nurse forgets to document on the medication record that she gave Mr. Jones some Demerol for pain
Example: Error Two (Cont’d – 1)
• When making rounds, the medication nurse asks Mr. Jones if he is in pain
• Mr. Jones again replies yes
• The medication nurse reviews the medication record – no documentation of pain medication given
• She medicates Mr. Jones with Demerol (again)
• Within 1 hour, Mr. Jones is lethargic and has respiratory depression
• He has to be transferred to the ICU for closer monitoring due to the Demerol overdose
Example: Error Two (Cont’d – 2)
• Classification of the error:
– Repetition-of-action slip: repetition of a correctly performed action
– Each RN medicated the patient according to the physician’s orders; however, due to the error of “no documentation,” the patient received a repeated dose of Demerol
Interdependence of the Health Care System
• “Healthcare is composed of a large set of interacting systems – paramedic, emergency, ambulatory, inpatient care, and home health care; testing and imaging laboratories; pharmacies – that are connected in loosely coupled but intricate networks of individuals, teams, procedures, regulations, communications, equipment and devices that function with diffused management in a variable and uncertain environment” (Kohn et al., 2000, p. 158)
Systems Approach to Adverse Events in Health Care
4.2 Figure: Henriksen, 2008
Systems Approach to Adverse Events (Cont’d – 1)
4.3 Figure: Henriksen, 2008
Time Course of Medical Error
4.4 Figure: Adapted from Zhang et al., 2004
Unit 4: Human Factors and Healthcare Summary – Lecture b
• Patient safety and human error
• Reason’s model of error
– Slips and mistakes
– Knowledge-based vs. rule-based mistakes
• Systems approach to medical error
• Next lecture: workload, medical devices, and mental models
Unit 4: Human Factors and Healthcare References – Lecture b
Carayon, P. (Ed.). (2007). Handbook of Human Factors and Ergonomics in Health Care and Patient Safety. Mahwah, NJ: Lawrence Erlbaum Associates.
Deepwater Horizon Accident Investigation Report, September 8, 2010, Executive Summary. Retrieved 4/8/2016 from http://www.bp.com/content/dam/bp/pdf/sustainability/issuereports/Deepwater_Horizon_Accident_Investigation_Report_Executive_summary.pdf
Henriksen, K., Dayton, E., Keyes, M. A., Carayon, P., & Hughes, R. (2008). Understanding adverse events: A human factors framework. In R. G. Hughes (Ed.), Patient Safety and Quality: An Evidence-Based Handbook for Nurses (pp. 84-101). Rockville, MD: Agency for Healthcare Research and Quality.
Horsky, J., Kaufman, D. R., Oppenheim, M. I., & Patel, V. L. (2003). A framework for analyzing the cognitive complexity of computer-assisted clinical ordering. Journal of Biomedical Informatics, 36, 4-22.
Kaufman, D. R., Pevzner, J., Rodriguez, M., Cimino, J. J., Ebner, S., Fields, L., et al. (2009). Understanding workflow in telehealth video visits: Observations from the IDEATel project. Journal of Biomedical Informatics, 42(4), 581-592.
Kaufman, D. R., & Starren, J. B. (2006). A methodological framework for evaluating mobile health devices. In Proceedings of the American Medical Informatics Annual Fall Symposium (p. 978). Philadelphia: Hanley & Belfus.
Unit 4: Human Factors and Healthcare References – Lecture b (Cont’d – 1)
Kaufman, D. R., Patel, V. L., Hilliman, C., Morin, P. C., Pevzner, J., Weinstock, Goland, R., Shea, S., & Starren, J. (2003). Usability in the real world: Assessing medical information technologies in patients’ homes. Journal of Biomedical Informatics, 36, 45-60.
Makary, M. A., & Daniel, M. (2016). Medical error—the third leading cause of death in the US. BMJ, 353, i2139.
Reason, J. T. (1997). Managing the Risks of Organizational Accidents. Aldershot, UK: Ashgate Publishing.
Reason, J. T. (1990). Human Error. Cambridge: Cambridge University Press.
Kohn, L. T., Corrigan, J., & Donaldson, M. (2000). To Err Is Human. Washington, DC: Institute of Medicine, National Academy Press.
Unit 4: Human Factors and Healthcare References – Lecture b (Cont’d – 2)
Charts, Tables, Figures
4.1 Figure: Reason, J. (2000). Human error: models and management. BMJ, 320(7237), 768-770.
4.2 Figure: Henriksen, K., Dayton, E., Keyes, M. A., Carayon, P., & Hughes, R. (2008). Understanding adverse events: A human factors framework. In R. G. Hughes (Ed.), Patient Safety and Quality: An Evidence-Based Handbook for Nurses (pp. 84-101). Rockville, MD: Agency for Healthcare Research and Quality.
4.3 Figure: Henriksen, K., Dayton, E., Keyes, M. A., Carayon, P., & Hughes, R. (2008). Understanding adverse events: A human factors framework. In R. G. Hughes (Ed.), Patient Safety and Quality: An Evidence-Based Handbook for Nurses (pp. 84-101). Rockville, MD: Agency for Healthcare Research and Quality.
4.4 Figure: Zhang, J., Patel, V. L., Johnson, T. R., & Shortliffe, E. H. (2004). A cognitive taxonomy of medical errors. Journal of Biomedical Informatics, 37(3), 193-204.
Unit 4: Human Factors and Healthcare References – Lecture b (Cont’d – 3)
Images
Slide 4: De, A. Retrieved on September 10, 2010 from http://www.flickr.com/photos/andyde/4762081047/sizes/l/#cc_license
Slide 11: NASA. (1986). Retrieved on September 10, 2010 from http://grin.hq.nasa.gov/ABSTRACTS/GPN-2004-00012.html
Slide 12: Cooling towers of Three Mile Island Unit 2 as of February 2014. (2014). Retrieved from https://commons.wikimedia.org/wiki/File:Cooling_towers_of_Three_Mile_Island_Unit_2.jpg
Slide 12: Markosian, D. (2016). A radioactive sign hangs on barbed wire outside a café in Pripyat. Retrieved from https://commons.wikimedia.org/wiki/File:VOA_Markosian_-_Chernobyl02.jpg
Slide 13: United States Navy. (2010). Controlled burn following explosion of Deepwater Horizon. Retrieved from https://commons.wikimedia.org/wiki/File:Defense.gov_photo_essay_100506-N-6070S-346.jpg