ML22070A159
PSAM14-391 Presentation on Generalizing Human Error Data with IDHEAS-G
Issue date: 07/31/2018
From: Y. James Chang, Jing Xing (NRC/RES/DRA/HFRB)
Contact: Jing Xing, 301-415-2410

Use of IDHEAS to Generalize Human Performance Data for Estimation of Human Error Probabilities

Jing Xing, Y. James Chang
US Nuclear Regulatory Commission
Presentation to AHFE, July 2018

What's next in human reliability analysis

- DATA, DATA, DATA

  • Existing human error data - from various fields, in different formats, varying context and levels of details
  • Data generalization and use for human reliability analysis - the Integrated Human Event Analysis System (IDHEAS) has an inherent structure for generalizing and integrating human error data

Human error data: The ideal world and reality

HEP (failure mode under specific context) = (# of errors of the failure mode) / (# of occurrences of the task under the context)
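In code form this is just a ratio of counts; a minimal Python sketch (the function name and the example counts are illustrative assumptions, not data from the presentation):

```python
def hep(n_errors: int, n_occurrences: int) -> float:
    """Point estimate of a human error probability for one failure mode:
    errors observed / task occurrences under the same, well-defined context."""
    if n_occurrences <= 0:
        raise ValueError("need observed task occurrences under the context")
    return n_errors / n_occurrences

# Hypothetical example: 3 errors in 1,500 repetitions of the same task -> HEP = 2E-3
print(hep(3, 1500))
```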
  • Ideal world:

- The same task for a failure mode is repeated thousands of times with the same people under the identical context;

- Do this for all possible contexts

In the ideal world:
- Failure modes: well-defined
- # Occurrence: known, sufficient number of task occurrences
- Context: clearly defined and repeated
- Variety: sufficient data for all failure modes and contexts

Human error data: The ideal world and reality

HEP (failure mode under specific context) = (# of errors of the failure mode) / (# of occurrences of the task under the context)
  • Reality:

X Failure modes unknown
X Number of occurrences not reported
X Context undocumented and/or unrepeated
X Lack of variety - limited failure modes / contexts tested
X Data sources not talking to each other

Type of human error data   Failure modes   # Occurrence   Context      Variety
Statistical                X               X              X
Human error analysis                       X              X
Operational database                                      Unrepeated   Limited
Experimental                                                           X

Examples of statistical data

  • Statistical study in 2016 - Medical errors are the third leading cause of death in the U.S., after heart disease and cancer, causing at least 250,000 deaths every year (Ref. 1)
  • United Kingdom - Nuclear power plant replacement of the Dungeness B Data Processing System - The installation team completed 22,000 plant connections to the new system with a less than 2% error rate (Ref. 3)

- X Occurrence of the tasks not reported

- X Failure modes unspecified

- X Context undocumented and unrepeated

Examples of human error analysis / root cause analysis

  • Percent of error types (failure modes) - Airplane maintenance errors (Ref. 6)

Installation error - 44%

Approved data not followed - 28%

Servicing error - 12%

Poor troubleshooting standards - 0.7%

Poor maintenance practices - 9%

Poor inspection standards - 5%

Misinterpretation of approved data - 2%

  • Percent of Airplane maintenance error contributing factors (Ref. 7)

Contributing factor categories: Information, Equipment, Configuration, Job/Task, Training, Individual, Environmental, Organization, Supervision, Communication

- Failure modes / contributing factors classified and ranked

- X Occurrence of the tasks not reported

- X Relation between failure modes / contributing factors unspecified

Examples of observed human error rates in operations (human performance databases)

  • Error rates for nuclear power plant maintenance tasks (Ref. 4):

- 1/7 for transporting fuel assemblies with the fuel handling machine

- 1/48 for removing a ground connection from a switchgear cabinet

- 1/888 for reassembly of component elements

  • Reported error rates in medical pharmacies (Ref. 5):

- 5% for failure to select ambiguously labeled control/package

- 2% for failed task related to values/units/scales/indicators

- 0.6% for procedural omission

- Human error rates reported for the failure modes

- X Relation between failure modes and contributing factors may be unspecified

Example: Human error rates in experimental studies

The effect of incomplete information on decision-making in simulated pilot de-icing (Ref. 8)

Task: Make a decision on de-icing in flight simulation under icing weather
Failure mode: Incorrectly select or use information for decision-making
Context: Incomplete or unreliable information (30%), time pressure
Results: Providing additional accurate information improves handling of icing encounters. Performance drops below the baseline when inaccurate information (high uncertainty) is provided in the decision aid.

% error      Accurate and additional information   Accurate and incomplete information   Inaccurate additional information
% stall      18.1                                  30                                    89
% recovery   26.7                                  63.8                                  75

- Failure modes, error rates, and specific context reported

- Quantitative impact of specific context factors reported

- X Not generalized for more complex context with multiple factors

What's next in human reliability analysis

- DATA, DATA, DATA

  • Existing human error data - from various fields, in different formats, varying context and levels of details
  • Data generalization and use for human reliability analysis - the Integrated Human Event Analysis System (IDHEAS) has an inherent structure for generalizing and integrating human error data

Generalizing human error data to inform human error probability estimation

HEP = f(states of performance influencing factors)

[Diagram: each data source contributes its own tasks, context, failure modes, and PIFs, which all map into a generic, adaptable set of failure modes and PIFs.]
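As a minimal sketch of what such a generic representation could look like in Python (class names and labels are hypothetical, and the failure-mode assignments are illustrative; the error rates reuse figures reported elsewhere in this presentation):

```python
from dataclasses import dataclass

@dataclass
class ErrorRecord:
    source: str          # study or operational database the data came from
    task: str            # task as described by the source
    failure_mode: str    # key into the generic set of cognitive failure modes
    pif_states: dict     # generic PIF -> state observed in the source context
    error_rate: float    # errors / occurrences, as reported or derived

# Generic, adaptable vocabulary (an illustrative subset, not the full IDHEAS-G set)
FAILURE_MODES = {
    "D2": "Fail to attend to sources of information",
    "D3": "Fail to perceive the information",
}

# Two sources in different original formats, expressed against the same vocabulary
rec1 = ErrorRecord("Ref. 10 driving simulation", "detect dangerous targets while driving",
                   "D3", {"multitasking": "persistent distraction"}, 0.07)
rec2 = ErrorRecord("vigilance dual-task experiment", "detect visual alarm targets",
                   "D2", {"teamwork": "single person"}, 0.29)
```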

Demonstration of IDHEAS-G cognitive failure modes

Failures of macrocognitive functions:
- Failure of Detection
- Failure of Understanding
- Failure of Decisionmaking
- Failure of Action Execution
- Failure of Teamwork

Cognitive failure modes for Failure of Detection:
- D1 - Fail to establish acceptance criteria
- D2 - Fail to attend to sources of information
- D3 - Fail to perceive the information
- D4 - Fail to verify and modify detection
- D5 - Fail to retain or communicate information

Behaviorally observable failure modes:
- D3-1 Primary information is not available
- D3-2 Key alarm or alert not attended to
- D3-3 Key information not perceived
- D3-4 Information misperceived (e.g., failing to discriminate signals, reading errors)
- D3-5 Parameters incorrectly monitored

Demonstration of IDHEAS-G PIF structure

Context categories: systems and environment; personnel / team / organization; task / situation

PIFs by category:
- Systems and environment: environmental factors, system opacity, information, tools and parts, HSI
- Personnel / team / organization: procedures, training, work process, organization factors, teamwork factors, teamwork infrastructure, distributed teams, communication equipment, communication protocol
- Task / situation: unfamiliar scenario; multitasking, interruption, and distraction; cognitive complexity; mental fatigue and stress; physical demands

Example PIF attributes: alarm not salient, mode confusion, key information masking, ambiguity of indicators

Generalizing human error data to IDHEAS-G cognitive failure modes (CFMs) and PIFs

[Diagram: data elements such as information, task complexity, training, HSI, procedures, and multitasking are mapped to PIF attributes and cognitive functions, and through them to cognitive failure modes (e.g., CFM1, CFM3).]

Evaluate data - PIF effects on human errors

Error factor (EF) = error rate at a poor state of the PIF / error rate at the nominal state

PIF - Multitasking, distraction, and interruption

Ref. 8 - Experiment on dual task: airplane pilots detecting a de-icing cue and responding to air traffic control information. Error rates for detecting the icing cue alone vs. dual-task:
- 2.8% vs. 21% missing the cue, EF = 7.2
- 5% vs. 20% missing changes, EF = 4
- 1% vs. 37% wrong diagnosis, EF = 37

Ref. 9 - Effect of interruption on target detection. Accuracy (SD) for no interruption vs. interruption:
- Simple spatial task: .726 (.21) vs. .803 (.11)
- Complex spatial task: .549 (.254) vs. .441 (.273)
- EF(weak interruption on detection) = 1.1 for the simple task, 0.9 for the complex task

Ref. 10 - Driving simulation with cell phone conversation. Missing dangerous targets: 2.5% without cell phone distraction vs. 7% with cell phone distraction; EF(persistent distraction) = 2.8

Ref. 11 - Experiment on performing sequences of action steps. Error rate = 0.15 for no interruption; 0.3 for a 2.8 s interruption, EF(interruption) = 2; 0.45 for a 4.4 s interruption, EF(longer interruption) = 3

Ref. 12 - The effect of interruption on driving and fighting in a military weapon system: 4% for no interruption vs. 8% with interruption; EF(interruption) = 2
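The EF entries above follow directly from the paired error rates; a minimal Python sketch of the computation (the function name is an assumption):

```python
def error_factor(rate_poor: float, rate_nominal: float) -> float:
    """EF = error rate at a poor state of the PIF / error rate at the nominal state."""
    if not (0 < rate_nominal <= 1 and 0 < rate_poor <= 1):
        raise ValueError("error rates must be probabilities greater than zero")
    return rate_poor / rate_nominal

# Reproducing two entries from the table above:
print(round(error_factor(0.07, 0.025), 1))  # Ref. 10, cell phone distraction -> EF = 2.8
print(round(error_factor(0.30, 0.15), 1))   # Ref. 11, 2.8 s interruption     -> EF = 2.0
```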

Interpret and represent human error data

PIF - Multitasking, distraction, and interruption

PIF states:
- Low impact: distraction, interruption
- Moderate impact: secondary task, prolonged interruption
- High impact: intermingled multitasking, concurrent multitasking

EF by macrocognitive function and PIF state:
- Detection: EF(weak interruption) = [0.9, 1.1] (low); EF(persistent distraction) = 2.8 (moderate); EF(dual-task) = [5, 7.5] (high)
- Understanding: EF(intermingled multitasking) = 37 (high)
- Decisionmaking: EF(interruption on simple decision) = 1.6; EF(interruption on complex decision) = 1.7
- Action Execution: EF(2.8 s interruption) = 2; EF(4.4 s interruption) = 3; EF(interruption) = 2
- Teamwork: undetermined; EF(interruption) = 2
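One simple way to apply such a table is to scale a nominal HEP by the EF matching the macrocognitive function and PIF state. The multiplicative model below is an illustrative assumption for this sketch, not the full IDHEAS-G quantification method; the table values are transcribed from above, with midpoints where a range was reported and state assignments approximated where the original layout was ambiguous:

```python
# EF by (macrocognitive function, PIF state) for multitasking, distraction,
# and interruption, transcribed from the table above.
EF_TABLE = {
    ("detection", "low"): 1.0,               # EF(weak interruption) = [0.9, 1.1]
    ("detection", "moderate"): 2.8,          # EF(persistent distraction)
    ("detection", "high"): 6.25,             # EF(dual-task) = [5, 7.5], midpoint
    ("understanding", "high"): 37.0,         # EF(intermingled multitasking)
    ("action execution", "low"): 2.0,        # EF(2.8 s interruption)
    ("action execution", "moderate"): 3.0,   # EF(4.4 s interruption)
}

def adjusted_hep(nominal_hep: float, function: str, pif_state: str) -> float:
    """Scale a nominal HEP by the applicable error factor (illustrative model only)."""
    ef = EF_TABLE.get((function, pif_state), 1.0)  # no data -> leave the HEP unchanged
    return min(ef * nominal_hep, 1.0)              # cap at 1.0: an HEP is a probability

# Hypothetical nominal HEP of 1E-2 for detection under concurrent multitasking:
print(adjusted_hep(0.01, "detection", "high"))  # -> 0.0625
```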

Integrating the data to inform PIF quantification

Example PIF - Multitasking, interruption, and distraction

[Charts: effect on the HEP as a function of the performance influencing factor, shown separately for Detection and for Understanding (diagnosis).]

Evaluate data - PIF effects on human errors

PIF - Teamwork factors

Nuclear waste handling facility maintenance and operation, supervisor verification error:
- Check-off sheet, low dependence: 1E-1
- Check-off sheet, medium dependence: 3E-1
- Check-off sheet, high dependence and stress: 5E-1
- EF(independent checking) = 5 for high dependence, 3 for medium dependence

Failure to restore from testing:
- Two persons, operator check: 5E-3
- Single person, operator check: 1E-2
- Single person, no check: 3E-2
- EF(no team verification) = 2

Failure to restore following maintenance:
- Two persons, operator check: 3E-3
- Single person, operator check: 5E-3
- Single person, no check: 5E-2
- EF(no team verification) = 1.7

Experiment of vigilance dual task - detecting targets (responding to visual alarms) while completing a jigsaw puzzle:
- Paired team, low target presentation speed: 19%
- Single person, low target presentation speed: 29%
- Paired team, high target presentation speed: 28%
- Single person, high target presentation speed: 38%
- EF(team detection) = 1.5 and 1.3 for low and high complexity

Evaluate data - PIF effects on human errors

PIF - Information completeness and correctness

ID 04 - Expert judgment of HEPs for NPP internal at-power events:
- HEP(information obviously incorrect) = 3E-2
- HEP(information not obviously incorrect) = 8E-2
- HEP(no misleading information) = 1E-3
- EF = 30 for information obviously incorrect; EF = 80 for information not obviously incorrect

ID 40 - Experimental study on supporting decision making and action selection under time pressure and information uncertainty in a pilot de-icing simulation:
- Error rate, percentage of early buffet: accurate information 7.87%; accurate but not timely information 20.56%; 30% inaccurate information 73.63%
- Error rate, percentage of stall: accurate information 18%; accurate but not timely information 30%; 30% inaccurate information 89%
- EF = 1.5, 2.5 for accurate but not-timely or not-organized information; EF = 5, 9 for 30% inaccurate information

Conclusions

  • Human error data are available and, while not perfect, can be used to inform the quantification of human error probabilities
  • IDHEAS provides a framework for generalizing human error data for HRA
  • We preliminarily generalized the data to inform the quantification of the effects of performance influencing factors on human error probabilities

References

1. Makary MA, Daniel M (2016). Medical error - the third leading cause of death in the US. BMJ, 353:i2139.
2. National Motor Vehicle Crash Causation Survey (2015). Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey. DOT HS 812 115.
3. Chokshi NN, Bailey JP, Johnson A, Quenot D, Le Gall JF (2010). Integration testing of safety-related systems: Lessons learnt from Dungeness B DPS replacement project. 5th IET International Conference on System Safety.
4. Civil Aviation Authority (2015). Aircraft Maintenance Incident Analysis. CAP 1367.
5. Hobbs A, Williamson A (2003). Associations between errors and contributing factors in aircraft maintenance. Human Factors, 45(2):186-201.
6. Preischl W, Hellmich M (2013). Human error probabilities from operational experience of German nuclear power plants. Reliability Engineering and System Safety, 109:150-159.
7. Rovira E, McGarry K, Parasuraman R (2007). Effects of imperfect automation on decision making in a simulated command and control task. Human Factors, 49(1):76-87.

Thank you!

Jing.xing@nrc.gov