P5 Preliminary Analysis Results of SACADA for HEPs
ML20083J313
Issue date: 03/12/2020
From: Y. James Chang, Office of Nuclear Regulatory Research
Shared Package: ML20083J304

Preliminary Analysis Results of SACADA Data for HEPs

Y. James Chang, Ph.D.
U.S. Nuclear Regulatory Commission
James.Chang@nrc.gov

Presented at the HRA Data Workshop, U.S. Nuclear Regulatory Commission, March 12-13, 2020

1

SACADA Development Background

Aim: collect operator performance information in simulator training to improve human error probability (HEP) estimates.

- Developed software for nuclear plants to use in operator simulator training
- Plant-collected information is shared with the NRC

Strength: collects a large number of data points, providing statistical evidence of the relation between context and cognitive performance.

Weakness: the data taxonomy needs to accommodate the constraints of operator simulator training.

Strength and weakness: information is entered by operations instructors and crews, not researchers.
- Strengths: (1) enables long-term, continuous data collection; (2) data are accurate because they are entered by the instructors and crews in the simulator training, immediately after the training.
- Weakness: data are entered in a time-constrained environment.

2

SACADA: Scenario Authoring, Characterization, and Debriefing Application

Data of Interest

  • Basic human error probabilities (HEPs) of the various macrocognitive functions (MCFs)
    - MCFs: Detection (Alarms vs. Indicators), Understanding/Diagnosis, Deciding, and Action
  • Effects of individual performance influencing factors (PIFs) and of combinations of PIFs
  • MCF-dependent effects
  • PIF combinations having a cliff effect
  • How does the combined PIF effect work (multiplication, addition, or something else)?
  • Effects not captured in the current data taxonomy

3

Data used in this analysis are partially available to the public at https://www.nrc.gov/data/

4

Performance Influencing Factors of Diagnosis (Partial List)

#  PIF                       PIF Attributes
1  Diagnosis Basis           o Skill-based   o Procedure-based   o Knowledge-based
2  Familiarity               o Standard   o Novel   o Anomaly
3  Information Integration   - Timing of information   - Ambiguous information   - Integration required
4  Information Specificity   o Specific   o Not specific
5  Information Quality       - Missing information   - Misleading information   - Conflicting information
6  Workload                  o Normal   o Concurrent demands   o Multiple concurrent demands
7  Time Criticality          o Expansive time available   o Nominal time available   o Barely adequate time available

5

Data Unit: Training Objective Elements
Example: Malfunction of SGTR in 1B SG (each Training Objective Element is listed with its MCF and UNSAT ratio)

1.  Diagnose SGTR in B SG (Diagnosis, 0/13)
2.  Direct a reactor trip and safety injection based on increasing RCS leakage (Diagnosis, 0/13)
3.  Performs immediate actions of 0POP05-EO-EO00, including RNO actions for throttle valve stuck open (Deciding, 0/13)
4-7. (0/13)
8.  Transitions to EO30 SGTR (Deciding, 0/13)
9.  Completes isolation of ruptured S/G (4 isolation tasks) (Action, 0/13)
10. Properly select and maintain target temperature for cooldown based on the chart provided in EO30 (Diagnosis, 0/13)
11. Directs/initiates RCS cooldown (Action, 0/13)
12. Directs/stops RCS cooldown and maintains < target temperature (Action, 0/13)
13. Depressurize RCS to meet SI termination criteria before either of the following occur: (1) SG PORV or safety valve opens, (2) SG narrow range level goes off-scale high (Action, 0/13)
14. Terminate SI and control RCS pressure and makeup flow so that RCS pressure is at SG pressure and stable before the end of the scenario (Deciding, 0/13)
15. Declares an Alert based on SGTR greater than the capacity of one CCP (FA1) (Diagnosis, 0/13)
16. Other items to discuss (None, 0/13)

6

Overview of the Data (as of 4/2019, available at https://www.nrc.gov/data/)

Cognitive Type           Data Points   # of Unique Contexts Available   # of Possible Unique Contexts*
Alarm detection          2099          61                               ~60,000
Indicator detection      2198          95                               ~80,000
Diagnosis/Understanding  2613          116                              ~3,000,000
Deciding                 6019          147                              ~60,000
Action                   5167          159                              ~3,000,000
TOTAL                    18096         578

7
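One way to read the "# of Possible Unique Contexts*" column: a context is a combination of attribute choices across the PIFs that apply to the macrocognitive function, so the count grows as the product of the attribute counts. A minimal sketch of that product in Python, using only the partial diagnosis PIF list shown earlier (illustrative, so the result is far smaller than the ~3,000,000 obtained from the full taxonomy):

    # Partial diagnosis PIF taxonomy, taken from the "Performance Influencing
    # Factors of Diagnosis (Partial List)" slide; the full SACADA taxonomy is larger.
    diagnosis_pifs = {
        "Diagnosis Basis": ["Skill-based", "Procedure-based", "Knowledge-based"],
        "Familiarity": ["Standard", "Novel", "Anomaly"],
        "Information Specificity": ["Specific", "Not specific"],
        "Workload": ["Normal", "Concurrent demands", "Multiple concurrent demands"],
        "Time Criticality": ["Expansive", "Nominal", "Barely adequate"],
    }

    # A unique context picks one attribute per PIF, so the number of possible
    # contexts is the product of the attribute counts.
    possible_contexts = 1
    for attributes in diagnosis_pifs.values():
        possible_contexts *= len(attributes)

    print(possible_contexts)  # 162 for this partial, illustrative list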

Overview of Operator Performance

[Figure: UNSAT ratio distributions for Detect Alarm, Detect Indicator, Diagnosis, Deciding, and Action, calculated based on the Training Objective Elements with the same context. X-axis: UNSAT ratio bins (< 0.01, 0.01-0.1, 0.1-0.2, 0.2-0.3, 0.3-0.4, 0.4-0.5); y-axis: portion of data points (0 to 1).]

8

Preliminary Analysis Results (grouped based on same context, in https://www.nrc.gov/data/)

MCF                    Average UNSAT Ratio (All Contexts)   Highest Value* (A Specific Context)   Highest Value / Average
Detection (Alarm)      6.2E-3 (13/2099)                     0.19 (5/26)                           31
Detection (Indicator)  7.7E-3 (17/2198)                     0.21 (3/14)                           27
Diagnosis              1.1E-2 (28/2613)                     0.5 (3/6)                             45
Decisionmaking         5.3E-3 (32/6019)                     0.13 (2/15)                           25
Action                 1.1E-2 (59/5167)                     0.33 (4/12)                           30

* Calculated based on the crew performance of a Training Objective Element of a specific scenario.

9
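The ratios above are UNSAT counts divided by opportunities, pooled per MCF for the averages and grouped by identical context for the highest values. A minimal sketch of that grouping in pandas, assuming a hypothetical flat record layout (the column names mcf, context, and unsat are illustrative, not the actual SACADA schema):

    import pandas as pd

    # Hypothetical records: one row per crew opportunity on a Training Objective
    # Element; "context" stands for the combination of PIF statuses and "unsat"
    # is 1 if the crew's performance was dispositioned UNSAT, else 0.
    records = pd.DataFrame({
        "mcf":     ["Diagnosis"] * 6 + ["Action"] * 4,
        "context": ["A", "A", "A", "B", "B", "B", "C", "C", "C", "C"],
        "unsat":   [0, 0, 1, 0, 0, 0, 1, 0, 0, 0],
    })

    # UNSAT ratio per (MCF, context) group: UNSAT count / number of opportunities.
    per_context = records.groupby(["mcf", "context"])["unsat"].agg(["sum", "count"])
    per_context["unsat_ratio"] = per_context["sum"] / per_context["count"]

    # Pooled average per MCF and the highest single-context ratio per MCF.
    average_all_contexts = records.groupby("mcf")["unsat"].mean()
    highest_context = per_context.groupby("mcf")["unsat_ratio"].max()
    print(average_all_contexts, highest_context, sep="\n\n")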

Single PIF Effects

  • Based on the available data, the effects were calculated by changing the status of one and only one PIF (before and after); a minimal calculation sketch follows this slide.
  • The effects may not be caused solely by the difference in that PIF's status, because the data populations before and after the change are exclusive of each other.
  • Many context groups have zero UNSATs; these are not discussed in this analysis.
  • The directions of the effects are generally as expected.

10
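A minimal sketch of that one-PIF comparison, using the same hypothetical record layout as above; it is simply a ratio of two UNSAT ratios, which is why differences in the other, unmatched PIFs can leak into the estimate:

    import pandas as pd

    def single_pif_effect(records: pd.DataFrame, pif: str, status_1: str, status_2: str) -> float:
        """Ratio of the UNSAT ratio at status_2 to the UNSAT ratio at status_1.

        The two populations are different data points, so the result may also
        reflect differences in the other PIFs' statuses.
        """
        ratio_1 = records.loc[records[pif] == status_1, "unsat"].mean()
        ratio_2 = records.loc[records[pif] == status_2, "unsat"].mean()
        return ratio_2 / ratio_1

    # The same arithmetic with the alarm-detection counts quoted on the next slide:
    dark = 2 / 953        # UNSAT Ratio 1, Alarm Board Status = Dark
    overloaded = 6 / 155  # UNSAT ratio, Alarm Board Status = Overloaded
    print(round(overloaded / dark, 1))  # 18.4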

Single PIF Effects - Alarm Detection (change one PIF's status only)

PIF                 Status              UNSAT Ratio       Ratio to Status 1
Alarm Board Status  Dark                2.1E-3 (2/953)    1
                    Busy                5.0E-3 (5/991)    2.4
                    Overloaded          3.9E-2 (6/155)    18.4
Detection Mode      Self-Revealing      2.1E-3 (4/1872)   1
                    Awareness/Inspect   5.1E-2 (9/177)    23.8

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

11

Single PIF Effects - Indicator Detection (change one PIF's status only)

PIF             Status                    UNSAT Ratio        Ratio to Status 1
Detection Mode  Procedure-Directed Check  3.4E-3 (3/870)     1
                Awareness/Inspection      6.4E-3 (5/782)     1.9
                KB-Driven Monitoring      1.9E-2 (8/432)     5.4
Change Degree   Distinct Change           1.0E-2 (13/1291)   1
                Slight Change             4.4E-3 (4/907)     0.44
Meter Type      Meter                     2.4E-2 (11/463)    1
                Computer                  2.9E-3 (3/1019)    0.1

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

12

Single PIF Effects - Diagnosis (1/2) (change one PIF's status only)

PIF                      Status      UNSAT Ratio        Ratio to Status 1
Diagnosis Familiarity    Standard    7.6E-3 (13/1718)   1
                         Novel       8.8E-3 (7/800)     1.2
                         Anomaly     1.2E-1 (8/69)      15.3
Poor Information Timing  Not Exist   9.5E-3 (24/2524)   1
                         Exist       4.5E-2 (4/89)      4.7
Ambiguous Information    Not Exist   8.1E-3 (19/2350)   1
                         Exist       3.4E-2 (9/263)     4.2

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

13

Single PIF Effects - Diagnosis (2/2) (change one PIF's status only)

PIF                      Status        UNSAT Ratio        Ratio to Status 1
Integration Required     Not Exist     9.5E-3 (23/2429)   1
                         Exist         2.7E-2 (5/184)     2.9
Information Specificity  Specific      7.7E-3 (10/1293)   1
                         Not Specific  1.5E-2 (16/1077)   1.9

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

14

Single PIF Effects - Decisionmaking (change one PIF's status only)

PIF                   Status              UNSAT Ratio        Ratio to Status 1
Decision Familiarity  Standard            5.1E-3 (24/4691)   1
                      Adaption Required   6.6E-3 (7/1062)    1.3
                      Anomaly             1.1E-2 (1/92)      2.1
Decision Uncertainty  Clear               4.9E-3 (22/4460)   1
                      Uncertain           6.3E-3 (4/634)     1.3
                      Competing Priority  8.7E-3 (6/693)     1.8

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

15

Single PIF Effects - Action (change one PIF's status only)

PIF                                Status                   UNSAT Ratio        Ratio to Status 1
Action Type                        Simple and Distinct      1.0E-2 (22/2151)   1
                                   Order                    1.1E-2 (21/1876)   1.1
                                   Maintaining              1.5E-2 (12/793)    1.5
Action Guidance                    Procedure-Based          1.1E-2 (55/4786)   1
                                   Skill-of-the-Craft       8.9E-3 (2/224)     1.3
                                   STAR (Faulted Hardware)  1.3E-2 (2/157)     3.2
Additional Mental Effort Required  Not Exist                1.0E-2 (52/4960)   1
                                   Exist                    3.4E-2 (7/207)     3.2

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

16

Workload Effects in Various MCFs

MCF                  Normal (1)        Concurrent Demands (2)   (2)/(1)   Multiple Concurrent Demands (3)   (3)/(1)
Alarm detection      1.2E-3 (1/801)    9.6E-3 (12/1255)         7.7       0 (0/29)                          -
Indicator detection  2.8E-3 (2/711)    7.8E-3 (10/1289)         2.8       2.5E-2 (5/198)                    9.0
Diagnosis            1.3E-2 (9/674)    8.5E-3 (15/1759)         0.6       2.9E-2 (4/138)                    2.2
Deciding             3.7E-3 (5/1342)   5.9E-3 (23/3899)         1.6       5.2E-3 (4/762)                    1.4
Action               9.2E-3 (9/983)    9.4E-3 (30/3181)         1.0       2.0E-2 (20/987)                   2.2

Note: the other PIFs' statuses are likely different across the above data points, so the effects (UNSAT Ratio 2 / UNSAT Ratio 1) may include the effects of other factors.

PIF effects may be MCF-dependent; detailed analyses are needed for confirmation.

17

PIF Combined Effects - Alarm Detection

Single-PIF reference values:

PIF                         Status                        UNSAT Ratio        Ratio to Status 1
Alarm Board Status          Dark                          2.1E-3 (2/953)     1
                            Busy                          5.0E-3 (5/991)     2.4
                            Overloaded                    3.9E-2 (6/155)     18.4
Workload (Alarm Detection)  Normal                        1.2E-3 (1/801)     1
                            Concurrent Demands            9.6E-3 (12/1255)   7.7
                            Multiple Concurrent Demands   0 (0/29)           -

Combined (Workload x Alarm Board Status); cells show the UNSAT ratio, (UNSAT/total), and multipliers:

Workload \ Alarm Board Status   Dark                       Busy                          Overloaded
Normal                          1.5E-3 (1/687)  1          0 (0/111)                     0 (0/3)
Concurrent Demands              4.0E-3 (1/252)  2.7 / 1    5.7E-3 (5/880)  3.9 / 15.8    4.9E-2 (6/123)  33.5 / 12.3
Multiple Concurrent Demands     (0/0)                      (0/0)                         0 (0/29)

18

PIF Combined Effects - Indicator Detection

Single-PIF reference values:

PIF                             Status                        UNSAT Ratio        Ratio to Status 1
Detection Mode                  Procedure-Directed Check      3.4E-3 (3/870)     1
                                Awareness/Inspection          6.4E-3 (5/782)     1.9
                                KB-Driven Monitoring          1.9E-2 (8/432)     5.4
Workload (Indicator Detection)  Normal                        2.8E-3 (2/711)     1
                                Concurrent Demands            7.8E-3 (10/1289)   2.8
                                Multiple Concurrent Demands   2.5E-2 (5/198)     9.0

Combined (Workload x Detection Mode); cells show the UNSAT ratio, (UNSAT/total), and the multiplier relative to the cell marked 1:

Workload \ Detection Mode     Procedure-Directed Check   Awareness/Inspection    KB-Driven Monitoring
Normal                        0 (0/324)                  4.3E-3 (1/235)  1.6     0 (0/50)
Concurrent Demands            2.7E-3 (1/376)  1          7.5E-3 (4/533)  2.8     1.4E-2 (5/366)  5.1
Multiple Concurrent Demands   1.2E-2 (2/170)  4.4        0 (0/14)                0.21 (3/14)  80.6

19
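A minimal sketch of how a two-PIF cross table like these could be assembled, again with hypothetical column names (workload, detection_mode, unsat); each cell is the UNSAT ratio over the records that hold both statuses at once:

    import pandas as pd

    # Hypothetical per-opportunity records; column names are illustrative, not
    # the actual SACADA schema.
    records = pd.DataFrame({
        "workload":       ["Normal", "Normal", "Concurrent", "Concurrent", "Multiple", "Multiple"],
        "detection_mode": ["Procedure", "Awareness", "Procedure", "KB-driven", "KB-driven", "KB-driven"],
        "unsat":          [0, 0, 0, 1, 1, 0],
    })

    # Cross tables: UNSAT ratio (mean of the 0/1 flag) and sample size per cell.
    ratios = pd.pivot_table(records, values="unsat", index="workload",
                            columns="detection_mode", aggfunc="mean")
    counts = pd.pivot_table(records, values="unsat", index="workload",
                            columns="detection_mode", aggfunc="count")
    print(ratios, counts, sep="\n\n")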

Are PIF Effects Context Independent?

Single-PIF effects:

PIF         Poor Info Timing   Ambiguous Info   Require Integration   Info Not Specific
PIF Effect  4.7                4.2              2.9                   1.9

Combined contexts; cells show the multiplier relative to the cell marked 1, with (UNSAT/total):

                                           Poor Info Timing: No              Poor Info Timing: Yes
Require Integration   Info Not Specific    Ambiguous: No    Ambiguous: Yes   Ambiguous: No   Ambiguous: Yes
No                    No                   1 (3/1116)       (0/24)           31 (3/36)       (0/0)
No                    Yes                  2.7 (6/812)      16 (9/209)       (0/13)          (0/0)
Yes                   No                   12.7 (4/117)     (0/0)            (0/0)           (0/0)
Yes                   Yes                  (0/0)            (0/0)            (0/0)           (0/0)

20

Single and Combined PIF Effects

  • A PIF's effect should be evaluated under conditions in which the other factors' statuses remain identical; otherwise, the results may be misleading.
  • Based on the data currently available for analysis, it cannot be concluded whether the combined PIF effect is additive or multiplicative; a sketch of that comparison follows this slide.

21
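A minimal sketch of the additive-versus-multiplicative comparison, using the diagnosis numbers read from the context-independence table above (baseline 3/1116 with none of the four PIFs present, single-PIF effects of 4.2 for ambiguous information and 1.9 for non-specific information, and the observed cell 9/209 with both present); it only illustrates the comparison, not a conclusion:

    baseline = 3 / 1116       # UNSAT ratio with none of the four PIFs present
    m_ambiguous = 4.2         # single-PIF effect, ambiguous information
    m_not_specific = 1.9      # single-PIF effect, information not specific
    observed = 9 / 209        # UNSAT ratio with both PIFs present (same context group)

    multiplicative = baseline * m_ambiguous * m_not_specific
    additive = baseline * (1 + (m_ambiguous - 1) + (m_not_specific - 1))

    print(f"observed        {observed:.2e}")        # ~4.3e-02
    print(f"multiplicative  {multiplicative:.2e}")  # ~2.1e-02
    print(f"additive        {additive:.2e}")        # ~1.4e-02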

Support IDHEAS-ECA Method

MCF                    Average UNSAT Ratio (1)   IDHEAS-ECA Base HEP (2)   (1)/(2)
Detection (Alarm)      6.2E-3                    1E-4                      62
Detection (Indicator)  7.7E-3                    1E-4                      77
Diagnosis              1.1E-2                    1E-3                      11
Decisionmaking         5.3E-3                    1E-4                      53
Action                 1.1E-2                    1E-3                      11

All IDHEAS-ECA base HEPs are lower than the corresponding SACADA average UNSAT ratios.
Ideally, data collected under optimal operational conditions should be used to inform base HEPs. Indirect indications may be able to inform base HEPs given the limited data.

22

Does Task Importance Affect Human Performance?

SACADA data do not support the hypothesis that operator performance is affected by task importance. The table shows statistics of SACADA TOEs classified into four importance levels. Column (1) counts the scenario-specific TOEs that both had an UNSAT ratio greater than 10% and were dispositioned UNSAT by >= 2 crews. The higher (1)/(2) value for critical tasks (the most important tasks) is attributed to scenario design: critical tasks typically appear at the end of scenarios, where the situation is generally more complex.

Importance Level        # of Items (1)   Total # of Items (2)   (1)/(2)
1 (Critical task)       2                71                     2.8%
2 (Safety related)      11               822                    1.3%
3 (Professional tasks)  20               1619                   1.2%
4 (General)             8                559                    1.4%

23

Qualitative Context Related to TOEs with High UNSAT Ratios

General suspects, with scenario- and system-specific information:

  • Unfamiliarity + information masking
    - Failure to trip RCPs because the tripping criteria were determined not to be met
      • HHPI was running and indicating flow
      • The local valve was closed, so the flow did not reach the targeted location
      • It was not trivial to identify that the flow did not reach the target area
  • Multiple concurrent tasks
    - Failure to trip the ESF DG manually before its automatic trip
      • Required monitoring the ESF DG parameters and detecting that the lube oil temperature was increasing abnormally
      • The crew was tied up responding to a previous malfunction

24

Ongoing SACADA Work

  • Recently completed SACADA-2 tool development (simulator data portion)
  • Welcome collaborations on using SACADA to collect operator performance data
  • Analyzing SACADA data to improve IDHEAS-ECA
  • Welcome collaborations on analyzing SACADA data

Contact:

James.Chang@nrc.gov 25