ML23355A249
=Text=
A regulatory perspective: Have we done enough on grasping automation failure?
Jing Xing, Niav Hughes Green
U.S. Nuclear Regulatory Commission
Jing.xing@nrc.gov, Niav.Hughes@nrc.gov
Abstract: This paper responds to Skraaning & Jamieson's target paper, "The Failure to Grasp Automation Failure." We acknowledge that the target paper made several important contributions to automation research in the human factors community: i) it analyzed automation failure events in complex operational systems, in contrast to the vast majority of laboratory research on human-automation interaction; ii) the analysis was performed with a human performance framework that goes beyond traditional human information processing models; iii) the analysis focuses on the causes and mechanisms of automation failure, and the paper presented an initial taxonomy of automation failure, which expands the themes of human-automation research regarding enhancements to automation implementation; iv) the analysis and taxonomy demonstrate the integration of approaches to grasping automation failures from system instrumentation and controls, human factors engineering, and human reliability analysis. We present an overview of the regulatory framework related to the use of automation in nuclear power plants and examine whether the framework elements adequately address the failure to grasp automation failure, using the taxonomy in the target paper. Overall, we believe that the target paper could enhance the consideration of potential automation failures in the design and regulatory review process of automation technologies.
: 1. Introduction
This paper responds to Skraaning & Jamieson's target paper from the perspective of regulatory applications. The authors of this paper have worked at the U.S. Nuclear Regulatory Commission (NRC), Office of Nuclear Regulatory Research, for nearly two decades; the content of this paper represents the authors' views, not the NRC's technical opinions. Both have research backgrounds in cognitive science and human factors engineering, and both lead the development of regulatory guidance and methods for reviewing human factors engineering (HFE) and human reliability analysis (HRA) in the design of new technologies in nuclear power plants (NPPs). The regulatory guidance and methods require a technical basis grounded in state-of-the-art research, and studies of human-automation interaction in complex process control can address the challenges in regulatory activities. The target paper presents results that can enhance the technical basis for reviewing automation technologies in NPPs. Before discussing the target paper, we provide a landscape of the NRC's regulatory activities related to the emergent use of automation.
Advanced nuclear reactor technologies present new challenges. High-level automation is expected to be prevalent in advanced NPPs and in the modernization of existing NPP control rooms. The NRC has approved designs proposing higher levels of automation, including the Westinghouse AP1000 [1] and NuScale [2]. Two AP1000 units are authorized for operation by the NRC [3, 4, 5]; each unit is operated from a nearly fully digital control room. Meanwhile, modernization activities also engender implementations of control room automation. For example, modifications to traditional plants in the U.S. seek to employ DI&C in safety systems. Knowledge about the implications of automation implementations for operator performance will enhance the technical basis for the NRC's regulatory activities.
Modernization efforts involve using digital instrumentation and controls (DI&C), as well as automating operator manual actions. Novel elements of a design will likely include more advanced automation and thus will be targeted for HFE review to determine whether the applicant has reasonably assured the effective integration of automation and operators, and that the design supports safe operations. The NRC has guidance, DI&C-ISG-06 [6], for reviewing license amendment requests associated with safety-related DI&C equipment modifications. For modifications that may involve HFE considerations, an HFE safety evaluation should be performed in accordance with NUREG-0711 [7], "Human Factors Engineering Program Review Model," and NUREG-1764 [8], "Guidance for the Review of Changes to Human Actions." More recently, the NRC staff developed the draft guidance "Development of Scalable Human Factors Engineering Review Plans" (DRO-ISG-2023-03) [10], under the Risk-Informed, Technology-Inclusive Regulatory Framework [9], for HFE review of advanced reactors. The guidance provides a risk-informed and performance-based process to identify the most safety-significant systems, systems with likely human factors challenges, and novel elements of the design.
Research has been conducted on human-automation systems for NPP control rooms. For example, the Halden Human Technology Organization (Halden HTO) at the Institute for Energy Technology (IFE) consolidated its two decades of research results studying human-automation interaction in NPP simulators [11]. Several organizations have reported human performance studies of computerized procedures with embedded automation functionality [12]. Operational experience review has documented many DI&C and automation events in NPPs; most DI&C events involve issues in HFE considerations. There is also research on automated vehicles related to the mitigation of attribution error in the design of automated and autonomous nuclear power plants [13]. The target paper by Skraaning and Jamieson, "The Failure to Grasp Automation Failure" [14], reviewed recent aviation accidents involving automation failures and proposed an initial taxonomy of automation failure and automation-related human performance challenges. Xing and Green [15] reported that deficiencies of human-automation integration led to automation failure events in several ways: i) the automation system worked as expected, but deficiencies in human-automation interaction led to human failures; ii) automation failures led to, or aggravated, human-automation integration deficiencies, which in turn led to humans failing to identify or recover from automation failures; iii) the DI&C element had unusual behaviors or deviated from operators' understanding, thus leading to or aggravating human-automation integration deficiencies and then leading to human failures. Taken together, these different sources of information can provide an understanding of automation reliability and how that reliability impacts operators' use of automation.
In 2022, an interdisciplinary team of NRC staff working in DI&C, HFE, and risk analysis systematically evaluated the findings from investigative reports of the Boeing 737 crashes [16]. The team examined the recommendations in these investigative reports for their potential implementation in the NRC's DI&C regulatory process. The team recommended four areas of focus to enhance DI&C licensing and regulatory oversight, two of which address HFE in DI&C reviews:
- The NRC staff should continue to improve integration and communication among DI&C technical reviews, HFE reviews, and subsequent inspection oversight for new or significantly different applications, from conception to installation.
- The NRC staff should develop guidance for assessing systems engineering approaches for DI&C design and human factors life-cycle evaluation, which are important for ensuring that approved DI&C designs are appropriately integrated to maintain safety functionality.
These recommendations underscore the importance of effective communication and integration between HFE and DI&C for the review of advanced reactor technologies.
: 2. General comments on the target paper from regulatory application perspectives
We assert that the target paper made several important contributions to human-automation research:
i) The authors analysed automation failure events in complex operational systems, in contrast to most laboratory research on human-automation interaction. Previous studies and our own experience developing human factors engineering guidance demonstrated that results from human-automation-interaction laboratory experiments in simple multi-task contexts generally do not predict human performance with automation in complex process control systems [16].
ii) The authors analysed automation failure accidents with a human performance framework that goes beyond traditional human information processing models. The human performance framework considers the broad context of complex cognitive process tasks, while human information processing models typically represent only a slice of the overall cognitive processes involved in complex operational tasks.
iii) The analysis focuses on the causes and mechanisms of automation failure; the taxonomy of automation failure expands the themes of human-automation research, which have focused on making automation work better. Laboratory experiments prevalently study the effects of human-automation interaction characteristics, such as level or type of automation, on measures such as workload, situational awareness, and level of trust. However, there has been little evidence that those characteristics and measures can predict human performance, in particular the reliability of humans' ability to grasp automation failure.
iv) The taxonomy incorporates findings on automation systems, design logic and components, DI&C, human-automation integration, and human and organizational factors. It calls attention to the fact that grasping automation failure needs integrated approaches from all these aspects. This echoes the recommendations made by the NRC team on the integration of DI&C, HFE, and risk analysis.
v) The taxonomy could inspire experimental research scenarios that are more industry relevant. It could serve as a roadmap for the design of future domain-specific experimental research aimed at how different automation features influence performance in complex operational contexts. System designers could use the framework to anticipate and identify systemic automation failures. Likewise, after proper evaluation, regulators may consider using a similar taxonomy as a checklist to evaluate whether a particular design is vulnerable to systemic automation failures.
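The checklist idea above can be made concrete with a minimal sketch. The column names below follow the target paper, but the individual items, the `TAXONOMY` structure, and the `review` helper are our own hypothetical rendering, not an official or complete checklist:

```python
# Hypothetical sketch: a taxonomy-based review checklist.
# Column names follow the target paper; the example items are
# paraphrased from this commentary, not an official instrument.

TAXONOMY = {
    "Elementary Automation Failures": [
        "Automatic functions are missing or lost",
        "Loss of power supply to automation",
    ],
    "Systematic Automation Failures": [
        "Automation works as intended but operates outside the design basis",
        "Automatic systems work in parallel but compromise each other",
    ],
    "Human-Automation Interaction Breakdowns": [
        "Automation provides misleading support to operators",
        "Critical operator actions are unsuitably blocked by automation",
    ],
}

def review(design_findings):
    """Return taxonomy items not yet addressed by the review findings.

    design_findings maps a taxonomy item to True once the design has
    been evaluated against it; anything absent or False remains open.
    """
    open_items = {}
    for column, items in TAXONOMY.items():
        missing = [i for i in items if not design_findings.get(i, False)]
        if missing:
            open_items[column] = missing
    return open_items
```

A reviewer would mark items as evaluated and ask about whatever `review` still reports as open, which is essentially the "checklist for asking questions" use case suggested above.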
: 3. Potential to enhance regulatory guidance for reviewing new NPP technologies
3.1 A Framework for Integrating DI&C and Human Factors Engineering
Regardless of the vast landscape of advanced technologies, NPP operations share critical functional aspects in common:
* Operators work within complex control systems;
* Operators are in control of the systems, although the systems can run at high or full automation modes;
planning, manipulation/control, and teamwork; use of higher levels of automation may result in more monitoring tasks for the human operator versus manipulation and control types of tasks.
* Operator responses are procedure-based.
Xing and Green [15] generalize the NRC's regulatory and licensing activities in DI&C, HFE, and HRA into a framework depicted in Figure 1. This framework represents how human-automation integration works: i) the DI&C elements achieve the functions of automation systems, and behaviors of DI&C elements impact elements of human-automation integration; ii) human-automation integration should ensure that the automation functions do not cause human errors and that failures of DI&C elements do not propagate to human failures; iii) the human cognition system should ensure that human operators are capable of identifying and recovering from automation failures. In the diagram, the box on the left represents DI&C elements that constitute an automation system, the box in the middle represents HFE elements that support human-automation interaction, and the box on the right represents the elements of human cognitive task performance.
Figure 1. A framework for integrating DI&C systems and HFE
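The left-to-right logic of Figure 1 can be illustrated with a small sketch. This is our own illustration, not an NRC tool; the class names, fields, and the pass/fail logic are all hypothetical simplifications of the framework's three conditions:

```python
from dataclasses import dataclass

# Hypothetical sketch of the Figure 1 logic: a DI&C element failure
# should be caught either by the HFE layer (the failure is made
# visible to operators) or by operator detection and recovery.
# All fields are illustrative, not regulatory criteria.

@dataclass
class DICFailure:
    name: str
    visible_to_operator: bool   # HFE layer: is the failure annunciated?

@dataclass
class OperatorCapability:
    can_detect: bool
    can_recover: bool

def propagates_to_human_failure(fault: DICFailure, op: OperatorCapability) -> bool:
    """True if the DI&C failure is likely to end in a human failure."""
    if not fault.visible_to_operator:
        return True                     # silent failure: operator cannot act
    return not (op.can_detect and op.can_recover)
```

For instance, a silent sensor fault propagates regardless of crew skill, which mirrors condition ii) above: the human-automation integration layer must keep DI&C failures from reaching the operator unseen.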
The taxonomy of the target paper is organized by automation-induced human performance challenges in four columns: 1) Elementary Automation Failures, 2) Systematic Automation Failures, 3) Human-Automation Interaction Breakdowns, and 4) Negative Human Performance Outcomes that may occur in the presence of automation and may result from a lack of adherence to human factors design principles. Next, we discuss how the target paper taxonomy may enhance the elements in the framework presented in Figure 1.
3.2 DI&C Systems
The NRC staff conducts DI&C design reviews to ensure DI&C system safety for NPP operations. The DI&C design review identifies potential design hazards and analyzes system or component failure modes. The first column of the taxonomy contains Elementary Automation Failures, such as "Automatic functions are missing or lost" and "Loss of power supply to automation," corresponding to design hazards of automation systems. The listed hazards can enhance existing hazard identification methods that are not specifically developed for automation systems.
The second column of the taxonomy contains Systematic Automation Failures. Examples are "Automation works as intended but operates outside the design basis" and "Automatic systems work in parallel but compromise each other." This list can enrich our understanding of potential failure modes in DI&C elements of an automation system and help identify potential automation failures that induce human performance challenges.
3.3 Human-Automation Integration
The NRC staff review human-automation integration using guidance contained in the Human Factors Engineering Program Review Model documented in NUREG-0711 [7]. The HFE model includes twelve elements, some of which are shown in the middle box of Figure 1. The element Functional Requirements Analysis and Function Allocation is most relevant to automation design. NUREG-0711 states, "The purpose of this element is to verify that the applicant defined those functions that must be carried out to satisfy the plant's safety goals, and that the assignment of responsibilities for those functions (function allocation) to personnel and automation takes advantage of human strengths and avoids human limitations." The currently proposed 10 CFR Part 53 rule § 53.440(n)(4) states, "A functional requirements analysis and function allocation must be used to ensure that plant design features address how safety functions and functional safety criteria are satisfied, and how the safety functions will be assigned to appropriate combinations of human action, automation, active safety features, passive safety features, or inherent safety characteristics." [9]
The third column of the taxonomy presents a list of Human-Automation Interaction Breakdowns. Examples are "Automation provides misleading support to operators" and "Critical operator actions are unsuitably blocked by automation." Essentially, all the listed breakdowns are deficiencies in functional requirements analysis and function allocation. The HFE review can benefit by incorporating the breakdowns as a checklist for asking questions.
Similarly, the taxonomy can also help the HFE review of the element on Human Factors Verification and Validation, which states that "Verification and validation (V&V) evaluations comprehensively determine that the final HFE design conforms to accepted design principles and enables personnel to successfully and safely perform their tasks to achieve operational goals." When present in a design, the listed breakdowns represent design deficiencies that should be captured in V&V. The current HFE process described in NUREG-0711 focuses on ensuring that state-of-the-art human factors principles are incorporated into the design and implementation, which is verified during V&V. In contrast, grasping automation failures requires risk-focused consideration of failure modes and failure causes.
3.4 Human Reliability Analysis
The fourth column of the taxonomy lists examples of Human and Organizational Slips/Misconceptions, such as "Faulty operator programming of automation" and "Inadvertent activation / deactivation of automation," which are not specifically addressed in our HFE review process. They are, however, addressed in HRA. The NRC's new HRA methodology, the Integrated Human Event Analysis System (IDHEAS) [18, 19], uses 20 performance influencing factors (PIFs), as shown in Table 1, to model human performance.
Each PIF is represented by a set of attributes, which describe the ways the PIF challenges human performance and increases the likelihood of human errors. We found that all the listed slips/misconceptions are included in IDHEAS PIF attributes. Since the target paper indicates that the list is an initial subset of a larger pool, we recommend that future work consider using the IDHEAS PIF structure to organize human and organizational slips/misconceptions.
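Our recommendation to organize slips/misconceptions by the IDHEAS PIF structure could look like the following sketch. The two example assignments are our own illustrative judgments, not published IDHEAS attribute mappings:

```python
# Hypothetical sketch: organizing the taxonomy's human/organizational
# slips under IDHEAS PIF categories. The assignments below are
# illustrative examples only, not official IDHEAS mappings.

SLIP_TO_PIF = {
    "Faulty operator programming of automation":
        ("Personnel", "Training"),
    "Inadvertent activation / deactivation of automation":
        ("System", "Human-system interfaces"),
}

def group_by_pif_category(mapping):
    """Group slips by their top-level PIF category (see Table 1)."""
    grouped = {}
    for slip, (category, pif) in mapping.items():
        grouped.setdefault(category, []).append((pif, slip))
    return grouped
```

Grouping a growing pool of slips this way would keep the taxonomy's fourth column aligned with the PIF attribute structure already used in HRA quantification.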
Table 1. Performance Influencing Factors in the IDHEAS Method
Environment and situation:
* Workplace accessibility and habitability
* Workplace visibility
* Noise in workplace and communication pathways
* Cold/heat/humidity
* Resistance to physical movement
System:
* System and I&C transparency
* Human-system interfaces
* Equipment and tools
Personnel:
* Staffing
* Procedures, guidelines, and instructions
* Training
* Teamwork and organizational factors
* Work processes
Task situation:
* Information availability and reliability
* Scenario familiarity
* Multi-tasking, interruption, and distraction
* Task complexity
* Mental fatigue
* Time pressure and stress
* Physical demands
Finally, we see that one piece missing from the taxonomy is the "so what": a description of the human failures that result from the challenges or misconceptions. We recommend that the target paper authors consider incorporating IDHEAS cognitive failure modes to represent human failures in using automation. The cognitive failure modes represent failures of macrocognitive functions, which are the basic cognitive elements needed to achieve complex operational tasks. The cognitive failure modes are human-centered; thus, they can be used to model human failures in any automation system.
IDHEAS cognitive failure modes consist of the failures of the following five macrocognitive functions:
* Detection (D) is noticing cues or gathering information in the work environment.
* Understanding (U) is integrating the acquired information with one's mental model to make sense of the situation.
* Decisionmaking (DM) is making judgments, selecting strategies, or developing plans.
* Action execution (E) is the implementation of the decision or plan to change some physical component or system.
* Interteam coordination (T) focuses on how various teams interact and collaborate on an action.
Each macrocognitive function is achieved through a set of basic cognitive processors. IDHEAS uses the failure of the processors as detailed cognitive failure modes, as shown in Table 2. These detailed cognitive failure modes can specifically model types of human failures in human-automation integration.
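As a brief sketch of how these failure modes could serve as machine-readable tags for automation failure events, consider the following. The encoding itself is hypothetical; only the function labels (D, U, DM, E, T) come from IDHEAS:

```python
from enum import Enum

# Hypothetical sketch: tagging an automation-related event with the
# IDHEAS macrocognitive function whose failure it represents. The
# Enum values are the IDHEAS function labels; tag_event is ours.

class MacroFunction(Enum):
    DETECTION = "D"
    UNDERSTANDING = "U"
    DECISIONMAKING = "DM"
    ACTION_EXECUTION = "E"
    INTERTEAM_COORDINATION = "T"

def tag_event(description: str, function: MacroFunction, mode: int) -> str:
    """Return a short tag such as 'D2: ...' for event trend analysis."""
    return f"{function.value}{mode}: {description}"
```

Tags of this form would let operational experience databases aggregate automation failure events by the cognitive failure mode involved, giving the taxonomy the "so what" dimension discussed above.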
Table 2. IDHEAS detailed cognitive failure modes
Failure of Detection (D):
* D1. Fail to establish the correct mental model or to initiate detection
* D2. Fail to select, identify, or attend to sources of information
* D3. Incorrectly perceive or classify information
* D4. Fail to verify perceived information
* D5. Fail to retain, record, or communicate the acquired information
Failure of Understanding (U):
* U1. Fail to assess/select data
* U2. Fail to select/adapt/develop the mental model
* U3. Fail to integrate data with the mental model
* U4. Fail to verify and revise the outcome of understanding
* U5. Fail to export the outcome
Failure of Decisionmaking (DM):
* DM1. Fail to adapt the infrastructure of decisionmaking
* DM2. Fail to manage the goals and decision criteria
* DM3. Fail to acquire and select data for decisionmaking
* DM4. Fail to make a decision (judgment, strategies, plans)
* DM5. Fail to simulate or evaluate the decision or plan
* DM6. Fail to communicate and authorize the decision
Failure of Action Execution (E):
* E1. Fail to assess the action plan and criteria
* E2. Fail to develop or modify action scripts
* E3. Fail to prepare or adapt the infrastructure for action implementation
* E4. Fail to implement action scripts
* E5. Fail to verify and adjust execution outcomes
Failure of Interteam Coordination (T):
* T1. Fail to establish or adapt the teamwork infrastructure
* T2. Fail to manage information
* T3. Fail to maintain shared situational awareness
* T4. Fail to manage resources
* T5. Fail to plan interteam collaborative activities
* T6. Fail to implement decisions and commands
* T7. Fail to verify, modify, and control the implementation
: 4. Concluding Remarks
The target paper, "The Failure to Grasp Automation Failure," makes several important contributions toward moving the field of automation research forward. The present commentary is from a regulatory perspective and is particularly focused on the impact for the nuclear domain. The target paper addressed a much richer set of operational challenges through the analysis of automation failure events than can be accomplished in most laboratory experiments concerning human-automation interaction. The framework used builds upon traditional human information processing models, resulting in a more integrated approach. By focusing on the causes and mechanisms of automation failure, the proposed initial taxonomy not only extends human-automation research for improved automation, but also represents a potentially successful integration of the interdisciplinary approaches for identifying automation failures, including DI&C, HFE, and HRA. The introduction of this taxonomy is a clear step toward addressing the challenge of integrating these interdisciplinary intersections by providing a common language. We suggest several areas where the target paper authors may consider moving forward in the development of the taxonomy, such as the incorporation of cognitive failure modes, and we propose several use cases for the taxonomy. The taxonomy may be used to guide experimental research aimed at how different automation features affect performance in complex operational contexts. System designers may use the framework to anticipate and identify systemic automation failures. Likewise, after proper evaluation, regulators may use a similar taxonomy as a checklist to evaluate whether a particular design is vulnerable to systemic automation failures. Overall, the analysis and proposed taxonomy could benefit human-automation research, particularly through the focused scope on complex operational contexts like the nuclear domain.
: 5. References | |||
[1] U.S. NUCLEAR REGULATORY COMMISSION, Final Safety Evaluation Report Related to Certification of the AP1000 Standard Design NUREG -1793, Supplement | |||
: 2. U.S. Nuclear Regulatory Commission, Washington, DC, 2011. | : 2. U.S. Nuclear Regulatory Commission, Washington, DC, 2011. | ||
[2] | |||
[3] | [2] U.S. NUCLEAR REGULATORY COMMISSION, 2020. Standard Design Approval for the Nuscale Power Plant Based on the Nuscale Standard Plant Design Certification Application, ML20247J564, U.S. Nuclear Regulatory Commission, Washington DC, 2020. | ||
[3] U.S. NUCLEAR REGULATORY COMMISSION News: NRC Authorizes Vogtle Unit 3 Fuel Loading and Operation. ML22215A210. U.S. Nuclear Regulatory Commission, Washington, DC, August 3, 2022. | |||
[4] U.S. NUCLEAR REGULATORY COMMISSION News: NRC Authorizes Vogtle Unit 4 Fuel Loading and Operation, U.S. Nuclear Regulatory Commission, Washington, DC, July 28, 2023. | [4] U.S. NUCLEAR REGULATORY COMMISSION News: NRC Authorizes Vogtle Unit 4 Fuel Loading and Operation, U.S. Nuclear Regulatory Commission, Washington, DC, July 28, 2023. | ||
[5] Georgia Power, https://www.georgiapower.com/company/news-center/2023-articles/vogtle-unit-3-goes-into-operation.html. July 31, 2023. | [5] Georgia Power, https://www.georgiapower.com/company/news-center/2023-articles/vogtle-unit-3-goes-into-operation.html. July 31, 2023. | ||
[6] U.S. NUCLEAR REGULATORY COMMISSION, Digital Instrumentation and Controls Licensing Process Interim Staff Guidance, DI&C-ISG-06, Revision 2. U.S. Nuclear Regulatory Commission. Washington, DC, 2020. | |||
[6] U.S. NUCLEAR REGULATORY COMMISSION, Digital Instrumentation and Controls Licensing Process Interim Staff Guidance, DI&C -ISG-06, Revision 2. U.S. Nuclear Regulatory Commission. Washington, DC, 2020. | |||
[7] U.S. NUCLEAR REGULATORY COMMISSION, Human Factors Engineering Program Review Model, NUREG-0711, Rev. 3, US Nuclear Regulatory Commission, Washington, DC, 2012. | [7] U.S. NUCLEAR REGULATORY COMMISSION, Human Factors Engineering Program Review Model, NUREG-0711, Rev. 3, US Nuclear Regulatory Commission, Washington, DC, 2012. | ||
[8] U.S. NUCLEAR REGULATORY COMMISSION, Guidance for the Review of Changes to Human Actions, NUREG-1764, Revision 1, US Nuclear Regulatory Commission, Washington, DC, 2007 | |||
[9] U.S. NUCLEAR REGULATORY COMMISSION, Proposed Rule: Risk-Informed, Technology-Inclusive Regulatory Framework for Advanced Reactors (RIN 3150-AK31). | [8] U.S. NUCLEAR REGULATORY COMMISSION, Guidance for the Review of Changes to Human Actions, NUREG -1764, Revision 1, US Nuclear Regulatory Commission, Washington, DC, 2007 | ||
[10] U.S. NUCLEAR REGULATORY COMMISSION, Predecisional - Documents to Support Part 53 - Development of Scalable Human Factors Engineering Review Plans: Draft Interim Staff Guidance. DRO-ISG-2023-03. (2022). 10/18-19/2022 ACRS | |||
[9] U.S. NUCLEAR REGULATORY COMMISSION, Proposed Rule: Risk -Informed, Technology-Inclusive Regulatory Framework for Advanced Reactors (RIN 3150-AK31). SECY-23-0021: ML21162A093, U.S. Nuclear Regulatory Commission. Washington, DC, 2023. | |||
[10] U.S. NUCLEAR REGULATORY COMMISSION, Predecisional - Documents to Support Part 53 - Development of Scalable Human Factors Engineering Review Plans: Draft Interim Staff Guidance. DRO -ISG-2023-03. (2022). 10/18-19/2022 ACRS Public Meeting, ML22272A051, U.S. Nuclear Regulatory Commission. Washington, DC, 2022. | |||
[11] G. Skraaning Jr., G. Jamieson, D. Armando, Q. Guanoluisa. Future Challenges in Human-Automation Interaction: Technology Trends and Operational Experiences from Other Industries., Halden Working Report HWR-1308. OECD Halden Reactor Project, Halden, Norway, 2020. | [11] G. Skraaning Jr., G. Jamieson, D. Armando, Q. Guanoluisa. Future Challenges in Human-Automation Interaction: Technology Trends and Operational Experiences from Other Industries., Halden Working Report HWR-1308. OECD Halden Reactor Project, Halden, Norway, 2020. | ||
Simulator Study, Halden Working Report HWR-1198, OECD Halden Reactor Project, Halden, Norway, 2017. | [12] Claire Taylor, Michael Hildebrandt, Robert McDonald, Niav Hughes. Operator Response to Failures of a Computerised Procedure System: Results from a Training Simulator Study, Halden Working Report HWR-1198, OECD Halden Reactor Project, Halden, Norway, 2017. | ||
[13] Hancock, P. A., John D. Lee, and John W. Senders. "Attribution errors by people and intelligent machines." Human factors (2021): 00187208211036323. | [13] Hancock, P. A., John D. Lee, and John W. Senders. "Attribution errors by people and intelligent machines." Human factors (2021): 00187208211036323. | ||
[14] G. Skraaning Jr, G.,Jamieson, The Failure to Grasp Automation Failure. Journal of Cognitive Engineering and Decision Making. | [14] G. Skraaning Jr, G.,Jamieson, The Failure to Grasp Automation Failure. Journal of Cognitive Engineering and Decision Making. | ||
https://doi.org/10.1177/15553434231189375, 2023. | https://doi.org/10.1177/15553434231189375, 2023. | ||
[15] J. Xing, N. H. Green, Failure Modes In Human-Automation Integration. The Enlarged Halden Program Review Group (EHPRG) meeting, OECD Halden Human-Technology-Organization (MTO) Project, Halden, Norway, 2023. | [15] J. Xing, N. H. Green, Failure Modes In Human-Automation Integration. The Enlarged Halden Program Review Group (EHPRG) meeting, OECD Halden Human-Technology-Organization (MTO) Project, Halden, Norway, 2023. | ||
[16] U.S. NUCLEAR REGULATORY COMMISSION, Boeing 737 Crashes: Lessons Learned for NRC Digital Instrumentation and Controls Evaluation Process. U.S. | [16] U.S. NUCLEAR REGULATORY COMMISSION, Boeing 737 Crashes: Lessons Learned for NRC Digital Instrumentation and Controls Evaluation Process. U.S. | ||
Nuclear Regulatory Commission, Washington, DC (September 22, 2022). | Nuclear Regulatory Commission, Washington, DC (September 22, 2022). | ||
[17] J. Xing, Y. J. Chang, and J. DeJesus, The General Methodology of an Integrated Human Event Analysis System (IDHEAS-G) U.S. Nuclear Regulatory Commission, NUREG-2198 (ADAMS Accession No. ML19235A161), 2019. | [17] J. Xing, Y. J. Chang, and J. DeJesus, The General Methodology of an Integrated Human Event Analysis System (IDHEAS-G) U.S. Nuclear Regulatory Commission, NUREG-2198 (ADAMS Accession No. ML19235A161), 2019. | ||
[18] J. Xing, Y. J. Chang, and J. DeJesus, "Integrated Human Event Analysis System for Event and Condition Assessment (IDHEAS-ECA)". U.S. Nuclear Regulatory Commission, NUREG-2256, ADAMS Accession Number: ML22165A282, 2022.}} | [18] J. Xing, Y. J. Chang, and J. DeJesus, "Integrated Human Event Analysis System for Event and Condition Assessment (IDHEAS-ECA)". U.S. Nuclear Regulatory Commission, NUREG-2256, ADAMS Accession Number: ML22165A282, 2022.}} |
ML23355A249
Issue date: 12/18/2023
From: Niav Hughes, Jing Xing (NRC/RES/DRA/HFRB)
A regulatory perspective: Have we done enough on grasping automation failure?
Jing Xing, Niav Hughes Green
U.S. Nuclear Regulatory Commission
Jing.xing@nrc.gov, Niav.Hughes@nrc.gov
Abstract: This paper responds to Skraaning & Jamieson's target paper, The Failure to Grasp Automation Failure. We acknowledge that the target paper made several important contributions to automation research in the human factors community: i) it analyzed automation failure events in complex operational systems, in contrast to the vast majority of laboratory research on human-automation interaction; ii) the analysis was performed with a human performance framework that goes beyond the traditional human information processing models; iii) the analysis focuses on the causes and mechanisms of automation failure, and the paper presented an initial taxonomy of automation failure, expanding the themes of human-automation research beyond enhancements to automation implementation; iv) the analysis and taxonomy demonstrate the integration of approaches to grasping automation failures from system instrumentation and controls, human factors engineering, and human reliability analysis. We present an overview of the regulatory framework related to the use of automation in nuclear power plants and examine whether the framework elements adequately address the failure to grasp automation failure, using the taxonomy in the target paper. Overall, we believe that the target paper could enhance the consideration of potential automation failures in the design and regulatory review processes of automation technologies.
- 1. Introduction
This paper responds to the Skraaning & Jamieson target paper from the perspective of regulatory applications. The authors of this paper have worked at the U.S. Nuclear Regulatory Commission (NRC), Office of Nuclear Regulatory Research, for nearly two decades; the content of this paper represents the authors' technical opinions, not the NRC's. Both have research backgrounds in cognitive science and human factors engineering, and both lead the development of regulatory guidance and methods for reviewing human factors engineering (HFE) and human reliability analysis (HRA) in the design of new technologies in nuclear power plants (NPPs). Such regulatory guidance and methods require a technical basis in state-of-the-art research, and studies of human-automation interaction in complex process control can address the challenges in regulatory activities. The target paper presents results that can enhance the technical basis for reviewing automation technologies in NPPs. Before discussing the target paper, we provide a landscape of the NRC's regulatory activities related to the emergent use of automation.
Advanced nuclear reactor technologies present new challenges. High-level automation is expected to be prevalent in advanced NPPs and in the modernization of existing NPP control rooms. The NRC has approved designs proposing higher levels of automation, including the Westinghouse AP1000 [1] and NuScale [2]. Two AP1000 units are authorized for operation by the NRC [3, 4, 5]; each unit is operated from a nearly fully digital control room. Meanwhile, modernization activities also bring implementations of control room automation. For example, modifications to traditional plants in the U.S. seek to employ DI&C in safety systems. Knowledge about the implications of automation implementations for operator performance will enhance the technical basis for the NRC's regulatory activities.
Modernization efforts involve using digital instrumentation and controls (DI&C), as well as automating operator manual actions. Novel elements of a design will likely include more advanced automation and will thus be targeted for HFE review to determine whether the applicant has reasonably assured the effective integration of automation and operators and that the design supports safe operations. The NRC uses the guidance DI&C-ISG-06 [6] to review license amendment requests associated with safety-related DI&C equipment modifications. For modifications that may involve HFE considerations, an HFE safety evaluation should be performed in accordance with NUREG-0711 [7], Human Factors Engineering Program Review Model, and NUREG-1764 [8], Guidance for the Review of Changes to Human Actions. More recently, the NRC staff developed the draft guidance Development of Scalable Human Factors Engineering Review Plans (DRO-ISG-2023-03) [10], under the Risk-Informed, Technology-Inclusive Regulatory Framework [9], for HFE review of advanced reactors. The guidance provides a risk-informed and performance-based process to identify the most safety-significant systems, systems with likely human factors challenges, and novel elements of the design.
Research has been conducted on human-automation systems for NPP control rooms. For example, the Halden Human-Technology-Organization (Halden HTO) project at the Institute for Energy Technology (IFE) consolidated its two decades of research studying human-automation interaction in NPP simulators [11]. Several organizations have reported human performance studies of computerized procedures with embedded automation functionality [12]. Operational experience review has documented many DI&C and automation events in NPPs; most DI&C events involve issues in HFE considerations. There is also research from the automated vehicle domain on mitigating attribution errors that is relevant to the design of automated and autonomous nuclear power plants [13]. The target paper by Skraaning and Jamieson, The Failure to Grasp Automation Failure [14], reviewed recent aviation accidents involving automation failures and proposed an initial taxonomy of automation failure and automation-related human performance challenges. Xing and Green [15] reported that deficiencies of human-automation integration led to automation failure events in several ways: i) the automation system worked as expected, but deficiencies in human-automation interaction led to human failures; ii) automation failures led to, or aggravated, human-automation integration deficiencies, which then led to humans failing to identify or recover from automation failures; iii) a DI&C element behaved unusually or deviated from operators' understanding, thereby causing or aggravating human-automation integration deficiencies that then led to human failures. Taken together, these different sources of information can provide an understanding of automation reliability and of how that reliability impacts operators' use of automation.
In 2022, an interdisciplinary team of NRC staff working in DI&C, HFE, and risk analysis systematically evaluated the findings from investigative reports of the Boeing 737 crashes [16]. The team examined the recommendations in these investigative reports for their potential implementation in the NRC's DI&C regulatory process. The team recommended four areas of focus to enhance DI&C licensing and regulatory oversight, two of which address HFE in DI&C reviews:
- The NRC staff should continue to improve integration and communication among DI&C technical reviews, HFE reviews, and subsequent inspection oversight for new or significantly different applications from conception to installation.
- The NRC staff should develop guidance for assessing systems engineering approaches for DI&C design and human factors life-cycle evaluation, which are important for ensuring that approved DI&C designs are appropriately integrated to maintain safety functionality.
These recommendations underscore the importance of effective communication and integration between HFE and DI&C in the review of advanced reactor technologies.
- 2. General comments on the target paper from regulatory application perspectives
We assert that the target paper made several important contributions to human-automation research:
i) The authors analyzed automation failure events in complex operational systems, in contrast to most laboratory research on human-automation interaction. Previous studies and our own experience developing human factors engineering guidance demonstrate that results from human-automation-interaction laboratory experiments in simple multi-task contexts generally do not predict human performance with automation in complex process control systems [16].
ii) The authors analyzed automation failure accidents with a human performance framework that goes beyond traditional human information processing models. The human performance framework considers the broad context of complex cognitive tasks, whereas human information processing models typically represent only a slice of the overall cognitive processes involved in complex operational tasks.
iii) The analysis focuses on the causes and mechanisms of automation failure; the taxonomy of automation failure expands the themes of human-automation research, which have largely focused on making automation work better. Laboratory experiments commonly study the effects of human-automation interaction characteristics, such as level or type of automation, on measures such as workload, situational awareness, and trust. However, there has been little evidence that those characteristics and measures can predict human performance, in particular the reliability of humans' ability to grasp automation failure.
iv) The taxonomy incorporates findings on automation systems, design logic and components, DI&C, human-automation integration, and human and organizational factors. It calls attention to the fact that grasping automation failure requires integrated approaches across all these aspects. This echoes the recommendations made by the NRC team on the integration of DI&C, HFE, and risk analysis.
v) The taxonomy could inspire experimental research scenarios that are more industry relevant. It could serve as a roadmap for the design of future domain-specific experimental research aimed at how different automation features influence performance in complex operational contexts. System designers could use the framework to anticipate and identify systemic automation failures. Likewise, after proper evaluation, regulators may consider using a similar taxonomy as a checklist to evaluate whether a particular design is vulnerable to systemic automation failures.
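As a concrete illustration of the checklist use case, the sketch below encodes a few taxonomy items from the target paper in a simple lookup structure. The column names and example items come from the taxonomy as quoted in this commentary; the data structure and helper function are our own illustration, not an NRC review tool.

```python
# Hypothetical sketch: taxonomy columns as a design-review checklist.
# Column names and items follow the target paper; the helper is illustrative.

TAXONOMY_CHECKLIST = {
    "Elementary Automation Failures": [
        "Automatic functions are missing or lost",
        "Loss of power supply to automation",
    ],
    "Systematic Automation Failures": [
        "Automation works as intended but operates outside the design basis",
        "Automatic systems work in parallel but compromise each other",
    ],
    "Human-Automation Interaction Breakdowns": [
        "Automation provides misleading support to operators",
        "Critical operator actions are unsuitably blocked by automation",
    ],
}

def unaddressed_items(addressed: set) -> dict:
    """Return, per taxonomy column, the checklist items a design review
    has not yet shown to be addressed."""
    gaps = {}
    for column, items in TAXONOMY_CHECKLIST.items():
        missing = [item for item in items if item not in addressed]
        if missing:
            gaps[column] = missing
    return gaps

# Example: a review that has so far considered only elementary failures.
gaps = unaddressed_items({
    "Automatic functions are missing or lost",
    "Loss of power supply to automation",
})
```

A regulator-facing version would of course carry the full item pool and link each item to the relevant review guidance.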
- 3. Potential to enhance regulatory guidance for reviewing new NPP technologies
3.1 A Framework for Integrating DI&C and Human Factors Engineering
Regardless of the vast landscape of advanced technologies, NPP operations share critical functional aspects in common:
- Operators work within complex control systems;
- Operators are in control of the systems, although the systems can run at high or full automation modes;
- Operators' tasks involve monitoring, situation assessment, decision-making/planning, manipulation/control, and teamwork; use of higher levels of automation may result in more monitoring tasks for the human operator and fewer manipulation and control tasks.
- Operator responses are procedure-based.
Xing and Green [15] generalize the NRC's regulatory and licensing activities in DI&C, HFE, and HRA into the framework depicted in Figure 1. This framework represents how human-automation integration works: i) the DI&C elements achieve the functions of automation systems, and behaviors of DI&C elements impact elements of human-automation integration; ii) human-automation integration should ensure that automation functions do not cause human errors and that failures of DI&C elements do not propagate to human failures; iii) the human cognition system should ensure that human operators are capable of identifying and recovering from automation failures. In the diagram, the box on the left represents the DI&C elements that constitute an automation system, the box in the middle represents the HFE elements that support human-automation interaction, and the box on the right represents the elements of human cognitive task performance.
Figure 1. A framework for integrating DI&C systems and HFE
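The containment logic of the framework (items ii and iii above) can be sketched minimally as follows; this is our own illustrative simplification, not an NRC analysis method, and the function name is hypothetical.

```python
# Minimal sketch of failure propagation through the Figure 1 framework:
# a DI&C element failure becomes a human failure only if the
# human-automation integration (HAI) layer fails to contain it (item ii)
# and the operator's cognitive performance fails to recover from it
# (item iii). Illustrative only.

def propagates_to_human_failure(dic_failed: bool,
                                hai_contains: bool,
                                operator_recovers: bool) -> bool:
    """True when a DI&C failure propagates all the way to a human failure."""
    if not dic_failed:
        return False              # nothing to propagate
    if hai_contains:
        return False              # HAI layer stops the propagation (item ii)
    return not operator_recovers  # cognition is the last barrier (item iii)
```

The point of the sketch is that two independent layers, HFE design and operator cognition, stand between a DI&C failure and a human failure, which is why the taxonomy's integrated view matters.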
The taxonomy of the target paper is organized by automation-induced human performance challenges in four columns: 1) Elementary Automation Failures, 2) Systematic Automation Failures, 3) Human-Automation Interaction Breakdowns, and 4) Negative Human Performance Outcomes that may occur in the presence of automation and may result from a lack of adherence to human factors design principles. Next, we discuss how the target paper taxonomy may enhance the elements in the framework presented in Figure 1.
3.2 DI&C Systems
The NRC staff conducts DI&C design reviews to ensure DI&C system safety for NPP operations. The DI&C design review identifies potential design hazards and analyzes system or component failure modes. The first column of the taxonomy contains Elementary Automation Failures, such as "Automatic functions are missing or lost" and "Loss of power supply to automation," corresponding to design hazards of automation systems. The listed hazards can enhance existing hazard identification methods that were not specifically developed for automation systems.
The second column of the taxonomy contains Systematic Automation Failures. Examples are "Automation works as intended but operates outside the design basis" and "Automatic systems work in parallel but compromise each other." This list can enrich our understanding of potential failure modes in the DI&C elements of an automation system and help identify potential automation failures that induce human performance challenges.
3.3 Human-Automation Integration
The NRC staff review human-automation integration using guidance contained in the Human Factors Engineering Program Review Model documented in NUREG-0711 [7]. The HFE model includes twelve elements, some of which are shown in the middle box of Figure 1. The element Functional Requirements Analysis and Function Allocation is most relevant to automation design. NUREG-0711 states, "The purpose of this element is to verify that the applicant defined those functions that must be carried out to satisfy the plant's safety goals and that it assigned responsibilities for those functions (function allocation) to personnel and automation in a way that takes advantage of human strengths and avoids human limitations." The currently proposed 10 CFR Part 53 rule, § 53.440(n)(4), states, "A functional requirements analysis and function allocation must be used to ensure that plant design features address how safety functions and functional safety criteria are satisfied, and how the safety functions will be assigned to appropriate combinations of human action, automation, active safety features, passive safety features, or inherent safety characteristics." [9]
The third column of the taxonomy presents a list of Human-Automation Interaction Breakdowns. Examples are "Automation provides misleading support to operators" and "Critical operator actions are unsuitably blocked by automation." Essentially, all the listed breakdowns are deficiencies in functional requirements analysis and function allocation. The HFE review can benefit from incorporating the breakdowns as a checklist of questions to ask.
Similarly, the taxonomy can also help the HFE review of the element Human Factors Verification and Validation, which states that "Verification and validation (V&V) evaluations comprehensively determine that the final HFE design conforms to accepted design principles and enables personnel to successfully and safely perform their tasks to achieve operational goals." When present in a design, the listed breakdowns represent design deficiencies that should be captured in V&V. The current HFE process described in NUREG-0711 focuses on ensuring that state-of-the-art human factors principles are incorporated into the design and implementation, which is verified during V&V. In contrast, grasping automation failures requires risk-focused consideration of failure modes and failure causes.
3.4 Human Reliability Analysis
The fourth column of the taxonomy lists examples of Human and Organizational Slips/Misconceptions, such as "Faulty operator programming of automation" and "Inadvertent activation/deactivation of automation," which are not specifically addressed in our HFE review process. They are, however, addressed in HRA. The NRC's new HRA methodology, the Integrated Human Event Analysis System (IDHEAS) [17, 18], uses 20 performance influencing factors (PIFs), as shown in Table 1, to model human performance.
Each PIF is represented by a set of attributes, which describe the ways the PIF challenges human performance and increases the likelihood of human errors. We verified that all the listed slips/misconceptions are included in IDHEAS PIF attributes. Since the target paper indicates that the list is an initial subset of a larger pool, we recommend that future work consider using the IDHEAS PIF structure to organize human and organizational slips/misconceptions.
Table 1. Performance Influencing Factors in the IDHEAS Method

Environment and situation:
- Workplace accessibility and habitability
- Workplace visibility
- Noise in workplace and communication pathways
- Cold/heat/humidity
- Resistance to physical movement

System:
- System and I&C transparency to personnel
- Human-system interfaces
- Equipment and tools

Personnel:
- Staffing
- Procedures, guidelines, and instructions
- Training
- Teamwork and organizational factors
- Work processes

Task situation:
- Information availability and reliability
- Scenario familiarity
- Multi-tasking, interruption, and distraction
- Task complexity
- Mental fatigue
- Time pressure and stress
- Physical demands
Finally, we see that one piece missing from the taxonomy is the "so what": a description of the human failures that result from the challenges or misconceptions. We recommend that the target paper authors consider incorporating IDHEAS cognitive failure modes to represent human failures in using automation. The cognitive failure modes represent failures of macrocognitive functions, which are the basic cognitive elements needed to achieve complex operational tasks. Because the cognitive failure modes are human-centered, they can be used to model human failures in any automation system.
IDHEAS cognitive failure modes consist of the failures of the following five macrocognitive functions:
- Detection (D) is noticing cues or gathering information in the work environment.
- Understanding (U) is the integration of pieces of information with a person's mental model to make sense of the scenario or situation.
- Decisionmaking (DM) includes selecting strategies, planning, adapting plans, evaluating options, and making judgments on qualitative information or quantitative parameters.
- Action execution (E) is the implementation of the decision or plan to change some physical component or system.
- Interteam coordination (T) focuses on how various teams interact and collaborate on an action.
Each macrocognitive function is achieved through a set of basic cognitive processors.
IDHEAS uses the failure of the processors as detailed cognitive failure modes, as shown in Table 2. These detailed cognitive failure modes can specifically model types of human failures in human-automation integration.
Table 2. IDHEAS detailed cognitive failure modes
Failure of Detection (D):
D1. Fail to establish the correct mental model or to initiate detection
D2. Fail to select, identify, or attend to sources of information
D3. Incorrectly perceive or classify information
D4. Fail to verify perceived information
D5. Fail to retain, record, or communicate the acquired information

Failure of Understanding (U):
U1. Fail to assess/select data
U2. Fail to select/adapt/develop the mental model
U3. Fail to integrate data with the mental model
U4. Fail to verify and revise the outcome of understanding
U5. Fail to export the outcome

Failure of Decisionmaking (DM):
DM1. Fail to adapt the infrastructure of decisionmaking
DM2. Fail to manage the goals and decision criteria
DM3. Fail to acquire and select data for decisionmaking
DM4. Fail to make decision (judgment, strategies, plans)
DM5. Fail to simulate or evaluate the decision or plan
DM6. Fail to communicate and authorize the decision

Failure of Action Execution (E):
E1. Fail to assess action plan and criteria
E2. Fail to develop or modify action scripts
E3. Fail to prepare or adapt infrastructure for action implementation
E4. Fail to implement action scripts
E5. Fail to verify and adjust execution outcomes

Failure of Interteam Coordination (T):
T1. Fail to establish or adapt teamwork infrastructure
T2. Fail to manage information
T3. Fail to maintain shared situational awareness
T4. Fail to manage resources
T5. Fail to plan interteam collaborative activities
T6. Fail to implement decisions and commands
T7. Fail to verify, modify, and control the implementation
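To illustrate how the detailed cognitive failure modes could be used to model human failures in human-automation integration, the sketch below organizes the five macrocognitive functions with a subset of the Table 2 failure modes for lookup. The structure and function name are our illustration, not part of the IDHEAS method.

```python
# Sketch: IDHEAS macrocognitive functions with a subset of the detailed
# cognitive failure modes from Table 2, organized for code-to-description
# lookup. Illustrative structure only.

FAILURE_MODES = {
    "D": ("Detection", {
        "D1": "Fail to establish the correct mental model or to initiate detection",
        "D3": "Incorrectly perceive or classify information",
    }),
    "U": ("Understanding", {
        "U3": "Fail to integrate data with the mental model",
    }),
    "DM": ("Decisionmaking", {
        "DM4": "Fail to make decision (judgment, strategies, plans)",
    }),
    "E": ("Action execution", {
        "E4": "Fail to implement action scripts",
    }),
    "T": ("Interteam coordination", {
        "T3": "Fail to maintain shared situational awareness",
    }),
}

def describe(code: str) -> str:
    """Map a failure-mode code such as 'DM4' to 'function: description'."""
    prefix = "".join(ch for ch in code if ch.isalpha())
    function, modes = FAILURE_MODES[prefix]
    return f"{function}: {modes[code]}"
```

An event analyst could then tag an automation-failure narrative with codes such as "D3" or "DM4" and aggregate the tags by macrocognitive function.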
- 4. Concluding Remarks
The target paper, The Failure to Grasp Automation Failure, makes several important contributions toward moving the field of automation research forward. The present commentary is from a regulatory perspective and particularly focused on the impact for the nuclear domain. The target paper addressed a much richer set of operational challenges through the analysis of automation failure events than can be accomplished in most laboratory experiments concerning human-automation interaction. The framework used builds upon traditional human information processing models, resulting in a more integrated approach. By focusing on the causes and mechanisms of automation failure, the proposed initial taxonomy not only extends human-automation research for improved automation; the taxonomy itself represents a potentially successful integration of the interdisciplinary approaches for identifying automation failures, including DI&C, HFE, and HRA. The introduction of this taxonomy is a clear step toward addressing the challenge of integrating these disciplines by providing a common language. We suggest several areas where the target paper authors may consider moving forward in developing the taxonomy, such as incorporating cognitive failure modes. We also propose several use cases for the taxonomy. The taxonomy may be used to guide experimental research aimed at how different automation features affect performance in complex operational contexts. System designers may use the framework to anticipate and identify systemic automation failures. Likewise, after proper evaluation, regulators may use a similar taxonomy as a checklist to evaluate whether a particular design is vulnerable to systemic automation failures. Overall, the analysis and proposed taxonomy could benefit human-automation research, particularly through their focused scope on complex operational contexts like the nuclear domain.
- 5. References
[1] U.S. NUCLEAR REGULATORY COMMISSION, Final Safety Evaluation Report Related to Certification of the AP1000 Standard Design, NUREG-1793, Supplement 2. U.S. Nuclear Regulatory Commission, Washington, DC, 2011.
[2] U.S. NUCLEAR REGULATORY COMMISSION, Standard Design Approval for the NuScale Power Plant Based on the NuScale Standard Plant Design Certification Application, ML20247J564, U.S. Nuclear Regulatory Commission, Washington, DC, 2020.
[3] U.S. NUCLEAR REGULATORY COMMISSION News: NRC Authorizes Vogtle Unit 3 Fuel Loading and Operation. ML22215A210. U.S. Nuclear Regulatory Commission, Washington, DC, August 3, 2022.
[4] U.S. NUCLEAR REGULATORY COMMISSION News: NRC Authorizes Vogtle Unit 4 Fuel Loading and Operation, U.S. Nuclear Regulatory Commission, Washington, DC, July 28, 2023.
[5] Georgia Power, https://www.georgiapower.com/company/news-center/2023-articles/vogtle-unit-3-goes-into-operation.html. July 31, 2023.
[6] U.S. NUCLEAR REGULATORY COMMISSION, Digital Instrumentation and Controls Licensing Process Interim Staff Guidance, DI&C-ISG-06, Revision 2. U.S. Nuclear Regulatory Commission, Washington, DC, 2020.
[7] U.S. NUCLEAR REGULATORY COMMISSION, Human Factors Engineering Program Review Model, NUREG-0711, Rev. 3, US Nuclear Regulatory Commission, Washington, DC, 2012.
[8] U.S. NUCLEAR REGULATORY COMMISSION, Guidance for the Review of Changes to Human Actions, NUREG-1764, Revision 1, U.S. Nuclear Regulatory Commission, Washington, DC, 2007.
[9] U.S. NUCLEAR REGULATORY COMMISSION, Proposed Rule: Risk-Informed, Technology-Inclusive Regulatory Framework for Advanced Reactors (RIN 3150-AK31). SECY-23-0021: ML21162A093, U.S. Nuclear Regulatory Commission, Washington, DC, 2023.
[10] U.S. NUCLEAR REGULATORY COMMISSION, Predecisional - Documents to Support Part 53 - Development of Scalable Human Factors Engineering Review Plans: Draft Interim Staff Guidance, DRO-ISG-2023-03. 10/18-19/2022 ACRS Public Meeting, ML22272A051, U.S. Nuclear Regulatory Commission, Washington, DC, 2022.
[11] G. Skraaning Jr., G. Jamieson, D. Armando, Q. Guanoluisa, Future Challenges in Human-Automation Interaction: Technology Trends and Operational Experiences from Other Industries. Halden Working Report HWR-1308, OECD Halden Reactor Project, Halden, Norway, 2020.
[12] Claire Taylor, Michael Hildebrandt, Robert McDonald, Niav Hughes. Operator Response to Failures of a Computerised Procedure System: Results from a Training Simulator Study, Halden Working Report HWR-1198, OECD Halden Reactor Project, Halden, Norway, 2017.
[13] P. A. Hancock, J. D. Lee, and J. W. Senders, "Attribution Errors by People and Intelligent Machines." Human Factors, 2021.
[14] G. Skraaning Jr., G. Jamieson, The Failure to Grasp Automation Failure. Journal of Cognitive Engineering and Decision Making, https://doi.org/10.1177/15553434231189375, 2023.
[15] J. Xing, N. H. Green, Failure Modes In Human-Automation Integration. The Enlarged Halden Program Review Group (EHPRG) meeting, OECD Halden Human-Technology-Organization (MTO) Project, Halden, Norway, 2023.
[16] U.S. NUCLEAR REGULATORY COMMISSION, Boeing 737 Crashes: Lessons Learned for NRC Digital Instrumentation and Controls Evaluation Process. U.S.
Nuclear Regulatory Commission, Washington, DC (September 22, 2022).
[17] J. Xing, Y. J. Chang, and J. DeJesus, The General Methodology of an Integrated Human Event Analysis System (IDHEAS-G), NUREG-2198 (ADAMS Accession No. ML19235A161), U.S. Nuclear Regulatory Commission, 2019.
[18] J. Xing, Y. J. Chang, and J. DeJesus, "Integrated Human Event Analysis System for Event and Condition Assessment (IDHEAS-ECA)". U.S. Nuclear Regulatory Commission, NUREG-2256, ADAMS Accession Number: ML22165A282, 2022.