ML19017A253
* RP-0215-10815, Concept of Operations, Revision 2, dated December 2, 2016 (ADAMS Accession No. ML16364A347) (ConOps)
* RP-0316-17616, Human Factors Engineering Task Analysis Results Summary Report, Revision 0, dated December 8, 2016 (ADAMS Accession No. ML17004A221) (TA RSR)
* RP-0316-17614, Human Factors Engineering Operating Experience Review Results Summary Report, Revision 0, dated December 7, 2016 (ADAMS Accession No. ML16364A341) (OER RSR)
* RP-0316-17615, Human Factors Engineering Functional Requirements Analysis and Function Allocation Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML16364A341) (FRA/FA RSR)
* RP-0316-17618, Human Factors Engineering Treatment of Important Human Actions Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML17004A221) (TIHA RSR)
* RP-1215-20253, Control Room Staffing Plan Validation Methodology, Revision 3, dated December 2, 2016 (ADAMS Accession No. ML16365A179) (SPV Methodology TR)
* RP-0516-49116, Control Room Staffing Plan Validation Results, Revision 1, dated December 2, 2016 (ADAMS Accession No. ML16365A190) (SPV Results TR)
* RP-0316-17617, Human Factors Engineering Staffing and Qualifications Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML17004A221) (S&Q RSR)
Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:
* Title 10 of the Code of Federal Regulations (10 CFR), Section 52.47(a)(8), as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except 10 CFR 50.34(f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
However, it was not clear to the staff when the applicant considers that the plant design is completed, and at what point in the design process the applicant plans to resolve Priority 2 HEDs. NUREG-0711, Section 11.4.4, explains that significant HEDs, including Priority 2 HEDs, should be addressed during V&V, if feasible. The staff issued RAI 9360, Question 18-41 (ADAMS Accession No. ML18180A359), to request that the applicant explain the phrase "when the plant design will be completed." In its response to RAI 9360, Question 18-41 (ADAMS Accession No. ML18172A227), the applicant provided a proposed revision to the DI IP, Section 2.0, Design Implementation Assessments, to clarify that resolution of Priority 2 HEDs will occur before the applicant's turnover of the HFE program implementation to a licensee. The staff finds this acceptable because important issues will be addressed by members of the applicant's HFE team, who are knowledgeable of the issue and its resolution. RAI 9360, Question 18-41, is a Confirmatory Item. However, Section 18.11.4.4.3, Conclusion, of this SER discusses that the staff issued RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), in part to request that the applicant explain how satisfactory resolution of these HEDs and any HEDs identified during design implementation activities will occur and be documented. RAI 9415, Question 18-46, is Open Item 18-22.
Documentation (Criterion 2.4.4(3))
Criterion 2.4.4(3) states that the applicant should document the actions taken to address each issue in the system and, if no action is required, this should be justified. The HFE PMP, Section 5.3, lists the information that is entered in the HFEITS for each issue, which includes the actions taken to address each issue (i.e., resolutions) and a justification if no action is taken. The HFE PMP, Section 5.3, states that descriptions of resolutions are sufficiently detailed to provide traceability and to support third-party review. Also, the HFE PMP, Section 5.4.7, states that HEDs may not always be resolved and that the basis for accepting an HED without change is documented.
18.2.4.1.2 Staff Assessment

The staff compared the scope of the five methods used to assess relevant sources of operating experience described in the OER RSR to the associated NUREG-0711 criteria and considered the supplemental guidance in NUREG/CR-7202, which describes challenges related to OER methodologies that are unique to SMR technologies.
The staff found that the scope described in the OER RSR was consistent with the applicable NRC guidance described above with the following exceptions:
* At a high level, the OER RSR submittal was consistent with the six bullet points of Criterion 3.4.1(1); however, it did not specifically identify some notable published examples of OER that apply to SMRs (see NUREG/CR-7202, Appendix A, Questions for SMR Applicants Organized by NUREG-0711 Element, Section A.1, Operating Experience Review). The staff issued RAI 9153, Question 18-5 (ADAMS Accession No. ML17286B066), to clarify the scope of the OER analysis with regard to certain nonnuclear industries. The response to RAI 9153, Question 18-5 (ADAMS Accession No. ML17346A971), describes the results obtained from the OER process when considering these nonnuclear technologies. The staff was able to determine that the OER process had, in fact, included the appropriate nonnuclear industries. Therefore, the staff considers RAI 9153, Question 18-5, to be resolved.
Figures 4-10 through 4-24 of the FRA/FA RSR illustrate how the FRA/FA database identifies system configurations necessary for safe operation. These screenshots show a variety of parameters necessary for the operator to understand, such as when the associated function is necessary, working, and ready for termination.
The applicant credited the design reliability assurance program in this process (see SRP Chapter 17, Quality Assurance, Section 17.4, and ER-0000-3387, NuScale Plant Functions, Revision 0).
The staff issued RAI 9381, Question 18-15 (ADAMS Accession No. ML18068A728), to clarify the nature of the safety functions compared to the critical safety functions (CSFs) necessary for other HFE review elements (such as the HSI design review element in NUREG-0711, Chapter 8, Human-System Interface Design) and for review under SRP Chapter 13, Conduct of Operations. The response to RAI 9381, Question 18-15 (ADAMS Accession No. ML18114A351), clarifies that there are three safety functions, which are equivalent to the CSFs (maintain containment integrity, control reactivity, and remove fuel assembly heat). This clarification was sufficient to close the RAI.
In June 2018, the staff conducted an audit of the FRA/FA database (ADAMS Accession No. ML18208A370). The staff confirmed that the results in the database were adequately represented by the sample of results in the FRA/FA RSR. The results reviewed were consistent with the method described in the FRA/FA RSR.
With respect to Criterion 4.4(4), the staff reviewed the sample database entries found in the FRA/FA RSR and found them to contain entries for each of the bulleted areas listed in this criterion. The structure of the database helps to ensure that the FRA/FA process will include those bulleted items of the criterion. The staff finds that the use of the FRA/FA database is an
However, the application contains little information on verification that the goals described in this criterion have been accomplished. The FRA/FA RSR, Section 5.0, Analysis of Conclusions, describes the interdisciplinary approach used in the design/analysis process. It indicates that the results are reviewed and evaluated but provides little detail as to how this is done.
Moreover, the applicant submitted the FRA/FA RSR before an NRC audit that found that several of the analyses were not yet complete.
The staff was unable to independently confirm the execution of these processes to be complete during the May 2017 audit. It is clear that the full intent of the criterion is to confirm that the final product of the FRA/FA process is of high quality. The criterion's use of the word "verify" implies that the process must be complete. In addition, the first two bullets of the criterion use the word "all," indicating that a sample of results is not adequate to satisfy this criterion. Moreover, the fact that the applicant submitted the FRA/FA RSR before the completion of FRA/FA activities suggests that any verification process that was executed was ineffective. Therefore, the staff issued RAI 9372, Question 18-14 (ADAMS Accession No. ML18068A727), to clarify what, if any, verification had taken place. The response to RAI 9372, Question 18-14 (ADAMS Accession No. ML18114A822), indicates that an interdisciplinary team verified that the high-level functions reported in the FRA/FA RSR, Table 3-1, were accurate.
The FRA/FA RSR, Section 5.0, discusses this verification. The verification of allocations was initially assessed during the SPV and will be tested during the ISV (to occur concurrently with the NRC Phase 4 review). Because this is credited as a verification activity, RAI 9372, Question 18-14, is Open Item 18-1 until after the ISV is complete and a determination has been made that the allocations are appropriate.
18.3.4.4.3 Conclusion

The staff finds that the methodologies described are consistent with the applicable criteria in NUREG-0711; therefore, the methodologies are acceptable. However, because ISV testing is credited as a means of satisfying Criterion 4.4(8), the staff should confirm that the verification of the FRA/FA is complete and accurate after the test is complete. Therefore, the applicant has not met this criterion until a final verification of the FRA/FA process is completed and confirmed.
When an HED is generated, the V&V IP, Section 5.2, describes the procedure for assessing whether the HED is an indicator of additional issues. Specifically, the applicant will use the following methods to assess the extent of condition and causal effects across HSI design features to determine whether an HED is an indicator of additional issues:
*        [[
*
*        ]]
Therefore, the staff concludes that the applicant's methodology includes procedures for evaluating whether an HED is a potential indicator of additional issues. The staff finds that the application conforms to Criterion 11.4.2.3(2).
* Bias in the sample of participants should be prevented by avoiding the use of participants who (1) are members of the design organization, (2) participated in prior evaluations, and (3) were selected for some specific characteristic, such as crews identified as good performers or more experienced.
The intent of Criteria 11.4.3.4(1), (2), and (4) is twofold: to ensure that those participating in the ISV testing are representative of those who will eventually operate the real plant, and to ensure that bias is adequately controlled so that the sample does not consist only of good performers or of personnel with some unfair advantage, such as superior knowledge of the design, either of which could bias the results.
The V&V IP, Section 4.4, describes the participants, stating that "Individual operating crews participating in the ISV may be previously licensed commercial reactor or senior reactor operators, operators with Navy nuclear experience, or design engineering staff members familiar with the NuScale Power plant design." Section 4.4 states that crew members are selected and distributed across crews with consideration for age, gender, education level, and experience. Because the applicant cited members of the design engineering staff as potential ISV participants, the staff issued RAI 9371, Question 18-24 (ADAMS Accession No. ML18077A001), to clarify how these participants are representative of the anticipated plant personnel who will interact with the HSI, how bias is prevented, and whether participants have participated in prior evaluations (e.g., SPV).
The V&V IP, Section 4.7, also states the following:
Data are analyzed for each scenario across multiple trials. The method of analysis, consistency of measure assessing performance, and criteria used to determine successful performance for a given scenario is determined by the HFE Design Team.
Although the applicant committed to analyzing data across trials, it did not provide any information on the methodology or on the criteria used to determine successful performance for a given scenario. The staff issued RAI 9399, Question 18-35 (ADAMS Accession No. ML18082B396), to ask the applicant to describe the method(s) that will be used to analyze data across trials and the criteria that will be used to determine successful performance. The applicant's supplemental response to RAI 9399, Question 18-35 (ADAMS Accession No. ML18249A421), provides both revisions to the V&V IP and specific proprietary examples of trending techniques across trials.
The applicant stated, "Data is collected from multiple sources including crew debriefs, observer debriefs, NASA TLX questionnaires, Situational Awareness questionnaires, and management observations. The data is collected and added to a database where an HFE Subject Matter Expert (SME) and an Operations SME bin and code the performance data and then independently identify significant issues and trends within the data. This analysis compares and contrasts data sources, data across crews, data across trials, and data across scenarios. The HFE and Operations SMEs then collaborate on trending results and Human Engineering Discrepancy (HED) identification."
Additionally, in the response to RAI 9399, Question 18-35 (ADAMS Accession No. ML18137A584), the applicant identified the specific criteria used to determine successful performance for a given scenario. Because the applicant intends to evaluate the data collected from all scenario trials, the staff finds the applicant's data analysis methodology acceptable for analyzing data from multiple trials and for assessing the success of each scenario, which is consistent with Criterion 11.4.3.7(2). RAI 9399, Question 18-35, is being tracked as a Confirmatory Item pending the incorporation of the changes into the next revision of the DCA Part 2.


DCA - Chapter 18 SE with Open Items (Public)
ML19017A253
Person / Time
Site: NuScale
Issue date: 01/02/2019
From: Prosanta Chowdhury
NRC/NRO/DLSE/LB1
To:
Chowdhury P/NRO/1647
References
Chapter 18
Download: ML19017A253 (165)


Text

18 HUMAN FACTORS ENGINEERING

This chapter of the safety evaluation report (SER) documents the U.S. Nuclear Regulatory Commission (NRC or Commission) staff's review of Chapter 18, Human Factors Engineering, of the NuScale Power, LLC (hereinafter referred to as "the applicant"), Design Certification Application (DCA), Part 2, Final Safety Analysis Report (FSAR), Revision 1 (Agencywide Documents Access and Management System (ADAMS) Accession No. ML18086A191).

The staff reviewed the human factors engineering (HFE) of the control room design in accordance with NUREG-0800, Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition (SRP), Chapter 18, Human Factors Engineering, Revision 3, issued December 2016. Consistent with SRP Chapter 18, the staff compared the application to the relevant1 review criteria in NUREG-0711, Human Factors Engineering Program Review Model, Revision 3, issued November 2012 (ADAMS Accession No. ML12324A013), in order to gain reasonable assurance that the application complies with the HFE-related regulations cited under the Regulatory Basis subsections of this SER.

SRP Chapter 18 identifies 12 areas of review for the successful integration of human characteristics and capabilities into nuclear power plant design. These areas of review correspond to the 12 elements of an HFE program identified in NUREG-0711:

  • HFE program management
  • operating experience review
  • functional requirements analysis (FRA) and function allocation (FA)
  • task analysis
  • staffing and qualifications (S&Q)
  • treatment of important human actions (IHAs)
  • human-system interface (HSI) design
  • procedure development
  • training program development
  • human factors verification and validation (V&V)
  • design implementation
  • human performance monitoring

The staff has organized Chapter 18 of this SER to align with the 12 elements.

Additionally, in DCA Part 7, Exemptions, Section 6, 10 CFR 50.54(m), Control Room Staffing, the applicant requested that minimum licensed operator staffing requirements specific to the NuScale standard plant design be adopted as requirements applicable to licensees referencing the NuScale standard plant design certification (DC) in lieu of the requirements stated in Title 10 of the Code of Federal Regulations (10 CFR), Section 50.54(m). The staffing requirements in 10 CFR 50.54(m) are applicable to facility licensees; they are not applicable to applicants for a design certification. Therefore, although the proposed licensed operator staffing requirements for the NuScale standard plant design are included in DCA Part 7, the applicant does not propose an exemption from the requirements in 10 CFR 50.54(m). NUREG-1791, Guidelines for Assessing Exemption Requests from the Nuclear Power Plant Licensed Operating Staff Requirements Specified in 10 CFR 50.54(m), issued July 2005 (ADAMS Accession No. ML052080125), contains guidance the staff uses to determine whether an applicant's staffing proposal provides adequate assurance that public health and safety will be maintained at a level that is comparable to that afforded by compliance with the current regulations. Section 18.5, Staffing and Qualifications, of this SER includes the staff's evaluation of the proposed design-specific staffing requirement.

1 Not all of the review criteria in NUREG-0711 are relevant to a DCA. For example, some criteria are relevant only to licensees that are modifying a control room design at an operating reactor. Those criteria are identified in NUREG-0711 and are not included in this SER.

18.1 Human Factors Engineering Program Management

Introduction

The objective of this element is to verify that the applicant has an HFE design team with the responsibility, authority, placement within the organization, and composition to reasonably assure that the plant design meets the commitment to HFE. NUREG-0711, Chapter 2, HFE Program Management, Section 2.3, Applicant Products and Submittals, states that the applicant should provide an implementation plan (IP) for HFE program management; this element has no results summary report (RSR).

Summary of Application

DCA Part 2 Tier 1: DCA Part 2 Tier 1, Section 3.15, Human Factors Engineering, Revision 1, contains the Tier 1 information associated with Design Description (System Description, Design Commitments) and Inspections, Tests, Analyses, and Acceptance Criteria (ITAAC) (ADAMS Accession No. ML18086A146).

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.1, Human Factors Engineering Program Management.

ITAAC: There are no inspections, tests, analyses, and acceptance criteria (ITAAC) associated with this HFE element.

Technical Specifications: There are no technical specifications (TS) associated with this HFE element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: The applicant submitted the following technical reports (TRs) in support of the HFE design:

  • RP-0914-8534, Human Factors Engineering Program Management Plan, Revision 3, dated December 2, 2016 (ADAMS Accession No. ML17004A221) (HFE PMP)
  • RP-0215-10815, Concept of Operations, Revision 2, dated December 2, 2016 (ADAMS Accession No. ML16364A347) (ConOps)
  • RP-0316-17616, Human Factors Engineering Task Analysis Results Summary Report, Revision 0, dated December 8, 2016 (ADAMS Accession No. ML17004A221) (TA RSR)
  • RP-0316-17614, Human Factors Engineering Operating Experience Review Results Summary Report, Revision 0, dated December 7, 2016 (ADAMS Accession No. ML16364A341) (OER RSR)
  • RP-0316-17615, Human Factors Engineering Functional Requirements Analysis and Function Allocation Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML16364A341) (FRA/FA RSR)
  • RP-0316-17618, Human Factors Engineering Treatment of Important Human Actions Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML17004A221) (TIHA RSR)
  • RP-0316-17619, Human-System Interface Design Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML16364A347) (HSI Design RSR)
  • RP-0914-8543, Human Factors Verification and Validation Implementation Plan, Revision 4, dated November 30, 2017 (ADAMS Accession No. ML18004B973) (V&V IP)
  • RP-0914-8544, Human Factors Engineering Design Implementation Implementation Plan, Revision 1, dated September 16, 2016 (ADAMS Accession No. ML16364A347) (DI IP)

  • RP-1215-20253, Control Room Staffing Plan Validation Methodology, Revision 3, dated December 2, 2016 (ADAMS Accession No. ML16365A179) (SPV Methodology TR)
  • RP-0516-49116, Control Room Staffing Plan Validation Results, Revision 1, dated December 2, 2016 (ADAMS Accession No. ML16365A190) (SPV Results TR)
  • RP-0316-17617, Human Factors Engineering Staffing and Qualifications Results Summary Report, Revision 0, dated December 2, 2016 (ADAMS Accession No. ML17004A221) (S&Q RSR)

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • Title 10 of the Code of Federal Regulations (10 CFR), Section 52.47(a)(8), as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except 10 CFR 50.34(f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii), to provide, for Commission review, a control room design that reflects state-of-the-art human factor principles before committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections.

Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below).

(NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)

  • NUREG-0711, Revision 3, Section 2.4, Review Criteria

The following documents also provide additional criteria or guidance in support of the SRP acceptance criteria to meet the above requirements:
  • NUREG-0696, Functional Criteria for Emergency Response Facilities, issued February 1981
  • NUREG-0700, Human-System Interface Design Review Guidelines, Revision 2, issued May 2002

Technical Evaluation

General Human Factors Engineering Program Goals and Scope (Criteria 2.4.1(1)-(7))

NUREG-0711, Section 2.4.1, General HFE Program Goals and Scope, includes seven criteria for this topic. The seventh criterion addresses plant modifications and is not applicable to new reactors; therefore, the staff evaluated only the first six criteria, as discussed below. The six criteria address HFE program goals (Criterion 2.4.1(1)); assumptions and constraints (Criterion 2.4.1(2)); HFE program duration (Criterion 2.4.1(3)); HFE facilities (Criterion 2.4.1(4)); HSIs, procedures, and training (Criterion 2.4.1(5)); and personnel (Criterion 2.4.1(6)).

HFE Program Goals (Criterion 2.4.1(1))

Criterion 2.4.1(1) identifies four general human-centered goals for an HFE program, and it also states that as the HFE program develops, the generic goals should be further defined and used as a basis for HFE tests and evaluations. The applicant's HFE PMP, Section 2.1, Program Goals, lists the goals of the applicant's HFE program. The staff reviewed these goals and found they include the four generic human-centered HFE design goals listed in Criterion 2.4.1(1). The generic goals are that personnel tasks can be accomplished within time and other performance criteria, and that the integrated system (i.e., hardware, software, and personnel elements) supports personnel situation awareness, provides acceptable workload levels, and supports error detection and recovery capability.

The HFE PMP, Section 2.1, also states that as the program develops, the goals are further defined and used as a basis for HFE tests and evaluations. One significant HFE evaluation the applicant conducted was the staffing plan validation (SPV). The staff reviewed the SPV Results TR, Section 6.1, Staffing Plan Validation Evaluation Methods, which identifies the criteria the applicant used to evaluate the proposed minimum staffing level during the SPV test. The staff found that the applicant identified specific methods to evaluate whether task performance, personnel situation awareness, workload, and error detection and recovery capability were acceptable under challenging operating conditions. For example, the applicant identified time limits within which certain tasks were required to be performed, as well as the upper and lower acceptable limits of workload.

Another significant HFE evaluation is the integrated systems validation (ISV), which the NRC defines in NUREG-0711 as an evaluation, using performance-based tests, to determine whether an integrated systems design (i.e., hardware, software, and personnel elements) meets performance requirements and supports the plant's safe operation. Human engineering discrepancies (HEDs) are identified if performance criteria are not met. The applicant's V&V IP, Section 4.5.1, Types of Performance Measures, and Section 4.5.2.1, Collection Methods, identify the methods the applicant will use to evaluate the ISV results. These include methods to evaluate whether task performance meets time and performance criteria, situation awareness and workload are acceptable, and HSIs minimize personnel errors and support error detection and recovery.

Based on the above, the staff finds that the applicant defined the general HFE program goals and developed specific acceptance criteria based on these general goals for evaluating the results of HFE tests and evaluations in order to assess whether the general HFE program goals have been met. Thus, the application conforms to this criterion.

Assumptions and Constraints (Criterion 2.4.1(2))

Criterion 2.4.1(2) states that the applicant should identify the design assumptions and constraints (i.e., aspects of the design that are inputs to the HFE program). The applicant identified the following design assumptions and constraints in the HFE PMP, Section 2.2.1, Assumptions and Constraints:

  • Passive features: The passive safety features reduce the need for operator action during any design-basis event (DBE). Specifically, DCA Part 2 Tier 2, Section 15.0.0.5, Limiting Single Failures, states that no operator actions are required for 72 hours following a DBE.
  • Modular design: The plant is intended to be scalable up to 12 units at a site, and operation of the first unit can begin before successive units are complete. Refueling of individual units can occur with others online. All units are controlled from a single main control room (MCR).
  • High degree of automation: The NuScale plant is highly automated to reduce the need for operator actions and to allow for monitoring multiple units simultaneously. Routine operating tasks are automated to the extent that human interactions to start, stop, or suspend automated sequences do not distract the operator.

Additionally, in DCA Part 2 Tier 2, Section 18.5.2, Methodology, the applicant identified the initial MCR staffing assumption, which is that the MCR staff consists of three licensed reactor operators (ROs) and three licensed senior reactor operators (SROs). The applicant explained that the basis for the initial MCR staffing assumption is that the passive safety systems, simplicity of operation, high levels of automation, and a limited number of IHAs will keep workload levels within acceptable limits for the MCR staff. The initial staffing assumption was an input to the other HFE analyses, such as the task analysis (TA), and also to the SPV, which the staff discusses in detail in Section 18.5 of this SER.

18-5

The HSI Design RSR, Section 3.3, Human-System Interface Design Overview, states that the HFE team presents findings and solicits input from the instrumentation and control (I&C) and computer systems design disciplines in order to consider whether the HFE design concepts are technically feasible, with a special emphasis on performance requirements. The HSI Design RSR, Section 4.1.2.1, System Requirements, states, "There are no known I&C platform system constraints related to the MCR layout optimization or HSI design for monitoring and control of multiple units."

The staff finds that the applicant identified the HFE design assumptions and constraints as summarized above. Accordingly, the staff finds that the application conforms to this criterion.

HFE Program Duration (Criterion 2.4.1(3))

Criterion 2.4.1(3) states that the applicant's HFE program should be in effect at least from the start of the design cycle through completion of the initial plant startup test program. The HFE PMP, Section 2.2.2, HFE Program Duration, states that the HFE program is applicable from the start of conceptual design through the completion of plant startup testing. Accordingly, the staff finds that the application conforms to this criterion.

HFE Facilities (Criterion 2.4.1(4))

Criterion 2.4.1(4) states that the applicant's HFE program should cover the MCR, remote shutdown facility, technical support center (TSC), emergency operations facility (EOF), and local control stations (LCSs). However, applicants may apply the elements of the HFE program in a graded fashion to facilities other than the MCR and remote shutdown facility, providing justification in the HFE program plan. The executive summary of the HFE PMP states that the HFE program incorporates all 12 elements listed in NUREG-0711. The HFE PMP, Section 2.2.3, Applicable Facilities, states that the HFE program scope includes the alarms, controls, indications, and procedures applicable to the MCR and the remote shutdown station (RSS) (i.e., the remote shutdown facility). Therefore, the staff finds that the HFE program covers the MCR and RSS.

SRP Chapter 18, Section II.7, states, "[t]ypically the HFE design responsibility is split between the DC applicants (identifies the displays and alarms) and the COL applicant (identifies facility layout, radiation level data, and communications)." Sections 18.7.4.6 and 18.7.4.7 of this SER contain the staff's evaluation of how the applicant identified the displays and alarms to be included in the TSC. Those sections also document the staff's conclusion that the applicant fulfilled its HFE design responsibility for the TSC and EOF as discussed in SRP Chapter 18, Revision 3, Section II.7.

The Commissions regulations do not specifically address LCSs. However, it is important to the staff to know whether the HFE program includes LCSs used to conduct safety- or risk-significant actions when the consequences of operator errors are significant. Section 1.1, Background, of NUREG/CR-6146, Local Control Stations: Human Engineering Issues and Insights, issued September 1994, states the following:

The U.S. Nuclear Regulatory Commission (NRC) developed "Guidelines for Control Room Design Reviews" (NUREG-0700) to provide guidance on the human factors aspects of control room design, and it is widely used as a basis for evaluating human factors as part of detailed reviews of these designs. However, these guidelines have not been applied consistently to operator interfaces located outside the main control room [i.e., local control stations (LCSs)]. At many of these LCSs, operators must take action during normal, abnormal, and emergency operations. Errors at LCSs have initiated and exacerbated off-normal events. Therefore, human engineering of these operator interfaces is important to the NRC.

The TIHA RSR (RP-0316-17618), Section 3.3.5, Addressing Important Human Actions during Human-System Interface Design, states the following:

When a local control station (LCS) is required for conducting an IHA, that LCS HSI is designed using the same style guide as the MCR HSIs. This ensures HSI design consistency, training efficiency, clear labeling, easy accessibility, and avoidance of hazardous locations.

The staff finds acceptable the applicant's plan to design LCSs for IHAs using the same style guide that applies to the design of the MCR, because the Style Guide contains relevant guidance for HSIs at LCSs, including guidelines for labeling, accessibility, and avoidance of hazardous locations.

The ConOps, Section 3.2.5, Arrangement of Human-System Interfaces, identifies HSIs that support refueling activities as LCSs. The TIHA RSR did not list any IHAs related to refueling activities. DCA Part 2 Tier 2, Chapter 19.1.6.2, Results from the Low Power and Shutdown Operations Probabilistic Risk, identifies key insights for the low power and shutdown internal probabilistic risk assessment (PRA) and states that module drop accidents are the dominant contributors to core damage. Because of the relative risk significance of the reactor building crane, and because the applicant identified that errors in its operation can contribute to the likelihood of a core damage event, the staff needed to understand whether HFE guidelines have been or will be applied to the HSIs used during module movement to help prevent significant operator errors. Accordingly, the staff issued Request for Additional Information (RAI) 9360, Question 18-42 (ADAMS Accession No. ML18180A359).

In the response to RAI 9360, Question 18-42 (ADAMS Accession No. ML18172A227), the applicant stated the following:

The LCS HSI used for module movement are vendor-supplied. The HFE design for these controls will be developed by the vendor because the controls must reflect the specialized nature of crane operation. The NuScale HFE design team is working with engineering to develop procurement specifications that characterize the crane control function requirements.

Implementation of the Style Guide standards will be included in the purchase specification to establish as much consistency with NuScale HFE design as possible but on a not to interfere basis with establishing the safety and control standards required by crane design. Since this effort is at an early stage of development and beyond the scope of the current MCR verification and validation (V&V) process, specific details on the scope of HFE related direction in the procurement specification cannot be addressed at this time.


The staff understands that the design of the reactor building crane HSIs will include the HFE standards in the Style Guide to the extent possible by incorporating the HFE guidelines in the purchase specifications. The staff also finds that having the HFE design team working with engineering staff to develop procurement specifications that characterize the crane control function requirements is a good HFE practice to help minimize the occurrence of errors during module movement. Therefore, the staff concludes that the applicants plan to include HFE guidelines in the procurement specifications would help minimize operator errors that might occur during module movement. RAI 9360, Question 18-42, is resolved and closed.

The applicant has addressed the IHAs it has identified at this point in the design as discussed in the TIHA RSR. Also, the staff acknowledges that the reactor building crane design has not yet been completed, and therefore detailed information about its HSI design does not exist at this time, nor will it exist when the applicant completes the V&V activities. Section 18.6.4.1 of this SER discusses that the staff conducting the review of DCA Part 2 Tier 2, Chapter 19 has identified issues that may or may not change the risk-important human actions and issued RAI 9128, Question 19-37 (ADAMS Accession No. ML17340A626) to resolve these issues. The issues addressed by RAI 9128 are not yet resolved, and the staff is tracking resolution of RAI 9128 to determine whether there are any human actions related to refueling that may be significant enough to be considered IHAs. As discussed in more detail in Section 18.11.4.5.2 of this SER, the staff issued RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190) in part to request the applicant to clarify how any IHAs identified after completion of the V&V activities will be addressed during the design implementation activities to be performed by a licensee prior to fuel load. Resolution of RAI 9415, Question 18-46 is Open Item 18-22.

Accordingly, the staff finds that the application conforms to this criterion.

HSIs, Procedures and Training (Criterion 2.4.1(5))

Criterion 2.4.1(5) states that the applicant's HFE program should address the design of HSIs and identify inputs to the development of procedures and training for all operations, accident management, maintenance, test, inspection, and surveillance tasks that operational personnel will perform or supervise. In addition, the HFE design process should identify training program input for I&C technicians, electrical maintenance personnel, mechanical maintenance personnel, radiological protection technicians, chemistry technicians, and engineering support personnel. Any other personnel who perform tasks directly related to plant safety also should be included.

The applicant described the HSI design in the HSI Design RSR and DCA Part 2 Tier 2, Section 18.7, Human-System Interface Design. Therefore, the staff finds the applicant addressed the design of HSIs in the HSI Design RSR.

The TA RSR, Section 3.6.1, Functional Requirements Analysis and Function Allocation and Task Analysis, describes how the applicant used the VISION developer application to identify input to the development of procedures and the training program for operational personnel.

VISION is a relational database that is used to store the FRA/FA, TA, S&Q analysis, development of HSIs, procedures, and training data. VISION is commonly used in the nuclear industry to manage the training programs for plant personnel, including licensed operators and personnel identified in 10 CFR 50.120, Training and Qualification of Nuclear Power Plant Personnel, who must be trained using a systems approach to training. As shown in the TA RSR, Table 3-2, VISION icon descriptions, VISION can document the steps required to complete a task, which are direct inputs to procedure development, and also the skills, knowledge, and abilities personnel need to perform the tasks, which are direct inputs to the training program. Further, DCA Part 2 Tier 2, Section 18.4.3, Results, states that the TA also produced a basic knowledge and abilities catalog.

In addition, the TA RSR, Section 1.2, Scope, states that the HFE program analyzes tasks associated with activities performed by the plant personnel identified in 10 CFR 50.120 and other personnel, such as information technology technicians, when those activities include tasks that impact licensed operator workload. Also, the HFE PMP, Section 2.2.4, Applicable Human-System Interfaces, Procedures and Training, states that the program provides input to the training programs for personnel identified in 10 CFR 50.120 and other personnel who perform tasks directly related to plant safety. During an audit conducted May 9-11, 2017 (ADAMS Accession No. ML17181A415), the staff reviewed the results of the TA. The staff found the applicant had identified tasks for nonlicensed operators as well as licensed operators (licensed operators supervise nonlicensed operators). The staff finds that the HFE design process includes inputs to the training program for operations personnel and other relevant personnel.

The staff finds that Criterion 2.4.1(5) is satisfied because the HFE program addresses the design of the HSIs, as documented in the HSI Design RSR, and inputs to the procedures and training programs for operations personnel, categories of personnel listed under 10 CFR 50.120(b)(2), and other personnel who perform tasks related to plant safety.

Accordingly, the staff finds that the application conforms to this criterion.

Personnel (Criterion 2.4.1(6))

Criterion 2.4.1(6) states that the applicant's HFE program should consider operations S&Q, including licensed control room operators as defined in 10 CFR Part 55, Operators' Licenses; nonlicensed operators; shift supervisors; and shift technical advisors. The applicant described staffing and the qualifications of licensed operators, including the shift supervisor and shift technical advisor, in DCA Part 2 Tier 2, Section 18.5.3, Results, as well as in the S&Q RSR.

DCA Part 2 Tier 2, Section 18.5.3 states the following:

A staffing plan validation was conducted using guidance in NUREG-0711, NUREG-1791, and NUREG/CR-6838 as well as other industry guidance...The results of the S&Q analysis...confirm that up to 12 NuScale Power Modules and the associated plant facilities may be operated safely and reliably by a minimum staffing contingent of three licensed reactor operators and three licensed senior reactor operators from a single control room during normal, abnormal, and emergency conditions.

The staff evaluates the SPV in Section 18.5.4 of this SER. Because the applicant has identified the required operations staffing and qualifications (i.e., licensed operators), the staff finds that the applicant's HFE program considered S&Q for licensed operators.

The S&Q RSR, Section 1.2, Scope, states the following:


Staffing analysis for non-licensed operators...are included only if they are determined to impact licensed operator workload. When licensed operator workload is impacted, then the area of concern was analyzed to a degree sufficient to quantify the impact to licensed operator workload or staffing and to develop any human-system interface (HSI) or staffing adjustments required to address the specific task and associated staffing requirements.

During the May 2017 audit (ADAMS Accession No. ML17181A415), the staff reviewed TA results and found the applicant identified tasks for nonlicensed operators as well as licensed operators. Also, the S&Q RSR, Section 4.8, Staffing Levels, Position Descriptions, and Qualifications, states that the number of nonlicensed operators requested by the control room staff was tracked during the SPV scenarios to include the workload of managing this resource.

The ISV Test Plan also includes procedures for tracking the use of nonlicensed operators. The applicant has identified the number of nonlicensed operators to be available during the SPV and ISV, which will allow the applicant to gather data on whether the number of nonlicensed operators is reasonable. The applicant will document the ISV results in the V&V RSR, and the information about the use of nonlicensed operators will be made available to the COL applicant referencing the NuScale standard plant DC. DCA Part 2 Tier 2, Section 18.5, Staffing and Qualifications, also contains COL Item 18.5-1, which states, "A COL applicant that references the NuScale Power Plant design certification will address the staffing and qualifications of non-licensed operators."

As such, the staff observed that the applicant has considered nonlicensed operator staffing with respect to the support the licensed operators will need from such staff to operate the plant from the control room, and a COL item covers the responsibility of the COL applicant to address S&Q of nonlicensed operators. The staff documents the acceptability of COL Item 18.5-1 in Section 18.5.6 of this SER.

Accordingly, the staff finds that the application conforms to this criterion.

Human Factors Engineering Team and Organization (Criteria 2.4.2(1)-(4))

NUREG-0711, Section 2.4.2, HFE Team and Organization, includes four criteria for this topic. The four criteria address the following aspects of the applicant's HFE team: responsibilities (Criterion 2.4.2(1)), organizational placement and authority (Criterion 2.4.2(2)), composition and expertise (Criterion 2.4.2(3)), and team staffing (Criterion 2.4.2(4)).

Responsibility of the HFE Team (Criterion 2.4.2(1))

Criterion 2.4.2(1) lists activities the applicant's HFE team should be responsible for performing. These activities include overseeing and reviewing all activities in HFE design, development, test, and evaluation, including the initiation, recommendation, and provision of solutions through designated channels for problems identified in implementing the HFE work. The HFE PMP, Section 3.1, Responsibility, states that the HFE team is the primary organization responsible for the HFE program. The staff reviewed the HFE PMP, Section 3.1, and found that the responsibilities of the HFE team include all those listed in the criterion. Therefore, the staff concludes that the applicant has established a specific entity to be responsible for the applicant's HFE design. Accordingly, the staff finds that the application conforms to this criterion.


Organizational Placement and Authority (Criterion 2.4.2(2))

Criterion 2.4.2(2) states that the applicant should describe the primary HFE organization(s) or function(s) within the engineering organization designing the plant. The organization should be illustrated to show organizational and functional relationships, reporting relationships, and lines of communication. The applicant also should address necessary transitions between responsible organizations and how the HFE team has the authority and appropriate organizational placement to reasonably assure that all its areas of responsibility are completed; to identify problems in establishing the overall plan; and to control further processing, delivery, installation, or use of HFE products until the disposition of a nonconformance, deficiency, or unsatisfactory condition is resolved.

The HFE PMP, Section 3.2, Organizational Placement and Authority, states that the HFE team consists of two groups: a core group and another group that includes other members of the design organization who provide expertise to the core group when needed. The core group members report directly to the HFE Supervisor, who reports to the Operations Manager, who reports directly to the Vice President of Operations. The members of the other group are distributed throughout the design organization and provide expertise to the core HFE team as needed. These personnel take direction from the HFE Supervisor while performing HFE activities. Therefore, the applicant has identified the organizational and functional relationships, reporting relationships, and lines of communication.

The HFE PMP, Section 3.2, explains that the HFE Supervisor has ultimate responsibility for scheduling and overseeing various HFE activities and is the owner of the human factors engineering issue tracking system (HFEITS) database. The HFE Supervisor or other members of the HFE team elevate HFE issues within the management chain as necessary. Also, DCA Part 2 Tier 2, Section 18.1.3.1, General Process and Procedures, states, "Any member of the HFE team may identify problems and propose solutions using the HFEITS tool. The HFE Supervisor has authority to make decisions regarding resolution of HFEITS items..."

Because the HFE team has been given the responsibility for the HFE design as discussed under Criterion 2.4.2(1), and because the HFE Supervisor is the owner of the HFEITS and has the authority to make decisions to resolve issues, the staff concludes that the applicant's HFE team has adequate authority and organizational placement to reasonably assure that its areas of responsibility are completed. Accordingly, the staff finds that the application conforms to this criterion.

Composition and Expertise (Criterion 2.4.2(3))

Criterion 2.4.2(3) states that the applicant's HFE design team should include the expertise described in the appendix to NUREG-0711. The HFE PMP, Section 3.3, Composition, states the following:

The experience and education levels of the members of the core HFE team meet many of the requirements listed in Table 3-1; however, both the core HFE team and the HFE team members distributed throughout the organization taken together meet all the required experience and qualifications as listed in Table 3-1.


The staff compared the HFE PMP, Table 3-1, Human Factors Engineering Team Member Qualifications, to the appendix to NUREG-0711 and found that Table 3-1 lists all the qualifications in the appendix to NUREG-0711. Therefore, the staff concludes that the applicant's HFE team includes the expertise described in the appendix to NUREG-0711.

Accordingly, the staff finds that the application conforms to this criterion.

HFE Team Staffing (Criterion 2.4.2(4))

Criterion 2.4.2(4) states that the applicant should describe team staffing in terms of job descriptions and assignments of team personnel. The HFE PMP, Section 3.4, Team Staffing, states the following:

The HFE supervisor assigns members of the HFE team (including personnel from outside the Plant Operations organization) to HFE activities to ensure that needed expertise is applied in performing those activities. Members of the core HFE team are assigned as leads and owners of various HFE related areas. For example, each core HFE team member is assigned a group of systems and is the primary interface and representative with engineering for that system.

Additionally, this person is responsible for completing all the work in support of functional requirements analysis and function allocation (FRA/FA), TA, HSI, procedures, and training development for the systems assigned. This person also performs all system design document and functional specification reviews for the assigned group of systems. Members of the core HFE team are also assigned as functional leads for nonsystem areas such as probabilistic risk analysis (PRA), emergency planning, and simulator design.

Additionally, the HFE PMP, Table 3-2, Human Factors Engineering Team Participant Primary Responsibilities, shows the assignment of the personnel qualifications listed in Table 3-1 to each of the 12 HFE program elements. The appendix to NUREG-0711 explains the typical contributions of personnel with the particular set of qualifications to an HFE design team. The staff reviewed Table 3-2 and found that the qualifications were appropriately assigned to the 12 HFE program elements. For example, the appendix to NUREG-0711 states that personnel with computer system engineering qualifications typically participate in designing and selecting computer-based equipment, such as controls and displays. Table 3-2 shows that personnel with computer system engineering qualifications are assigned to HSI design activities.

The staff concludes that the applicant has provided job descriptions for the HFE team members and assigns tasks to HFE team members with the appropriate expertise to perform those tasks.

Accordingly, the staff finds that the application conforms to this criterion.

Human Factors Engineering Process and Procedures (Criteria 2.4.3(1)-(6))

NUREG-0711, Section 2.4.3, HFE Processes and Procedures, includes six criteria for this topic. The six criteria address process procedures (Criterion 2.4.3(1)), process management tools (Criterion 2.4.3(2)), integration of HFE and other plant design activities (Criterion 2.4.3(3)), HFE program milestones (Criterion 2.4.3(4)), HFE documentation (Criterion 2.4.3(5)), and subcontractors (Criterion 2.4.3(6)).


Process Procedures (Criterion 2.4.3(1))

Criterion 2.4.3(1) states that the applicant should identify the process through which the team will execute its responsibilities and include procedures for governing the internal management of the team, making decisions on managing the HFE program, making HFE design decisions, controlling changes in the design of equipment, and reviewing HFE products. The HFE PMP, Section 4.1.1, Human Factors Engineering Team Assignment, states that the HFE Supervisor assigns tasks to HFE team members based on the expertise necessary to complete the task, which is identified in the HFE PMP, Table 3-2. The HFE PMP, Section 4.1.6, Review of Human Factors Engineering Products, states that the HFE Supervisor is responsible for scheduling and overseeing HFE activities, including reviewing HFE team products. The HFE PMP, Section 4.1.3, Making HFE Design Decisions, states that the HFE Supervisor has primary authority to make management decisions for HFE activities. If design decisions require input from multiple organizations, the HFE Supervisor may elevate issues to management through the use of internal procedures, design review boards, and the Corrective Action Program.

Additionally, the HFE PMP, Section 4.1.6, Review of Human Factors Engineering Products, states that HFE activities are conducted in accordance with the Quality Management Plan (QMP), which establishes controls to ensure that all provisions and commitments contained in the Quality Assurance Program Description (QAPD)2 have been implemented appropriately, and in accordance with other procedures governing the design control process. The design process includes provisions to control design inputs, outputs, changes, interfaces, records, and organization interfaces within NuScale and with suppliers. The applicant's and suppliers' procedures describe design change processes and the division of responsibilities for design-related activities.

Therefore, the staff concludes that the applicant has identified the process through which the HFE team executes its responsibilities in the HFE PMP and the procedures that govern that process. Accordingly, the staff finds that the application conforms to this criterion.

Process Management Tools (Criterion 2.4.3(2))

Criterion 2.4.3(2) states that the applicant should identify the tools and techniques the team members use to verify that they fulfill their responsibilities. The applicant identified the following tools and techniques the HFE team members use to verify that they fulfill their responsibilities:

  • Verification checklists: The HSI Design RSR, Appendix B, Human-System Interface Task Support Verification Form, contains an example of the task support verification form, which the HFE team uses to verify that the design supports operator tasks by comparing the HSI to TA results. Appendix C, Human Factors Engineering Design Verification Form, contains an example of the design verification form the HFE team uses to verify that the design conforms to the design-specific HFE guidelines by comparing the HSI to the Style Guide.

2 The staff documents its finding that NuScale's QAPD, NP-TR-1010-859-NP, NuScale Topical Report: Quality Assurance Program Description for the NuScale Power Plant, Revision 3, issued October 2016 (ADAMS Accession No. ML16347A405), complies with the requirements in Appendix B, Quality Assurance Criteria for Nuclear Power Plants and Fuel Reprocessing Plants, to 10 CFR Part 50, Domestic Licensing of Production and Utilization Facilities, for the quality assurance program and is therefore acceptable in Safety Evaluation of the NuScale Topical Report: Quality Assurance Program Description for Design Certification of the NuScale Small Modular Reactor (ADAMS Accession No. ML16203A107).

  • HFEITS Records: The HFE PMP, Section 5.3, Human Factors Engineering Issues Tracking Documentation, lists the information contained in the HFEITS. The HFEITS is used to document issues and resolutions and to assign issue owners and evaluators. Issue owners are responsible for resolving the issues. The HFE Supervisor has overall responsibility for managing the HFEITS. The HFEITS Review Committee is responsible for reviewing documentation on all issues in the HFEITS to verify that the resolutions have been completed before closing an issue.

  • HFE Databases: The HFE PMP, Section 6.1, Operating Experience Review, states that the results of the operating experience review (OER) are contained in the OER database. The HFE PMP, Section 6.2, Functional Requirements Analysis and Function Allocation, and Section 6.3, Task Analysis, state that databases also contain the results of the FRA, FA, and TA. The databases can be used to search and review the results of these analyses. During the May 2017 audit (ADAMS Accession No. ML17181A415), the staff observed that the databases allow the HFE team to determine the extent of completion of a given analysis by checking whether the data fields are complete.
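A completeness check of this kind can be sketched in a few lines. The field names and records below are hypothetical illustrations only; the application does not describe the actual schema of the applicant's databases:

```python
# Sketch: judging how complete an analysis is by whether its database
# fields are filled in. Field names are hypothetical, not NuScale's schema.

REQUIRED_FIELDS = ["function", "task_description", "allocation", "results"]

def analysis_complete(record: dict) -> bool:
    """A record counts as complete when every required field is filled in."""
    return all(record.get(field) not in (None, "") for field in REQUIRED_FIELDS)

def completion_status(database: list) -> float:
    """Fraction of records whose required fields are all complete."""
    if not database:
        return 0.0
    return sum(analysis_complete(r) for r in database) / len(database)
```

A reviewer (or the HFE team itself) could then read the completion fraction directly instead of inspecting records one by one.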

Additionally, as explained in the staff's evaluation of Criterion 2.4.3(1), the HFE Supervisor oversees the HFE team and reviews HFE team products. Therefore, the staff concludes that the applicant has identified the tools and techniques the HFE team members use to verify that they fulfill their responsibilities. Accordingly, the staff finds that the application conforms to this criterion.

Integration of HFE and Other Plant Design Activities (Criterion 2.4.3(3))

Criterion 2.4.3(3) states that the applicant should describe the process for integrating the inputs from other design work to the HFE program, and the outputs from the HFE program to other plant design activities. The applicant should also discuss the iterative aspects of the HFE design process. The HFE PMP, Appendix A, NuScale HFE Program Design Integration, contains Figure A-1, NuScale and Human Factors Engineering Program Design Integration, which illustrates how the HFE team is integrated into the iterative design process. Appendix A describes in detail how the HFE team participates in the plant engineering design process. The staff reviewed this description in Appendix A and found that it describes a means for HFE team members to review plant design documents and provide recommendations. Appendix A also provides an example of how the plant design was changed as a result of HFE team review and feedback.

Also, the HFE PMP, Section 4.1.5, Controlling Changes in Design Equipment, states the following:

As discussed in Section 3.4, the HFE team members perform reviews of the assigned system design documents and have the authority to approve the documents. They also participate in key meetings such as system design phase reviews. This ensures that the HFE team members have the authority to influence and control design changes.


The FRA/FA RSR, Section 4.6, Design Incorporation Recommendations Examples, and the OER RSR, Appendix G, Issues Identified by NuScale HFE Team Personnel Incorporated into Design, list plant system design issues identified by the HFE team that have been incorporated into the design. As such, the staff concludes that the applicant's interdisciplinary review process, the HFEITS, and the participation of HFE team members in the system design reviews integrate the HFE team and the plant systems designers to help ensure that HFE is considered in the design of the plant systems.

As shown in DCA Part 2 Tier 2, Figure 18.1-1, Overview of Human Factors Engineering Program Process, and the HFE PMP, Appendix A, Figure A-1, the plant system design documents are inputs to the FRA, FA, and TA. The HFE PMP, Section 3.4, explains that the HFE team core group members are assigned plant systems and are the primary interface or point of contact with the engineering organization for that system. The HFE team core group member is responsible for completing the FRA, FA, TA, HSI design, and procedure and training program development for his or her assigned plant system. As such, the staff concludes that the applicant's process provides inputs from other plant design work to the HFE team.

DCA Part 2 Tier 2, Figure 18.1-1, and the HFE PMP, Appendix A, Figure A-1, also show that the HFE team uses the HFEITS to track HFE issues that impact plant design documents. The HFE PMP, Section 5.4.2, Human Factors Engineering Issue Tracking System Team Lead, states that one of the responsibilities of the HFEITS Team Lead is to coordinate the resources, including plant system subject matter experts (SMEs), to resolve HFE issues. Also, the HFE PMP, Appendix A, Figure A-2, Human Factors Engineering Program Process, illustrates the feedback from the HFE program to other plant engineering disciplines. For example, Appendix A shows that the results of V&V activities may be provided as input to the PRA and human reliability analysis (HRA). Appendix A explains that the ISV tests design assumptions made in the PRA and HRA, and feedback about those assumptions is documented and tracked to the appropriate disciplines using the HFEITS. As such, the staff concludes that the applicant's process of using the HFEITS provides output from the HFE program as input to other plant design disciplines.

The HFE PMP, Figure A-2, also illustrates the iterative aspects of the HFE design process by showing that results and products of the HFE program elements may be refined as further design detail is developed. For example, Appendix A explains that revisions to the PRA/HRA are considered for impact on the TIHA results.

Therefore, the staff concludes that the applicant has described the process for integrating the design activities (i.e., the inputs from other design work to the HFE program, and the outputs from the HFE program to other plant design activities) and discussed the iterative aspects of the HFE design process. Accordingly, the staff finds that the application conforms to this criterion.

HFE Program Milestones (Criterion 2.4.3(4))

Criterion 2.4.3(4) states that the applicant should identify HFE milestones that show the relationship of the elements of the HFE program to the integrated plant design, development, and licensing schedule. A relative program schedule of HFE tasks should be available for review. The HFE PMP, Table 4-1, Human Factors Engineering Program and Design Activity Milestones, identifies when the HFE elements will be completed relative to the design, development, and licensing schedule. Accordingly, the staff finds that the application conforms to this criterion.


HFE Documentation (Criterion 2.4.3(5))

Criterion 2.4.3(5) states that the applicant should identify the HFE documentation items, such as RSRs and their supporting materials, and briefly describe them, along with the procedures for their retention and for making them available to the staff for review. The applicant provided an IP for staff review for the HFE program management element, the design implementation element, and the V&V element. The applicant provided an RSR for staff review for each of the following HFE program elements: OER, TA, FRA, FA, S&Q, and HSI design. The HFE PMP, Table 4-1, shows that for the V&V program element, the applicant will provide the RSR before Phase 4 of the DC review.

The HFE PMP, Section 4.5, Human Factors Engineering Documentation, states that HFE documents, including RSRs, HFEITS records, and verification checklists, are quality records that will be retained in accordance with the QMP. The applicant stated that all such documentation is available for staff review upon request. As discussed in the staff's evaluation of Criterion 2.4.3(4), the applicant provided RSRs and IPs for review. NUREG-0711 explains that IPs and RSRs are the two primary types of applicant submittals that the staff reviews.

Accordingly, the staff finds that the application conforms to this criterion.

Subcontractor Efforts (Criterion 2.4.3(6))

Criterion 2.4.3(6) states that the applicant should include HFE requirements in each subcontract contributing to the HFE program, periodically verify the subcontractor's compliance with HFE requirements, and describe milestones and the methods used for this verification. The HFE PMP, Section 4.2, Process Management Tools, states that the HFE activities are conducted in accordance with the QMP, which establishes controls to ensure that all provisions and commitments contained in the QAPD have been implemented appropriately. Further, the HFE PMP, Section 4.6, Subcontractor HFE Efforts, states, "If a subcontractor is involved in HFE activities, the HFE team verifies that the subcontractor is properly trained and complies with the QMP." This section also states that the quality assurance organization verifies that the subcontractors conduct work in accordance with the QMP or the subcontractor's quality assurance program as contracted.

DCA Part 2 Tier 2, Section 17.5, Quality Assurance Program Description, states that the QAPD is provided in NP-TR-1010-859-NP, Revision 3 (ADAMS Accession No. ML16347A405).

The staff reviewed that document. The QAPD, Section 3.1.4, Procurement Document Control, states the following:

Procurement documents for items and services obtained by or for NuScale include or reference documents describing applicable design bases, design requirements, and other requirements necessary to ensure component performance. The procurement documents are controlled to address deviations from the specified requirements.

Because the applicant's QAPD states that procurement documents include design requirements and because HFE is conducted in accordance with the QMP, which implements the commitments in the QAPD, the staff concludes that procurement documents provided to any subcontractors will include HFE design requirements.

Further, the QAPD, Section 3.1.3, Design Control, states the following:


NuScale has design control measures to ensure that the established design requirements are included in the design. These measures ensure that applicable design inputs are included or correctly translated into the design documents and deviations from those requirements are controlled. Design verification is provided through the normal supervisory review of the designer's work.

The QAPD, Section 3.1.18, Audits, states the following:

NuScale employs measures for line management to periodically review and document the adequacy of processes, including taking any necessary corrective action. Audits independent of line management are not required. Line management is responsible for determining whether reviews conducted by line management or audits conducted by any organization independent of line management are appropriate. If performed, audits are conducted and documented to verify compliance with design and procurement documents, instructions, procedures, drawings, and inspection and test activities.

Because the QAPD states that supervisors review design products and that line management also periodically reviews products to verify conformance to design requirements, the staff concludes that the applicant has described methods for verifying compliance with procurement requirements. The applicant also identified the associated review milestones: periodic reviews determined by line management and supervisory review of design products.

Therefore, the staff concludes that the applicant includes HFE requirements in procurement documents and has established methods of verifying conformance to those requirements.

Accordingly, the staff finds that the application conforms to this criterion.

Tracking Human Factors Engineering Issues (Criteria 2.4.4(1)-(4))

NUREG-0711, Section 2.4.4, Tracking HFE Issues, includes four criteria for this topic. The four criteria address HFE issue tracking availability (Criterion 2.4.4(1)), methods (Criterion 2.4.4(2)), documentation (Criterion 2.4.4(3)), and responsibility (Criterion 2.4.4(4)).

Availability (Criterion 2.4.4(1))

Criterion 2.4.4(1) states that the applicant should have a tracking system to address human factors issues that are known to the industry; identified throughout the life cycle of the HFE aspects of design, development, and evaluation; and designated by the HFE program as human engineering discrepancies (HEDs).

The HFE PMP, Section 5.1, Availability of Human Factors Engineering Issue Tracking System, states that the applicant uses the HFEITS database to address HFE issues, including those issues specifically listed in Criterion 2.4.4(1). Therefore, the staff concludes that the applicant has established a tracking system for HFE issues. Accordingly, the staff finds that the application conforms to this criterion.

Methods (Criterion 2.4.4(2))

Criterion 2.4.4(2) states that the applicant should establish criteria for entering issues into the system and tracking issues until the potential for negative effects on human performance is reduced to an acceptable level. The HFE PMP, Section 5.2, Human Factors Engineering Issues Tracking Methodology, states the following:


Because the HFE team is imbedded into the design engineering process, most potential HFE issues are able to be resolved immediately. This is accomplished through direct feedback to design engineers, at engineering design phase review meetings, and during design document review and comment resolution. If the issue cannot be immediately resolved, it is entered into the HFEITS database and is assigned a unique tracking number.

For example, the HFE PMP, Section 6.1, states that if an OER issue is applicable to the design, and the issue cannot be resolved at the current point of the design, then the issue is entered into the HFEITS. The HFE PMP, Section 5.4.7, Human Engineering Discrepancy Resolution, explains that HFE issues identified during V&V activities are specifically referred to as HEDs, and HEDs are also entered in the HFEITS. Thus, the staff concludes that the applicant has established criteria for entering issues into the tracking system.

The HFE PMP, Section 5.4.8, HED Process Flow, states that each HED is assigned one of the following three priority classifications:

(1) Priority 1 HEDs have a potential direct or indirect impact on plant safety.

(2) Priority 2 HEDs have a direct or indirect impact on plant performance and operability.

(3) Priority 3 HEDs are those that are not Priority 1 or Priority 2.

Section 5.4.8 also discusses when these issues are resolved and how the priority is assigned:

  • Priority 1 HEDs are resolved before ISV testing is considered complete. HEDs initiated as a result of the failure to meet a pass/fail performance measure are Priority 1 HEDs. Cross-cutting issues determined through HED analysis or performance measure analysis are Priority 1 HEDs because of their global impact on the HSI design performance.

  • Priority 2 HEDs have a direct or indirect impact on plant performance and operability and are resolved before the plant design is completed.
  • Priority 3 HEDs do not have to be resolved. If resolution of a Priority 3 HED is determined to be necessary, it would be resolved during the design implementation program element.
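The priority scheme above can be sketched as a simple decision rule. The boolean inputs below are illustrative assumptions; in the PMP, priorities are assigned through the analyses described in the text, not through software:

```python
# Sketch of the three-tier HED priority scheme described in HFE PMP
# Section 5.4.8. Inputs are hypothetical yes/no judgments.

def assign_hed_priority(impacts_safety: bool,
                        impacts_performance_or_operability: bool) -> int:
    """Priority 1: potential direct or indirect safety impact.
    Priority 2: impact on plant performance and operability.
    Priority 3: everything else."""
    if impacts_safety:
        return 1
    if impacts_performance_or_operability:
        return 2
    return 3

def must_resolve_before_isv_complete(priority: int) -> bool:
    """Only Priority 1 HEDs must be resolved before ISV testing
    is considered complete."""
    return priority == 1
```

Note that safety impact dominates: an HED that affects both safety and performance is Priority 1, which mirrors the ordering of the definitions in Section 5.4.8.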

NUREG-0711, Section 11.4.4, Human Engineering Discrepancy Review Criteria, contains guidance for determining which HEDs to correct. The staff found the applicant's plan to resolve Priority 1 and 2 HEDs to be consistent with this guidance, and, therefore, the applicant will address HEDs that could have negative impacts on human performance. The HFE PMP, Section 6.6, Human-System Interface Design, states that HFE issues generated during HSI design or from earlier program elements are resolved during HSI design so that the final output is a complete HSI design suitable for V&V. Thus, the staff concludes that resolving Priority 1 and 2 HEDs as part of V&V, as well as resolving HFE issues identified from the HFE elements completed before V&V, provides reasonable assurance that the potential for negative effects on human performance will be reduced to an acceptable level.

However, it was not clear to the staff when the applicant considers the plant design to be completed, and therefore at what point in the design process the applicant plans to resolve Priority 2 HEDs. NUREG-0711, Section 11.4.4, explains that significant HEDs, including Priority 2 HEDs, should be addressed during V&V, if feasible. The staff issued RAI 9360, Question 18-41 (ADAMS Accession No. ML18180A359), to request that the applicant explain the phrase "when the plant design will be completed." In its response to RAI 9360, Question 18-41 (ADAMS Accession No. ML18172A227), the applicant provided a proposed revision to the DI IP, Section 2.0, Design Implementation Assessments, to clarify that resolution of Priority 2 HEDs will occur before the applicant's turnover of the HFE program implementation to a licensee. The staff finds this acceptable because important issues will be addressed by members of the applicant's HFE team, who are knowledgeable of the issues and their resolutions. RAI 9360, Question 18-41, is a Confirmatory Item. However, as discussed in Section 18.11.4.4.3, Conclusion, of this SER, the staff issued RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), in part to request that the applicant explain how satisfactory resolution of these HEDs and any HEDs identified during design implementation activities will occur and be documented. RAI 9415, Question 18-46, is Open Item 18-22.

Documentation (Criterion 2.4.4(3))

Criterion 2.4.4(3) states that the applicant should document the actions taken to address each issue in the system and, if no action is required, justify that decision. The HFE PMP, Section 5.3, lists the information that is entered in the HFEITS for each issue, which includes the actions taken to address each issue (i.e., resolutions) and justification if no action is taken. The HFE PMP, Section 5.3, states that descriptions of resolutions are sufficiently detailed to provide traceability and to support third-party review. Also, the HFE PMP, Section 5.4.7, states that HEDs may not always be resolved, and the basis for accepting an HED without change is documented.

Because the actions taken to address issues and descriptions of the resolutions will be documented, the staff concludes that the applicant's method is acceptable. Accordingly, the staff finds that the application conforms to this criterion.

Responsibility (Criterion 2.4.4(4))

Criterion 2.4.4(4) states that the applicant's tracking procedures should describe individual responsibilities for logging, tracking, and resolving issues, along with the acceptance of the outcome. The HFE PMP, Section 5.4, Human Factors Engineering Tracking Responsibilities, states that all HFE team members are responsible for identifying, logging, evaluating, and tracking HFE issues to resolution. HFE team members are assigned the following specific responsibilities to address HFE issues documented in the HFEITS:

  • Issue evaluators are assigned to evaluate issues, recommend issue owners, and recommend corrective actions. Issue evaluator assignments are documented in the HFEITS.
  • Issue owners are assigned to resolve issues, update the HFEITS with proposed and completed actions, and update design documentation, if necessary. Issue owner assignments are documented in the HFEITS.
  • The HFEITS Team Lead coordinates resources to identify and implement resolutions, approves resolution of issues with support from the HFE team as needed, coordinates the HFEITS Review Committee, and tracks issue resolution and due dates.


  • The HFEITS Review Committee reviews all HFE issues before final closure to verify completion of the resolution (i.e., accepting the outcome of the HFE resolution process).
  • The HFE Supervisor has overall responsibility for administering and managing the HFEITS and the HFEITS Review Committee.
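The division of responsibilities above implies a simple issue life cycle, sketched below. The class and field names are illustrative assumptions; the PMP describes organizational roles, not software:

```python
# Sketch of an HFEITS issue record and its closure rule, as implied by
# HFE PMP Section 5.4. Names and fields are hypothetical illustrations.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HfeitsIssue:
    description: str
    evaluator: Optional[str] = None   # recommends an owner and corrective actions
    owner: Optional[str] = None       # responsible for resolving the issue
    resolution: Optional[str] = None  # documented actions taken
    committee_verified: bool = False  # set by the HFEITS Review Committee
    closed: bool = False

    def close(self) -> None:
        """An issue closes only after a documented resolution has been
        verified by the HFEITS Review Committee (i.e., the outcome of the
        resolution process is accepted)."""
        if not (self.resolution and self.committee_verified):
            raise ValueError("resolution must be documented and verified before closure")
        self.closed = True
```

The guard in `close()` captures the key control in the PMP: no issue leaves the tracking system until its resolution is both documented and independently verified.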

Therefore, the staff concludes that the applicant has described individual responsibilities for logging, tracking, and resolving HFE issues, along with the acceptance of the outcome.

Accordingly, the staff finds that the application conforms to this criterion.

Technical Program (Criteria 2.4.5(1)-(5))

NUREG-0711, Section 2.4.5, Technical Program, includes five criteria for this topic. The fifth criterion addresses plant modifications and is not applicable to new reactors; therefore, the staff evaluated the first four criteria as discussed below. The four criteria address status (Criterion 2.4.5(1)); schedule (Criterion 2.4.5(2)); standards and specifications (Criterion 2.4.5(3)); and facilities, equipment, tools, and techniques (Criterion 2.4.5(4)).

Status (Criterion 2.4.5(1)) and Schedule (Criterion 2.4.5(2))

Criterion 2.4.5(1) states that the applicant should describe the applicability and status of each of the 12 HFE elements, and Criterion 2.4.5(2) states that the applicant should provide a schedule for completing HFE activities that are unfinished at the time of application. The HFE PMP, Table 4-2, Human Factors Engineering Element Documentation, shows the HFE elements and explains the status of each. All of the HFE elements listed in the criterion are shown in Table 4-2 and are identified as being applicable to the HFE program. The applicant provided with the application an RSR for each element that is the responsibility of the DC applicant, with the exception of the RSR for the V&V element. Table 4-2 states that the V&V RSR will be submitted before the start of Phase 4 of the staff's review. The information in Table 4-2 is consistent with the letter dated January 14, 2016, from the NRC to the applicant (ADAMS Accession No. ML15302A516) and the letter dated April 8, 2016, from the applicant to the NRC (ADAMS Accession No. ML16099A270).

The applicant provided COL items for the HFE elements that had not been completed at the time of the application: training program development, procedure development, design implementation, and human performance monitoring. The application addresses the COL responsibilities as follows:

  • DCA Part 2 Tier 2, Section 13.2, Training, contains COL Items 13.2-1 and 13.2-2 for the COL applicant to provide a description and schedule of the initial training and qualification as well as requalification programs for ROs, SROs, and nonlicensed plant staff (plant management, supervisory personnel, technicians, and general employees).

SRP Chapter 18, Section II, states, "Training programs are considered operational programs as identified in SRP Section 13.4, Operational Programs." For a new nuclear power plant (NPP), the training program will usually be reviewed during the COL FSAR review rather than the DC. Providing a COL item for training program development is consistent with the SRP guidance. (The staff evaluates these COL items in Section 13.2 of this SER.)


  • DCA Part 2 Tier 2, Section 13.5, Plant Procedures, contains COL Items 13.5-1, 13.5-2, 13.5-3, 13.5-5, 13.5-7, and 13.5-8 for the COL applicant to describe the site-specific plant procedures and provide a schedule for development, implementation, and procedure control. NUREG-0711, Section 9.1, Background, states, "In the nuclear industry, procedure development is the responsibility of individual utilities." The procedures program is reviewed by the staff using SRP Chapter 13. Providing COL items for procedure development is consistent with the guidance in NUREG-0711. (The staff evaluates these COL items in Section 13.5 of this SER.)
  • The applicant provided the DI IP to address the HFE design implementation program element. The applicant cannot complete the activities in the DI IP at this time because the as-built plant and site-specific information must exist to complete these activities, and the plant and site-specific information does not exist yet.
  • The HFE PMP, Table 4-2, states, "Human performance monitoring is the responsibility of a COL applicant. No implementation plan or RSR is submitted as part of design certification application." DCA Part 2 Tier 2, Section 18.12, Human Performance Monitoring, includes COL Item 18.12-1 for the COL applicant to develop the human performance monitoring program. NUREG-0711, Section 13.2, Objective, explains that human performance monitoring is an operational program that may be incorporated into a COL applicant's problem identification and resolution program and training program. As such, human performance monitoring is a COL responsibility, and providing a COL item for human performance monitoring is consistent with the guidance in NUREG-0711. (The staff evaluates this COL item in Section 18.12 of this SER.)

Therefore, the staff concludes that the applicant has described the program status and schedule, which are consistent with the schedule the applicant and the staff discussed before the staff received the DCA for review. Accordingly, the staff finds that the application conforms to these criteria.

Standards and Specifications (Criterion 2.4.5(3))

Criterion 2.4.5(3) states that the applicant's plan should identify and describe the standards and specifications that are sources of the HFE requirements. The executive summary of the HFE PMP states that the HFE program incorporates 12 HFE elements in accordance with the guidance of NUREG-0711; the HFE PMP, Section 7.0, NUREG-0711 Conformance Evaluation, specifies Revision 3 of NUREG-0711.

The HSI Design RSR, Section 3.5.1, HSI Style Guide, states, "The style guide contains instructions for determining where and how HFE guidance is used in the overall design process." The HSI Design RSR, Section 4.5.1.2, Purpose, states that the Style Guide primarily draws from NUREG-0700 for guidance and that other documents, including accepted commercial HSI standards and military HFE design standards, were reviewed and are properly referenced.

The staff reviewed the Style Guide, Volume II, and found that it identifies the references for the design-specific HFE requirements established by the applicant for the HFE design. Accordingly, the staff finds that the application conforms to this criterion.

Facilities, Equipment, Tools, and Techniques (Criterion 2.4.5(4))


Criterion 2.4.5(4) states that the applicant's plan should specify HFE facilities, equipment, tools, and techniques (such as laboratories, simulators, and rapid prototyping software) that the HFE program will employ. The applicant described the following HFE facilities, equipment, tools, and techniques used in the HFE program:

  • HFE facilities and equipment: The applicant developed a control room simulator that was used in the process of developing the HFE design. The HSI Design RSR, Section 3.2, Simulator Development, states, "The NuScale simulator is an evolutionary expression of the MCR interface that is built incrementally and represents the design detail as it emerges." The V&V IP, Section 4.3, Validation Test Beds, explains the use of the control room simulator for ISV testing conducted to validate the HFE design:

The principal validation test bed for the ISV is the control room simulator.

The fidelity of the validation test beds' models and HSI are verified to represent the current, as-designed NuScale plant prior to use for the validation. The test bed model is made up of four modeling software packages, all working from current NuScale designs. Together, they provide a high level of fluid and reactivity modeling. Precisely modeling the predicted behavior of the reactor core, thermodynamic performance, balance of plant, and electrical system design is desired as NuScale does not have a comparison reference plant. All 12 units are simultaneously and independently modeled, but they all correctly share systems that provide input for multiple units.

The staff concludes that using the control room simulator, which models the current plant system design and the HSIs resulting from the HFE design process, helps to ensure the design that is validated represents the design that will be built and operated.

  • Tools: The applicant uses databases, such as the HFEITS, for tracking HFE issues, and the OER database for storing the results of the OER. The applicant also uses the VISION Developer application, which is described in Section 18.1.4.1 of this SER in the staff's evaluation of Criterion 2.4.1(5), as a means for documenting the results of HFE analyses, such as TAs, and using them as inputs to the development of the procedures and training programs. In addition, the HSI Design RSR, Section 4.2.1, Simulator Software, explains how the applicant used proprietary software to ensure the HFE design guidelines were applied consistently to all of the HSIs to which the guidelines were applicable. Other tools the applicant uses include verification checklists, which the staff discusses in its evaluation of Criterion 2.4.3(2) in Section 18.1.4.3 of this SER.

The staff concludes that these tools help to ensure the consistent application of the applicant's HFE design criteria to the HSIs, help to provide for efficient and thorough verification activities, and help to ensure that necessary design changes are implemented. The staff also concludes that these tools allow for the documentation of HFE issues, facilitate integration of the results of the HFE analysis elements (e.g., TA) to be used as inputs to the HSI design elements (e.g., procedure development), and help to ensure that HFE guidelines are applied consistently to HSIs.


  • Techniques: The HSI Design RSR, Section 3.3.4, Rapid Prototyping, states the following:

Based on the latest conceptual sketches and feedback from interfacing with other disciplines, mock-ups or prototype screens integrated with a software simulator of the system are developed for review and evaluation.

While the prototype provides a realistic user experience with the system, the focus is on testing design concepts and soliciting feedback, rather than producing an engineering-quality software architecture and user interface.

The staff concludes that the applicant's technique is an acceptable means of gaining user feedback that can be incorporated into the design as it evolves.

Accordingly, the staff finds that the application conforms to this criterion.

Combined License Information Items

No COL information items are associated with HFE program management.

Conclusion

The staff evaluated the applicant's method for HFE program management and finds that it conforms to the criteria in NUREG-0711, Section 2.4. Therefore, the staff concludes that the applicant's HFE program description addresses the goals and scope of the HFE program, identifies the HFE team and member qualifications, identifies HFE processes and procedures, covers methods for tracking HFE issues, and provides an overview of how each of the HFE program elements will be addressed. The staff will continue to track resolution of the open item discussed in this section to confirm that it is adequately resolved and that the HFE program management plan continues to be implemented adequately in light of any changes to the HFE design that result from the issues related to Open Item 18-22 discussed in this section.

18.2 Operating Experience Review

Introduction

The objective of this review is to verify that the applicant has identified and analyzed HFE-related problems and issues encountered in previous designs so that these problems and issues may be avoided in the development of the new design. This review should also verify that the applicant has retained the positive features of previous designs. This is done through an evaluation of licensee event reports, Institute of Nuclear Power Operations significant event reports and significant operating experience reports, plant corrective action systems, operational and maintenance logs and records, and data from interviews with experienced plant personnel.

Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.2, Operating Experience Review.


ITAAC: There are no ITAAC associated with this element.

Technical Specifications: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(3)(i) addresses administrative procedures for evaluating operating, design, and construction experience
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)
  • NUREG-0711, Revision 3, Human Factors Engineering Program Review Model, Chapter 3, Operating Experience Review, Section 3.4, Review Criteria, issued November 2012

The following documents also provide additional guidance in support of the SRP acceptance criteria to meet the above requirements:
  • NUREG/CR-7202, NRC Reviewer Aid for Evaluating the Human-Performance Aspects Related to the Design and Operation of Small Modular Reactors, issued June 2015
  • NUREG/CR-7126, Human-Performance Issues Related to the Design and Operation of Small Modular Reactors, issued June 2012

NUREG-0711 states, "The main reason an applicant conducts an OER as part of the HFE program is to identify HFE-related safety issues." The objective is to ensure that the applicant has reviewed previous designs and analyzed the results so that the design process can maintain positive features from predecessors and eliminate or minimize negative aspects of the design. The staff reviewed the RSR and applied the acceptance criteria in NUREG-0711, Section 3.4, to ensure that this objective is met (Section 3.4.3, Plant Modifications, applies only to plant modifications and therefore was not used).


Some aspects of the design necessitate a modified version of an OER compared to what has been done for previous DCs. For example, nonnuclear industry OERs (e.g., unmanned aerial vehicles and teleoperative medicine) will play a much greater role in understanding the strengths and weaknesses of various automation techniques that are unique to NuScale.

NUREG/CR-7202 provides guidance to the staff to identify these new considerations with respect to small modular reactor (SMR) designs like NuScale. NUREG/CR-7126 provides additional detail that supports the criteria in NUREG/CR-7202.

The review criteria in NUREG-0711, Section 3.4, do not specifically mention the use of nonnuclear OER; however, the background section states, "It may be based on multiple predecessors and encompass both non-nuclear and nuclear industry sources." Additionally, NUREG/CR-7202, Section 2.2, Novel Systems and Limited Operating Experience from Predecessor Systems, identifies questions the staff may consider when evaluating ways the applicant has compensated for aspects of the design that may not have any, or have only limited, relevant predecessor plant operating experience.

The staff review verified that the applicant has a systematic and dedicated process for identifying, tracking, and addressing operating experience in the design in a manner similar to previous DC reviews. However, the staff focused this review using the guidance in NUREG/CR-7202 by verifying that the scope of the operating experience information reviewed includes appropriate surrogate industry information and by assessing whether a sufficient OER has been performed even when there may be limited or no relevant predecessor nuclear industry operating experience.

Technical Evaluation

The staff reviewed the application using the criteria in NUREG-0711, Sections 3.3, Applicant Products and Submittals, and 3.4, Review Criteria. All acceptance criteria in NUREG-0711, Section 3.4, were applied to this review, with the exception of those in Section 3.4.3, Plant Modifications, which apply only to plant modifications. The subsections below document the results of the staff's evaluation.

Scope (Criteria 3.4.1(1)-(5))

NUREG-0711, Section 3.4.1, Scope, includes five criteria for this topic. Each criterion confirms that the applicant's OER program includes a particular type of operating experience: predecessor/related plants and systems, recognized industry HFE issues, related HSI technology, issues identified by plant personnel, and IHAs.

18.2.4.1.1 Summary of Application

The following sections of the OER RSR (RP-0316-17614) contain information on predecessor/related plants and systems (Criterion 3.4.1(1)):

  • Section 3.1, Review of Predecessor and Related Plants and Systems, describes the methodology used to identify predecessor/related plants and systems.
  • Appendix A, Issues from Predecessor and Related Plants and Systems Incorporated into Design, lists the issues identified during the OER, most of which are not applicable to the NuScale design because of the nature of the design.


  • Tables 3-1, Comparison of commercial PWR systems to NuScale systems, and 3-3, OER scope, predecessor determination, and relevance, illustrate the comparisons of systems in the NuScale design relative to existing designs. Table 3-2, Examples of systems and components eliminated in the NuScale design, provides examples of systems that have been eliminated completely from the NuScale design and highlights the design features that make the elimination of the systems possible.
  • Section 4.1, Predecessor and Related Plants and Systems, summarizes those systems and processes that do not apply to the NuScale design by virtue of the design. Elimination of these systems and processes removes many of the problems identified by the OER that traditional operating plants experience.

In addition, the following sections of the OER RSR contain information on recognized industry HFE issues (Criterion 3.4.1(2)):

  • Section 2.1, Operating Experience Review Process Overview, indicates that the analysis considered NUREG/CR-6400, Human Factors Engineering (HFE) Insights for Advanced Reactors Based Upon Operating Experience, issued January 1997, and the other sources of operating experience listed in the criterion.
  • Appendix B, List of Operating Experience Sources Reviewed, gives the number of issues identified based on each source of operating experience and describes how these issues were handled in the OER process. Appendix B also indicates which OER items have already been closed.
  • Section 3.2, Review of Recognized Industry HFE Issues, provides additional detail on the process used to review this information. This section indicates that lessons learned from the accident at Fukushima were included, as was information from NRC generic communications. These sources of information demonstrate that the applicant has considered sources of operating experience information that was created after 1996, as described in Criterion 3.4.1(2).

The following sections of the OER RSR contain information on related HSI technology (Criterion 3.4.1(3)):

  • Section 3.3, Review of Related HSI Technology, provides a detailed methodology for reviewing HSI technology that goes beyond consideration of the traditional nuclear industry HSI. The scope of this section includes nonnuclear HSI technologies that may be applicable to this design.
  • Appendix D, Related HSI Technology Issues Incorporated into Design, provides the results of HSI technology issues that were identified and included in the design process.

The following sections of the OER RSR contain information on issues identified by plant personnel (Criterion 3.4.1(4)):

  • Section 3.4, Review of Issues Identified by Plant Personnel, provides the methodology used to review operating experience gathered from plant personnel. This methodology includes interviewing a wide range of operating plant personnel, including operators, procedure writers, supervisors, maintenance technicians, and others, to solicit operating experience.

  • Appendix F, Plant Personnel Interviews and Findings, contains information about the personnel who were interviewed, as well as the topics that were discussed in the interviews.
  • Section 4.4, Issues Identified by Plant Personnel, summarizes the issues identified during the data collection process. This is supplemented by Appendices E-G, which provide a sample of the findings.

The following sections of the OER RSR contain information about IHAs (Criterion 3.4.1(5)):

  • Section 3.5, Review of Important Human Actions, includes an OER methodology for the review of IHAs. IHAs were identified early in the design process (see Section 18.6 of this SER) and entered into the OER database for tracking. The methodology considers both the successful operation of systems used to conduct IHAs as well as conditions that may have caused errors in predecessor plants. The process also considers any IHA that may be different from plants that were reviewed.

Section 3.5 also indicates that the NuScale design has only two IHAs; however, other HAs that could have negative consequences (but were not identified as IHAs) are also identified and analyzed in the OER process.

  • Section 4.5, Important Human Actions, includes the results of the OER process related to IHAs. It describes two IHAs: one is relevant compared to a benchmark operating plant and the other is unique to NuScale, with some similarities to the benchmark plant.
  • Appendix H, Important Human Action Issues Incorporated into Design, provides the export-controlled results from this process. A table describes the source of the operating experience, each issue, and a design solution and method of implementation for each entry.

18.2.4.1.2 Staff Assessment

The staff compared the scope of the five methods used to assess relevant sources of operating experience described in the OER RSR to the associated NUREG-0711 criteria and considered the supplemental guidance in NUREG/CR-7202, which describes challenges related to OER methodologies that are unique to SMR technologies.

The staff found that the scope described in the OER RSR was consistent with the applicable NRC guidance described above with the following exceptions:

  • At a high level, the OER RSR submittal was consistent with the six bullet points of Criterion 3.4.1(1); however, it did not specifically identify some notable published examples of OER that apply to SMRs (see NUREG/CR-7202, Appendix A: Questions for SMR Applicants Organized by NUREG-0711 Element, Section A.1, Operating Experience Review). The staff issued RAI 9153, Question 18-5 (ADAMS Accession No. ML17286B066), to clarify the scope of the OER analysis with regard to certain nonnuclear industries. The response to RAI 9153, Question 18-5 (ADAMS Accession No. ML17346A971), describes the results obtained from the OER process when considering these nonnuclear technologies. The staff was able to determine that the OER process had, in fact, included the appropriate nonnuclear industries. Therefore, the staff considers RAI 9153, Question 18-5, to be resolved.

  • Similarly, the staff found that the bulleted list of considerations in the OER RSR, Section 3.3, was, for the most part, consistent with Criterion 3.4.1(3) and the supplemental guidance in NUREG/CR-7126. However, it was not clear from the OER RSR how the applicant addressed certain important examples related to multiunit operation and other issues described in NUREG/CR-7126. RAI 9153, Question 18-5, also addressed this gap. The response to RAI 9153, Question 18-5, clarifies how multiunit operation and other issues described in NUREG/CR-7126 were considered in the OER process and used to improve the design. The response indicates that NUREG/CR-7126 was used as an additional source of input to the OER analysis, and it summarizes a sample of results that are uniquely relevant to the NuScale design, such as the lessons learned from unmanned aircraft systems, oil refinery control systems, and teleoperative medicine experience. The response to RAI 9153 clarified how appropriate nonnuclear experience with control of multiple devices was applied to the analysis; therefore, the staff considers RAI 9153, Question 18-5, to be resolved.

In June 2018, the staff conducted an audit of the applicant's OER analysis (ADAMS Accession No. ML18208A370). The staff confirmed that the sample of results presented in the OER RSR was representative of the full set of results contained in the OER database by reviewing a sample of OER items and ensuring that they were consistent with the applicable acceptance criteria.

18.2.4.1.3 Conclusion

The staff found that the methodology described in the OER RSR covers most of the scope of an acceptable OER program as described in NUREG-0711 and the supplemental SMR guidance.

The response to RAI 9153, Question 18-5, adequately clarified the scope of the OER described in the original application materials.

During the June 2018 audit, the staff confirmed that the sample of results provided in the OER RSR was an adequate representation of the full set of OER results in the OER database.

Therefore, the staff considers the scope of the analysis to be consistent with the applicable NUREG-0711 criteria and acceptable.

Issue Analysis, Tracking and Review (Criteria 3.4.2(1)-(4))

NUREG-0711, Section 3.4.2, Issue Analysis, Tracking and Review, includes four criteria for this topic. These criteria address the applicant's ability to analyze and track relevant operating experience events. This includes describing an adequate OER process (Criterion 3.4.2(1)), analyzing OER content to identify relevant human performance issues (Criterion 3.4.2(2)), documenting the OER process (Criterion 3.4.2(3)), and tracking relevant OER entries (Criterion 3.4.2(4)).


18.2.4.2.1 Summary of Application

The following sections of the OER RSR contain information related to describing an adequate OER process (Criterion 3.4.2(1)):

  • Section 2.1 describes the process used, which includes methods for screening, reviewing, providing recommendations, and documenting results. This section also discusses team member roles and important decisionmaking points. Figure 2-1, Operating experience review process, illustrates the processes.
  • Section 1.2, Scope, identifies the scope of the OER, which includes various operating, design, and construction experience.

The following sections of the OER RSR contain information related to analyzing OER content to identify relevant human performance issues (Criterion 3.4.2(2)):

  • Section 2.2, OER Team Composition and Responsibilities, identifies the responsibilities of the OER team, which include the bulleted items listed in Criterion 3.4.2(2).
  • Section 3.6.3, HFE Issue Tracking System Database, describes the documentation and tracking of considerations listed in this criterion.

The following sections of the OER RSR contain information related to documenting the OER process (Criterion 3.4.2(3)):

  • Section 3.6 describes the documentation of OER issues, including the use of three separate databases. Appendices J-L provide screen capture examples of each of the three databases.
  • Section 3.6.1 describes the OER database, which documents issues that are preliminarily screened into the OER. The OER team assesses all entries. Any entries that are found to be out of the scope of OER are closed out after a justification is written, but the information is retained. Any entries that are found to be within the scope of OER are transferred to either the engineering database (for issues that are within the scope of OER but are not human factors issues) or to the HFEITS (for OER issues that are related to human factors).
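The triage flow described in Section 3.6.1 (close out-of-scope entries with a written justification but retain the information; transfer in-scope entries to either the engineering database or the HFEITS) can be sketched as a simple routing function. This is an illustrative sketch only; the field names and destination labels are hypothetical and do not reflect the applicant's actual database schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of the OER entry triage described in the OER RSR,
# Section 3.6.1. Field and destination names are invented for illustration.

@dataclass
class OerEntry:
    entry_id: str
    in_scope: bool        # Was the entry screened into the scope of the OER?
    is_hfe_issue: bool    # Is the issue related to human factors?
    justification: str = ""

def route_entry(entry: OerEntry) -> str:
    """Return the destination for a screened OER entry."""
    if not entry.in_scope:
        # Out-of-scope entries are closed after a justification is written,
        # but the information is retained in the OER database.
        if not entry.justification:
            raise ValueError("out-of-scope entries require a written justification")
        return "OER database (closed)"
    # In-scope entries are transferred based on whether they are HFE issues.
    return "HFEITS" if entry.is_hfe_issue else "engineering database"
```

For example, an in-scope human factors issue would be routed to the HFEITS, while an in-scope issue that is not a human factors issue would go to the engineering database.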

The following section of the OER RSR contains information related to tracking OER entries (Criterion 3.4.2(4)):

  • Appendix I to the RSR, Sample of Open Issues Being Tracked, identifies a sample of open items that are still being tracked in the HFEITS.

18.2.4.2.2 Staff Assessment

The staff reviewed the OER RSR sections noted above and compared the descriptions of the methodologies used to the applicable NUREG-0711 criteria.

The staff found that although a specific NuScale procedure for conducting OER is not referenced, the OER RSR provides adequate detail for the process to be implemented as described. The staff determined this treatment to be adequate to meet Criterion 3.4.2(1).


The staff assessed the methodology described in Sections 2.2 and 3.6.3 and found that it is consistent with Criterion 3.4.2(2). The staff found that the entries in the appendices illustrate that the process described in the methodology is producing results related to human performance issues, sources of human error, and design elements that support human performance.

The staff reviewed the screen shots of the databases and concluded that the database structure captures the relevant parameters necessary to implement the program as described in the OER RSR. In addition, the staff reviewed the OER database during the May 2017 audit (ADAMS Accession No. ML17181A415) and found it to be an adequate means of documenting the OER process in accordance with Criterion 3.4.2(3).

In June 2018, the staff audited the applicant's OER analysis (ADAMS Accession No. ML18208A370). The staff confirmed that the sample of results presented in the OER RSR was representative of the full set of results contained in the OER database by reviewing a sample of OER items and ensuring that they were consistent with the applicable acceptance criteria.

18.2.4.2.3 Conclusion

The staff finds that the methodology described in the OER RSR is consistent with the relevant NUREG-0711 criteria as described above. In addition to reviewing the relevant sections of the OER RSR, the staff audited the OER database and HFEITS database during the May 2017 audit. The OER database was found to be a sufficient means to document and track OER analyses. The June 2018 audit confirmed that the results of the OER process in the OER database are consistent with those results reported in the OER RSR. Therefore, the staff finds this treatment to be an acceptable means of meeting this review criterion.

Combined License Information Items

No COL information items are associated with DCA Part 2 Tier 2, Section 18.2.

Conclusion

The staff reviewed the methodology used to conduct the OER analysis described in the OER RSR and the RAI responses described above and found that the methodology conforms to the relevant NUREG-0711 acceptance criteria.

The applicant provided several appendices in the OER RSR that include samples of the final data. According to NUREG-0711, Section 3.3, the OER RSR should represent complete work.

During its audit in June 2018, the staff compared the sample of OER data presented in the appendices to the analyses found in the OER database. The staff found that the results in the OER RSR were consistent with the results in the OER database. Therefore, the staff finds this treatment to represent an acceptable OER analysis.

18.3 Functional Requirements Analysis and Function Allocation

Introduction

Functional Requirements Analysis (FRA) is the identification of functions that must be performed to satisfy overall plant goals (e.g., safe operation, power generation). Function allocation (FA) is the analysis of requirements for plant control and the assignment of control functions to (1) personnel (e.g., manual control), (2) system elements (e.g., automatic control and passive, self-controlling phenomena), and (3) combinations of personnel and system elements (e.g., shared control, automatic systems with manual backup).

The objective of the staff's review is to verify that (1) the plant's functions that must be performed to satisfy plant safety objectives have been defined, and (2) the allocation of those functions to human and system resources has resulted in a role for personnel that takes advantage of human strengths and avoids human limitations.

Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.3, Functional Requirements Analysis and Function Allocation.

ITAAC: There are no ITAAC associated with this element.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)
  • NUREG-0711, Revision 3, Human Factors Engineering Program Review Model, Chapter 4, Functional Requirements Analysis and Function Allocation, Section 4.4, Review Criteria, issued November 2012

The following documents also provide additional guidance in support of the SRP acceptance criteria to meet the above requirements:


  • NUREG/CR-7126, Human-Performance Issues Related to the Design and Operation of Small Modular Reactors, issued June 2012
  • NUREG/CR-3331, A Methodology for Allocation of Nuclear Power Plant Control Functions to Human and Automated Control, issued 1983
  • NUREG/CR-7202, NRC Reviewer Aid for Evaluating the Human-Performance Aspects Related to the Design and Operation of Small Modular Reactors, issued June 2015 Technical Evaluation The NuScale design will rely on automation more so than operating plants do; therefore, the staff gave special consideration to the FA portion of the review to ensure that the issues described in NUREG/CR-7126 and NUREG/CR-7202, specifically, the relevant issues in Appendix A.2, Functional Requirements Analysis and Function Allocation, are adequately addressed, in addition to the NUREG-0711 criteria.

NUREG-0711, Section 4.4, contains nine acceptance criteria, eight of which are applicable to DC applicants (the ninth is applicable only for modifications). The staff used the applicable criteria to review the FRA/FA RSR (RP-0316-17615) to ensure the objectives are met as discussed below. In addition, issues of significant interest with regard to human/automation interaction may be observed during ISV testing.

Methodology (Criteria 4.4(1)-(2))

NUREG-0711, Section 4.4, Criteria 4.4(1)-(2), address the methodology used for the FRA/FA processes. Criterion 4.4(1) focuses on ensuring a structured and documented methodology that reflects HFE principles, and Criterion 4.4(2) aims to ensure that the process is iterative.

18.3.4.1.1 Summary of Application

Information supporting the documentation of the FRA/FA methodology reflecting HFE principles (Criterion 4.4(1)) includes a high-level summary of the FRA and FA methodologies in DCA Part 2 Tier 2, Sections 18.3.2-18.3.2.2. Figure 3-1, FRA/FA activity and information flow, of the FRA/FA RSR illustrates both types of analysis, described in detail in the subsections below.

18.3.4.1.1.1 Functional Requirement Analysis Methodology

DCA Part 2 Tier 2, Section 18.3.2.1, Objectives and Scope, describes the FRA methodology.

This method describes an iterative process that identifies specific plant-level functions and decomposes these functions into the subfunctions, system functions, processes, and components necessary to accomplish each plant-level function. The plant-level functions are identified in the FRA/FA RSR, Table 3-1, NuScale Plant Functions.
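The decomposition described above (plant-level function into subfunctions, system functions, and supporting components) can be pictured as a small tree walk. The function and system names below are invented purely for illustration; the actual set of plant-level functions is documented in the FRA/FA RSR, Table 3-1.

```python
# Hypothetical sketch of the hierarchical decomposition an FRA performs:
# plant-level function -> subfunction -> system function -> components.
# All names here are illustrative, not NuScale's actual FRA results.

decomposition = {
    "maintain core heat removal": {          # plant-level function
        "remove decay heat": {               # subfunction
            "circulate primary coolant": [   # system function
                "reactor coolant system",    # supporting systems/components
                "decay heat removal system",
            ],
        },
    },
}

def leaf_components(node) -> list:
    """Flatten the decomposition tree to the components that ultimately
    support a plant-level function."""
    if isinstance(node, list):
        return list(node)
    out = []
    for child in node.values():
        out.extend(leaf_components(child))
    return out
```

Walking the tree for the example plant-level function yields the two supporting systems, which is the kind of traceability (function down to component) that the decomposition is intended to provide.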

The FRA/FA RSR, Section 2.1, FRA/FA Process Overview, provides a brief overview of the FRA/FA processes. The FRA/FA RSR elaborates on them in Sections 3.0-3.8 and provides an export-controlled methodology for developing plant functional requirements, performing function decomposition and requirements analysis, conducting FA, and documenting the processes illustrated in Figure 3-1.


18.3.4.1.1.2 Function Allocation Methodology

DCA Part 2 Tier 2, Section 18.3.2.2, Function Allocation Methodology, describes the FA methodology. This method systematically assigns functions to automation, manual operation, or a combination of both. This process considers relevant concerns related to safe operation, including repetition of action, operator safety, likelihood of errors, and several others.

The FRA/FA RSR, Section 3.5, Automation Philosophy, indicates that the overall philosophy is to use automation to support the needs of the operator. NuScale's strategy is that automation should be used for routine tasks and error-prone tasks and that interlocks should be used to prevent operators from performing undesired actions.

The FRA/FA RSR, Section 3.6, Automation Criteria, provides specific descriptions about how the applicant allocates functions to automation, operators, or a combination of both. The process uses SMEs to consider the tasks and to select an appropriate allocation. Table 3-2, Levels of automation, identifies and defines various levels of automation that the NuScale design uses.
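The allocation philosophy summarized above (automate routine and error-prone tasks, interlock undesired actions, and otherwise leave the operator in direct control) can be sketched as a small decision function. This is a simplified assumption-laden illustration: the three task attributes and the three outcome labels are invented here and do not reproduce the actual automation levels defined in the FRA/FA RSR, Table 3-2, or the SME judgment the process relies on.

```python
# Illustrative sketch of the allocation logic implied by the automation
# philosophy in the FRA/FA RSR, Sections 3.5-3.6. The attributes and
# outcome labels are hypothetical, not NuScale's Table 3-2 levels.

def allocate(routine: bool, error_prone: bool, undesired: bool) -> str:
    """Assign a function to an allocation category from simple task attributes."""
    if undesired:
        return "interlock"    # prevent the operator from performing the action
    if routine or error_prone:
        return "automatic"    # automation supports the operator on these tasks
    return "manual"           # operator retains direct control
```

In practice the real process is richer than a three-way rule: SMEs weigh each task against several concerns (repetition, operator safety, likelihood of error) before selecting one of multiple defined levels of automation.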

DCA Part 2 Tier 2, Section 18.3.2, Methodology, describes information related to the iteration of the FRA/FA (Criterion 4.4(2)), which indicates that the FRA/FA process is iterative and is kept current throughout the plant life cycle. Similarly, the FRA/FA RSR, Section 2.1, indicates that the process is performed iteratively throughout the design process and is kept current through decommissioning.

18.3.4.1.2 Staff Assessment

With regard to Criterion 4.4(1), the staff reviewed Figure 3-1 of the FRA/FA RSR and found that it documents an analysis process that is both logical and structured. The FRA/FA database documents the analyses (Section 4.2, FRA/FA Database, of the FRA/FA RSR provides additional details). The staff reviewed the structure of the database and a sample of preliminary results during the May 2017 audit (ADAMS Accession No. ML17181A415). Although the staff found that some of the FRA/FA analyses were not complete at the time, those that were complete were consistent with the examples shown in the FRA/FA RSR (Appendices B-G). In June 2018, the staff conducted a follow-up audit of the FRA/FA database (ADAMS Accession No. ML18208A370). During this audit, the staff confirmed that the entries in the database were complete and consistent with the methodology described in the FRA/FA RSR.

The HFE principles discussed in Sections 3.5, Automation Philosophy, and 3.6, Automation Criteria, of the FRA/FA RSR were consistent with the goal of FA (to use the strengths of humans and automation to optimize system performance) with one exception. The FRA/FA RSR did not explicitly indicate that the applicant had considered unique human factors aspects associated with SMRs (see NUREG/CR-7202), such as the increased use of automation and the increased workload associated with multimodal operation. During the June 2018 audit (ADAMS Accession No. ML18208A370), the staff reviewed the FRA/FA database to confirm that it adequately addressed the SMR-specific HFE considerations described in NUREG/CR-7202. The staff reviewed a sample of the topics in NUREG/CR-7202 and concluded that the applicant had adequately considered them in the FRA/FA process.

In addition, the staff considered the results of the SPV audit, which took place in August 2016 (ADAMS Accession No. ML16259A110). These results provide preliminary evidence that the automation schemes used in the NuScale design have been successful in managing high-workload conditions during scenario testing. Furthermore, a more rigorous test of the final integrated system will be conducted during the ISV.

With regard to Criterion 4.4(2), the staff compared the statement in DCA Part 2 Tier 2, Section 18.3.2, to the criterion and found it to be consistent with the intent of the criterion, but it was unclear from the FRA/FA RSR how the process will be managed by the COL applicant. The staff issued RAI 9220, Question 18-22 (ADAMS Accession No. ML18077A000), to address this concern and to clarify when iterations will occur.

The response to RAI 9220, Question 18-22 (ADAMS Accession No. ML18115A441), indicates that NuScale will keep the FRA/FA current in accordance with Regulatory Guide (RG) 1.206, Combined License Applications for Nuclear Power Plants (LWR Edition), issued June 2007.

Future COL applicants will need to prepare an application in accordance with RG 1.206, which instructs them to submit an update to DCA Part 2. RG 1.206, Section C.I.18.3.2.1, Methodology for Functional Requirements Analysis, allows the COL applicant to credit a predecessor design in this process; in this case, the predecessor would be the certified design. Any changes to this design that affect the FRA should be addressed as part of the COL application. RG 1.206, Section C.I.18.3.2.2, Methodology for Function Allocation Analysis, provides similar guidance on the COL applicant's treatment of FA.

The response to RAI 9220, Question 18-22, also indicates that iterations to the FRA/FA will occur when modifications are proposed, rather than at predetermined periodic intervals. This strategy will address any relevant concerns that may arise after the initial DC approval, thus ensuring that the FRA/FA remains current over time. Therefore, the staff finds this treatment to be acceptable.

18.3.4.1.3 Conclusions

The staff finds that the FRA/FA methodology is structured and documented in accordance with Criterion 4.4(1) of NUREG-0711. The staff confirmed during the June 2018 audit that the NuScale process adequately addresses the HFE principles applicable to SMRs described in NUREG/CR-7202. Therefore, the staff finds that this treatment conforms to Criterion 4.4(1).

The staff finds that the FRA/FA process is iterative. Reliance on RG 1.206 provides an adequate means of ensuring that the FRA/FA is updated accordingly. Therefore, the staff finds this treatment conforms to Criterion 4.4(2) of NUREG-0711.

A revision to RG 1.206 was issued in October 2018, after NuScale's response to RAI 9220. Revision 1 of RG 1.206 states, "The applicable DCR appendix to 10 CFR Part 52 requires COL applicants to provide a report to the NRC that contains a brief description of any plant-specific departures from the DCD with a summary of the evaluation for each departure. The DCR also requires each applicant to maintain and submit updates to its plant-specific DCD, which consists of the generic DCD and plant-specific departures. To fulfill these requirements, the applicant may provide a report separate from the FSAR with the description and evaluation for each departure and include a summary table in this section of the FSAR that lists each departure and the FSAR sections that address each departure."

Although RG 1.206, Revision 1, contains no specific guidance related to FRA/FA, the DC rule still requires the COL applicant to inform the staff of any changes from the generic DCD. Because the FRA/FA RSR is incorporated by reference in the DCD, any changes to it would need to be identified in the COL application. Accordingly, significant changes to the FRA/FA would still need to be identified in a COL application prepared using Revision 1 of RG 1.206. Therefore, the staff finds this to be an adequate means of ensuring that a COL applicant will provide the information regarding changes to the FRA/FA during the COL application process.

Functional Requirements Analysis Results (Criteria 4.4(3)-(4))

NUREG-0711, Section 4.4, Criteria 4.4(3)-(4), focus on ensuring that the results of the FRA are adequate. Criterion 4.4(3) gives specific properties that the plant's functional hierarchy should address. Criterion 4.4(4) focuses on identifying design requirements associated with the high-level plant functions identified in the plant's functional hierarchy.

18.3.4.2.1 Summary of Application

The following application materials address Criterion 4.4(3):

  • DCA Part 2 Tier 2, Section 18.3.2.1, describes the functional hierarchy/task decomposition at a high level. This section gives two high-level goals: plant safety and power generation. The listed plant-level functions support these goals. In addition, Figures A-1, CVCS decomposition for fuel assembly heat removal and reactivity control, and A-2, Example of remove fuel assembly heat decomposition during operation, in the FRA/FA RSR illustrate a sample of vertical slices through the NuScale functional hierarchy that resemble Figure 4-1, CVCS system description & purpose, of NUREG-0711.
  • DCA Part 2 Tier 2, Section 18.3.2.1, addresses the treatment of predecessor plant systems and compares them to traditional nuclear power plant systems and functions. It describes how the applicant decomposes the high-level functions in a way that ultimately supports the FA process.
  • The FRA/FA RSR, Section 3.2, Plant Functional Requirement Development, reiterates much of the process identified in DCA Part 2 Tier 2, with additional detail. It lists the NuScale plant functions in Table 3-1, next to NuScale design features intended to support each function. Section 2.1 indicates that the applicant defines the plant functions using the design reliability assurance program expert panel. SRP Section 17.4, Reliability Assurance Program (RAP), addresses the review of the design reliability assurance program process.
  • The FRA/FA RSR, Section 4.4.2, Predecessor Designs, illustrates how the FRA/FA database is used to consider and document NuScale functions compared to predecessor designs and to assess their influence on plant functions.

The following sections of the application address Criterion 4.4(4):

  • DCA Part 2 Tier 2, Section 18.3.1, Objectives and Scope, indicates that the purpose of the FRA/FA process is to ensure that safety and power generation goals are sufficiently defined, analyzed, and allocated.


  • The FRA/FA RSR, Section 4.4, Functional Requirements Analysis and Database Examples of Results, describes the applicant's FRA/FA database. Sections 4.4.1-4.4.8 describe how the database documents each of the bulleted items in the criterion. In addition, Figures 4-1 through 4-24 show sample entries from the FRA/FA database for three different systems that correspond directly with the bullets.

18.3.4.2.2 Staff Assessment

With respect to Criterion 4.4(3), the applicant provided a reasonable high-level description of the FRA/FA process in DCA Part 2 Tier 2, supported by the diagrams in Appendix A, System Decomposition, to the FRA/FA RSR, which closely resemble the functional hierarchy shown in NUREG-0711, Chapter 4, Functional Requirements Analysis and Function Allocation. The tables below the diagrams provide examples of how the data in the FRA/FA database are represented. Appendix A shows only a sample of two sections of the functional hierarchy. This is reasonable because the full functional hierarchy is very large and would be difficult to represent on paper. The functional decomposition breaks down the systems into progressively more specific subsystems and components, as described in the criterion.

Although NuScale does not have an immediate predecessor design, the applicant considered systems and functions that resemble those of pressurized-water reactors, consistent with the intent of the criterion. The FRA/FA database captures this comparison alongside other important system information.

Figures 4-10 through 4-24 of the FRA/FA RSR illustrate how the FRA/FA database identifies system configurations necessary for safe operation. These screenshots show a variety of parameters the operator must understand, such as when the associated function is necessary, working, and ready for termination.

The applicant credited the design reliability assurance program in this process (see SRP Chapter 17, Quality Assurance, Section 17.4, and ER-0000-3387, NuScale Plant Functions, Revision 0).

The staff issued RAI 9381, Question 18-15 (ADAMS Accession No. ML18068A728), to clarify the nature of the safety functions compared to the critical safety functions (CSFs) necessary for other HFE review elements (such as the HSI design review element in NUREG-0711, Chapter 8, Human-System Interface Design) and review under SRP Chapter 13, Conduct of Operations. The response to RAI 9381, Question 18-15 (ADAMS Accession No. ML18114A351), clarifies that there are three safety functions, which are equivalent to the CSFs (maintain containment integrity, reactivity control, and remove fuel assembly heat). This clarification was sufficient to close the RAI.

In June 2018, the staff conducted an audit of the FRA/FA database (ADAMS Accession No. ML18208A370). The staff confirmed that the results in the database were adequately represented by the sample of results in the FRA/FA RSR. The results reviewed were consistent with the method described in the FRA/FA RSR.

With respect to Criterion 4.4(4), the staff reviewed the sample database entries found in the FRA/FA RSR and found them to contain entries for each of the bulleted areas listed in this criterion. The structure of the database helps to ensure that the FRA/FA process will include those bulleted items of the criterion. The staff finds that the use of the FRA/FA database is an effective means for working through and documenting the process. In addition, the June 2018 audit of the FRA/FA database (ADAMS Accession No. ML18208A370) confirmed that the results of the process were consistent with the method described in the FRA/FA RSR.

18.3.4.2.3 Conclusion

The staff finds that the methodology described in the FRA/FA RSR is consistent with the applicable NUREG-0711 criteria because Criteria 4.4(3)-(4) are met as described above. The staff confirmed during the June 2018 audit that the results in the FRA/FA database were consistent with the FRA/FA methodology. Therefore, the staff concludes that this treatment of Criteria 4.4(3)-(4) is acceptable.

Function Allocation Results (Criteria 4.4(5)-(7))

Criteria 4.4(5)-(7) address the results of the FA. Criterion 4.4(5) indicates that the FA should identify the level of automation for each function as well as the technical bases for the allocation. Criterion 4.4(6) indicates that the FA should address primary actions taken by the operator as well as other operator actions, such as monitoring automation, detecting degradations/failures, and assuming manual control. Criterion 4.4(7) addresses the overall role of the operators while considering all functions allocated to them.

18.3.4.3.1 Summary of Application

The following sections of the application address Criterion 4.4(5):

  • DCA Part 2 Tier 2, Section 18.3.2.2, provides high-level rules for determining the appropriate allocation and level of automation. The applicant expanded on these rules in the FRA/FA RSR, Section 3.5, which provides the set of conditions used by SMEs to decide whether a function should be allocated to the human, the automation, or a combination of both. In addition, Section 3.6 of the FRA/FA RSR presents eight automation criteria used by SMEs to allocate functions to the specific levels of automation defined in Table 3-2 of the FRA/FA RSR.
  • The FRA/FA RSR, Section 4.5, Function Allocation Example, provides a partial FA table. This table of export-controlled information represents a sample of the results of the FA process. The table presents tasks that are paired with the assigned allocation, technical basis for the allocation, and a description of the role of the operator while performing/monitoring the task.

The following sections of the application address Criterion 4.4(6):

  • DCA Part 2 Tier 2, Section 18.3.2, provides a very high-level indication that the process will address this criterion, stating that the HFE team determines the conditions and parameters necessary for monitoring and control.
  • The FRA/FA RSR, Section 2.1, provides additional detail and states, "FA provides a framework for determining roles and responsibilities of personnel and automation."
  • The FRA/FA RSR, Section 4.5, describes the information that FA tables are to include. Table 4-2, Partial function allocation table for CVCS, shows a partial FA table. Appendices E-G show three sample allocation tables for select systems. These tables show the allocation (including the level of automation used, e.g., automatic with the consent of the operator), the technical basis for the allocation, and the role of the operator.

When the function is allocated to manual control, the table typically includes a brief description of what the operator must do. When the allocation is to automation, the role of the operator typically is to monitor and take control when automation fails. Although Table 4-2 does not explicitly describe how the operator will understand that the automation has failed (i.e., parameters to monitor), the FRA/FA collects and stores information needed by the operator to successfully monitor and back up failed automation (see the FRA/FA RSR, Sections 4.4.4-4.4.8). Moreover, the TA process iterates and supplements this information.

The following sections of the application address Criterion 4.4(7):

  • Although the Tier 1 and Tier 2 material associated with the FRA/FA does not explicitly address an overall operator role, the FRA/FA RSR contains much information about how operator roles are developed:

- The FRA/FA RSR, Table 4-1, VISION Icon Descriptions, has an entry for Job Position, which is defined as "A way to determine the roles and responsibilities of a task." This indicates that the roles and responsibilities of a task are considered in the FRA/FA database. In addition, Table 4-1 has an entry for Tasks, which are defined as "A well-defined unit of work having an identifiable beginning and end which is a measurable component of a specific job."

- The FRA/FA RSR, Section 4.5, provides an example of a partial FA table. It shows entries in the Role of the Operator column, which describe how operators will interact with various systems during tasks.

- Appendices E-G of the FRA/FA RSR show sample allocation tables. These tables show the allocation, technical basis for the allocation, and the role of the operator. When the allocation is to automation, the role of the operator typically is to monitor and take control when automation fails.

  • The applicant compiled a more comprehensive description of operator roles in the ConOps:

- The ConOps, Section 2.2, Operations Crew Composition, Qualification, Training and Command and Control, describes the crew composition, qualifications, and training and the basic command and control concept. Section 2.1, Plant Mission, describes additional duties associated with the SROs and ROs, including tasks that go beyond direct manipulation of plant controls, such as implementation of the emergency plan, directing/overseeing staff, and conducting surveillances.

- The ConOps, Section 2.3, Operator Roles and Responsibilities, covers the roles and responsibilities of operators and describes how operators should control and monitor plant functions and communicate with other team members.

- The ConOps, Section 2.4, Machine Agent and Shared Roles, describes the roles of machine agents (automation) and shared roles between the machine and human operators. It describes various methods through which the operator may communicate with the automated system (such as setting control parameters, initiating actions, securing automation, or making manual adjustments to automated processes).

- The ConOps, Section 2.4.3, Parameter Monitoring, defines conditions in which the operator should increase his or her interaction with a system and explains when the operator should intervene to interrupt an automated process.

18.3.4.3.2 Staff Assessment

With regard to Criterion 4.4(5), the staff reviewed the FRA/FA RSR, Table 4-2, and found that it identifies the allocations (including levels of automation) and the technical bases for the functions and components listed in the table. This constitutes reasonable evidence that the process described in the FRA/FA RSR will provide results that are consistent with the criterion.

In June 2018, the staff audited the FRA/FA database (ADAMS Accession No. ML18208A370). The staff reviewed a sample of entries and confirmed that the sample of results in the FRA/FA RSR adequately represented the contents of the database. Therefore, the staff concludes that both the methodology described in the FRA/FA RSR and the results of the process are sufficient to meet Criterion 4.4(5).

With regard to Criterion 4.4(6), the staff reviewed the description of information in the FRA/FA RSR, Section 4.5, and the examples in the appendices. The columns in the tables show that some tasks are assigned to manual control, indicating that the task is a primary responsibility of the operator. The tables also demonstrate that those conducting the FA process should identify secondary tasks, including monitoring, detection of degradations, and assumption of manual controls, as part of the FA process. The staff examined a sample and found this to be the case.

The FRA/FA database supports the documentation of the information needed to support the allocations identified in the FRA/FA RSR, Section 4.5.

The staff issued RAI 9370, Question 18-25 (ADAMS Accession No. ML18078B315), to clarify a statement in Section 1.2, Scope, of the FRA/FA RSR, which refers to "direct operator action." Some reasonable interpretations of this term might conclude that it excludes tasks such as monitoring and the detection of failures, which are perceptual/cognitive tasks that do not require physical operator action. The response to RAI 9370, Question 18-25 (ADAMS Accession No. ML18103A153), clarified that interactions include activities such as monitoring (an activity that does not necessarily involve physical action by the operator); therefore, this terminology does not inappropriately limit the scope of the analysis.

The staff finds this methodological treatment to be consistent with the acceptance criteria and, therefore, acceptable. The June 2018 audit (ADAMS Accession No. ML18208A370) confirmed that the results in the FRA/FA RSR are representative of the set of results in the FRA/FA database.

With regard to Criterion 4.4(7), the applicant described the role of the operators in the ConOps. The role descriptions cover the expected interactions between operators and automation and consider other tasks that may interfere with this interaction (such as supervising staff or implementing the emergency plan).


The ConOps describes the operator roles at a relatively high level. The SPV tested the concept of operations and confirmed that the operator roles can be effective in the MCR. The staff audited the SPV (ADAMS Accession No. ML16137A257). The staff finds this description of operator roles in the ConOps to be an acceptable means of meeting this criterion because the high-level description of the roles has been tested with satisfactory results in the SPV. Moreover, additional testing during the ISV can be used to further refine the details of the operator role.

18.3.4.3.3 Conclusions

The staff reviewed the methodology in the FRA/FA RSR and found that it conforms to the guidance in NUREG-0711 as described above. The results presented in the FRA/FA RSR were derived using this process. In addition, the staff confirmed that the results in the FRA/FA RSR are representative of the results in the FRA/FA database. Therefore, the staff finds this an acceptable means to meet these NUREG-0711 criteria.

Verification that Functional Requirements Analysis/Function Allocation Is Complete (Criterion 4.4(8))

Criterion 4.4(8) focuses on verifying that the results of the FRA/FA are complete and have accomplished the objectives.

18.3.4.4.1 Summary of Application

Several sections of the submitted materials address goals similar to this criterion:

  • DCA Part 2 Tier 2, Section 18.3.2.1, states, "The HFE team members review the FRA and verify that all high-level functions necessary to achieve safe operation have been identified and analyzed along with the requirements for each of the identified functions." The verification is documented in the FRA and function allocation database.

  • DCA Part 2 Tier 2, Section 18.3.3, Results, concludes that the FRA/FA processes were conducted in a manner that is consistent with the criterion. Specifically, it indicates that FRA/FA process results include a set of safety functions and provides a pointer to the FRA/FA RSR. It indicates that requirements for each high-level function are identified (e.g., conditions when the function is needed, indication that function is available). It also indicates that the FRA/FA RSR contains the allocation of functions and technical basis.
  • The FRA/FA RSR, Section 3.0, Methodology, describes the methods used to conduct the FRA/FA processes. Section 4.0, Summary of Results, summarizes the results, and Appendices B-G provide sample database entries.
  • The FRA/FA RSR, Sections 3.4-3.6, provide information about how the allocations to humans and automatic systems are conducted in a way that takes advantage of human strengths and avoids human limitations.

18.3.4.4.2 Staff Assessment

The methodologies described provide a means to identify high-level functions needed for safe operation and to track the requirements of the high-level functions (via the FRA/FA database).


In addition, the FRA/FA RSR provides rules for allocating functions to automation that are consistent with good human factors practice (e.g., using automation for repetitive and predictable tasks, using automation when fast results are necessary, and using automation when it is unsafe for an operator to perform a task).

However, the application contains little information on verification that the goals described in this criterion have been accomplished. The FRA/FA RSR, Section 5.0, Analysis of Conclusions, describes the interdisciplinary approach used in the design/analysis process. It indicates that the results are reviewed and evaluated but provides little detail as to how this is done.

Moreover, the applicant submitted the FRA/FA RSR before an NRC audit that found that several of the analyses were not yet complete.

The staff was unable to independently confirm during the May 2017 audit that the execution of these processes was complete. The full intent of the criterion is to confirm that the final product of the FRA/FA process is of high quality. The criterion's use of the word "verify" implies that the process must be complete. In addition, the first two bullets of the criterion use the word "all," indicating that a sample of results is not adequate to satisfy this criterion. Moreover, the fact that the applicant submitted the FRA/FA RSR before the completion of FRA/FA activities suggests that any verification process that was executed was ineffective. Therefore, the staff issued RAI 9372, Question 18-14 (ADAMS Accession No. ML18068A727), to clarify what, if any, verification had taken place. The response to RAI 9372, Question 18-14 (ADAMS Accession No. ML18114A822), indicates that an interdisciplinary team verified that the high-level functions reported in the FRA/FA RSR, Table 3-1, were accurate.

The FRA/FA RSR, Section 5.0, discusses this verification. The verification of allocations was initially assessed during the SPV and will be tested during the ISV (to occur concurrently with the NRC Phase 4 review). Because this is credited as a verification activity, RAI 9372, Question 18-14, remains Open Item 18-1 until the ISV is complete and a determination has been made that the allocations are appropriate.

18.3.4.4.3 Conclusion

The staff finds that the methodologies described are consistent with the applicable criteria in NUREG-0711; therefore, the methodologies are acceptable. However, because ISV testing is credited as a means of satisfying Criterion 4.4(8), the staff should confirm that the verification of the FRA/FA is complete and accurate after the test is complete. Therefore, the applicant has not met this criterion until a final verification of the FRA/FA process is completed and confirmed.

Combined License Information Items

No COL information items are associated with NuScale DCA Part 2 Tier 2, Section 18.3.

Conclusion

The staff reviewed the FRA/FA RSR and found that the description of the methodology summarized above, when supplemented by the RAI responses and the ConOps, is satisfactory to meet the criteria as described above.

The June 2018 audit concluded that the results in the FRA/FA database are consistent with NUREG-0711 and that the results in the FRA/FA RSR adequately represent the contents of the database. Therefore, the staff concludes that the methodology and results of the FRA/FA analyses documented in the FRA/FA RSR are consistent with NUREG-0711.

Criterion 4.4(8) focuses on verifying the completeness of the FRA/FA process. The June 2018 audit confirmed the preliminary allocations (before ISV testing). However, these allocations may change as a result of ISV testing. Therefore, the staff cannot make a final determination of this review element until the final verification is conducted. The staff will confirm the completion of Criterion 4.4(8) during the NRC Phase 4 technical review (see Open Item 18-20 in Section 18.10.4.5.4 of this SER).

18.4 Task Analysis

Introduction

TA identifies the tasks that plant personnel must perform to accomplish the functions that are allocated to HAs. TA also identifies the alarms, information, controls, and task support that must be available for plant personnel to successfully perform these tasks. TA generates input to several program elements: S&Q, HSI design, procedure development, training program development, and V&V.

Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.4, Task Analysis.

ITAAC: There are no ITAAC associated with this element.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8), as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factors principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)

  • NUREG-0711, Revision 3, Human Factors Engineering Program Review Model, Chapter 5, Task Analysis, Section 5.4, Review Criteria.

Technical Evaluation

The staff used the criteria in NUREG-0711, Section 5.4, to evaluate the applicant's TA RSR.

NUREG-0711, Section 5.4, includes 10 criteria for this topic. The tenth criterion addresses plant modifications and is not applicable to new reactors; thus, the staff evaluated only the first nine criteria, as discussed below.

Scope (Criterion 5.4(1))

The staff reviewed the TA RSR, Section 3.2, Task Screening, which addresses the scope of the applicant's TA. The staff compared this information to Criterion 5.4(1), which lists tasks that should be part of the scope of the applicant's TA, including (1) all IHAs (determined by probabilistic and deterministic means), (2) tasks that represent the full range of plant operating modes, and (3) eight specific types of tasks as listed in Criterion 5.4(1) (e.g., tasks that are new compared to those in predecessor plants).

In the TA RSR, Section 3.2, the applicant stated that the scope of its TA includes IHAs (probabilistic and deterministic), the full range of plant operating modes, and the eight types of tasks listed in Criterion 5.4(1).

In addition, the staff reviewed a sample of TA results provided in the application as well as during a May 2017 audit (ADAMS Accession No. ML17181A415). The staff observed that the tasks included were within the applicant's stated scope. Accordingly, the staff finds that the application conforms to this NUREG-0711 criterion.

Screening Methodology (Criterion 5.4(2))

The staff reviewed the HFE PMP; DCA Part 2 Tier 2, Section 18.4.2.1, Task Identification Methodology; and the TA RSR, Section 3.3.2, Surveillance, Test, Inspection, and Maintenance Procedure Tasks. The staff compared this information to Criterion 5.4(2), which states that the applicant should describe the screening methodology used to select the tasks for analysis, based on criteria specifically established to determine whether analyzing a particular task is necessary.

The HFE PMP, Section 6.3, states, "Tasks are first screened. From the wide range of plant operating conditions, any task that meets the following criteria receives a more detailed TA." A list of eight criteria follows the statement. Additionally, in DCA Part 2 Tier 2, Section 18.4.2.1, the applicant further explained that "[d]etermination of tasks to be analyzed is performed by subject matter experts on the basis of their experience at current operating nuclear plants. The process typically includes review of operating experience and available system design material."

In the TA RSR, Section 3.3.2, the applicant stated the following:

To select which risk-significant surveillance, test, inspection, and maintenance tasks are to be analyzed, the SME reviews the design material available, including system design packages, piping and instrument diagrams (P&IDs), logic diagrams, and electrical schematics for each system the task involves. Activities that, by SME judgment, have challenged operating crews at current commercial U.S. operating nuclear plants, or which potentially impact the ability of a NuScale plant operating crew to manage up to twelve units in one control room, are selected for TA. An SME who did not conduct the evaluation for a specific system reviews the results documentation for completeness and confirmation of the task selections.

The staff issued RAI 8805, Question 18-3 (ADAMS Accession No. ML17255A737), requesting clarification on the applicant's screening methodology. The response to RAI 8805, Question 18-3 (ADAMS Accession No. ML17304B488), stated that detailed TA was performed on all tasks.

The applicant stated it would revise the DCA Part 2 Tier 2, Section 18.4, Task Analysis; the HFE PMP; and the TA RSR to specify that all tasks, regardless of their importance, were analyzed. Therefore, the staff concludes that screening criteria did not need to be established because the applicant chose to analyze all tasks that were included in the scope of TA.

RAI 8805, Question 18-3, is being tracked as a Confirmatory Item pending the incorporation of the changes into the next revision of the DCA Part 2.

Task Attributes and Iterative Process (Criteria 5.4(3)-(8))

NUREG-0711, Criteria 5.4(3)-(7), state that the applicant should (1) begin TA with detailed narratives of what personnel have to do, along with specifying the alarms, information, controls, and task support needed to accomplish the task, (2) identify the relationships among tasks, (3) estimate the time required to perform tasks, (4) identify the number of people required to perform each task, and (5) identify the knowledge and abilities required to perform each task.

Criterion 5.4(8) states that the applicant's TA should be iterative and updated as the design is better defined. The staff reviewed both the DCA Part 2 and multiple sections of the TA RSR and compared this information to Criteria 5.4(3)-(8).

Task Narrative (Criterion 5.4(3)). In DCA Part 2 Tier 2, Section 18.4.2.2, Personnel Task Narrative, the applicant described the task narrative. It includes a description of the objectives of a specific system's operator tasks; an overview of the activities personnel are expected to accomplish to complete the task; a definition of alarms, information, controls, and task support needed to accomplish the task; and a basic outline of the procedure steps. The TA RSR, Section 3.5, Detailed Task Narratives, provides details on the information that is included in the task narrative (e.g., associated alarms, anticipated workload, communications needs). In the TA RSR, Table 3-1, Task Considerations, the applicant listed specific task considerations addressed in the task narratives, which is consistent with NUREG-0711, Table 5-1, Task Considerations.

The applicant addressed the processes used to identify relationships among tasks, estimate the time required to perform tasks, identify the number of people required to perform each task, and identify the knowledge and abilities required to perform each task in the following ways:

  • Relationships Among Tasks (Criterion 5.4(4)). In DCA Part 2 Tier 2, Section 18.4.2.3, Relationships Among Tasks, the applicant stated the following:


each task is decomposed by identifying the parent task, subtasks, and task elements. An operational sequence diagram is created and used for certain tasks as necessary to aid in evaluating the flow of information between the operators and the HSI from the beginning to the end of the task. Information flow includes operator decisions, operator and control activities, and the transmission of data.

  • Time Required (Criterion 5.4(5)). In DCA Part 2 Tier 2, Section 18.4.2.4, Time Required for Performing Tasks, the applicant stated the following:

The time required to complete a task is a combination of cognitive processing time, physical movement time, and HSI response time (e.g., screen navigation, control operation, I&C platform processing, plant system response). Calculations of time required for task performance consider decision-making (which may or may not be part of cognitive processing depending on task complexity), communications with the operations team, task support requirements, situational and performance-shaping factors, and workplace factors and hazards for each step of a task. The analysis of time required is also based on a documented sequence of operator actions. Time estimates for individual task components (e.g., acknowledging an alarm, selecting a procedure, verifying that a valve is open, starting a pump), and the basis for the estimates are established through a method applicable to the HSI characteristics of digital computer-based I&C.
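The additive structure in the quoted passage can be restated as a simple illustrative relation (the symbols below are shorthand introduced for this summary, not notation from the DCA):

```latex
T_{\text{required}} = T_{\text{cognitive}} + T_{\text{movement}} + T_{\text{HSI}}
```

where the HSI response term bundles screen navigation, control operation, I&C platform processing, and plant system response for the step being analyzed.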

  • Number of Personnel (Criterion 5.4(6)). In the TA RSR, Section 3.5.2, Personnel Required for Performing Tasks, the applicant stated, "The number of personnel required to perform each task is determined by the task narrative, complexity of the task, time required to perform the task, and the time available."
  • Knowledge and Abilities (Criterion 5.4(7)). In the TA RSR, Section 3.5.6, Knowledge and Abilities Identification, the applicant stated the following:

each task is analyzed to determine the knowledge and abilities needed for success of the task. The knowledge and abilities are benchmarked against a modern pressurized water reactor using NUREG-2103, and a gap analysis is performed. The results of this analysis are used to develop the NuScale-specific KA [knowledge and abilities] catalog written to specifically address the unique nature of the design.

The TA RSR, Section 4.4, Knowledge and Abilities, provides specific examples of the types of knowledge and abilities captured and how they are associated with tasks.

The staff reviewed task examples in the TA RSR, Section 4.0, Summary of Results, and confirmed that they contained detailed narratives of what personnel need to do to accomplish the task, as well as the alarms, information, controls, and task support personnel need to accomplish the tasks. The staff also found that the examples addressed each of the task considerations listed in NUREG-0711, Table 5-1. Further, the examples showed how task decomposition allows for the identification of relationships among tasks in the TA database used by the applicant. For example, each task is linked to the function(s) it supports, and each component, instrument, alarm, and control in the database is linked to the tasks it supports. Thus, task relationships can be identified via common functions, HSI components, and the like. The operational sequence diagrams demonstrate the sequential relationships between tasks.

The staff also found that the examples addressed the estimated time required to perform each task, the number of people required to perform each task, and the knowledge and abilities needed to perform each task. The staff concludes the examples provided are consistent with Criteria 5.4(3)-(7).

The staff conducted two audits of the applicant's TA results in the applicant's TA database to verify the methodology used and confirm the completeness of the TA results. During the May 2017 audit (ADAMS Accession No. ML17181A415), the staff found that the TAs sampled varied in their state of completion. For those analyses that were complete, the staff found that the analyses were consistent with Criteria 5.4(3)-(7). For tasks whose analyses were incomplete or only partially complete, the staff found that the task had been entered into the database and that the database included fields for each of the attributes discussed in Criteria 5.4(3)-(7).

During the audit, the applicant explained that the TAs had been completed for tasks that were part of the sample of tasks included in the SPV, which is discussed in more detail in SER Section 18.5.4. The applicant explained that it was continuing (1) to perform new TAs and modify existing analyses as the design developed and (2) to prepare for the final design validation test (i.e., the ISV test) scheduled in 2018.

The staff conducted an audit in June 2018 (ADAMS Accession No. ML18208A370) and found that the analyses for all of the tasks sampled had been completed, and the TAs conformed to Criteria 5.4(3)-(7). Therefore, the staff concluded that the applicant's TA was iterative and was updated as the design was developed, which is consistent with Criterion 5.4(8). Accordingly, the staff finds that the application conforms to these NUREG-0711 criteria.

Reliability and Feasibility (Criterion 5.4(9))

NUREG-0711, Criterion 5.4(9), states that the applicant should analyze the feasibility and reliability of IHAs and lists topics that should be considered in doing so.

The staff reviewed the TA RSR, DCA Part 2, and TIHA RSR and compared the information to Criterion 5.4(9). The TA RSR, Section 3.7, Analysis of Feasibility and Reliability for Important Human Actions, states that the time available to perform actions is the length of time from the initiation of the task to the time the task needs to be completed as defined in the analysis (i.e., the PRA). The TIHA RSR, Section 4.1, Identification of Risk Important Human Actions from the PRA/HRA, states that two IHAs associated with the NuScale plant design were identified. The applicant relied upon the PRA to specify the time available for IHAs.

The time required for the IHAs was analyzed using an integrated MCR simulator that reflected the NuScale design to date. This analysis was part of the SPV testing. The IHAs were simulated, and the time required for completion was recorded. Staffing for the analysis was nominal (i.e., three ROs, three SROs). The procedures developed from the applicant's TA guided the sequence of operator actions. The testing included the applicable alarms, controls, and displays.

The PRA originally specified 90 minutes as the time available to complete the IHAs; however, the staff observed some inconsistencies between the PRA information used in SPV testing and that submitted with the DCA. The staff issued RAI 9409, Questions 18-36 and 18-40 (ADAMS Accession No. ML18082B397), to seek clarification on the inconsistencies. The applicant's response to RAI 9409, Question 18-36 (ADAMS Accession No. ML18143B532), explained that

[[ ]]. However, the SPV results still remained within the applicant's established SPV acceptance criteria. The staff understands that this is an iterative process and that the results were still acceptable.

Additionally, the applicant established two criteria related to the amount of margin between time available and time required to perform the IHAs:

(1) IHAs must have been completed within [[ ]] of the time available, as calculated by the PRA (i.e., a [[ ]] margin). If this criterion was not met, then the scenario would not have been considered successful.

(2) All tasks with time constraints, including IHAs, that were not completed within [[ ]] of the time available (i.e., a [[ ]] margin) were [[ ]].
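In generic form (the specific fractions are withheld as proprietary in the application, so the symbol f below is introduced here only for illustration), a margin criterion of this kind can be written as:

```latex
T_{\text{required}} \le f \cdot T_{\text{available}}, \qquad
\text{margin} = \frac{T_{\text{available}} - T_{\text{required}}}{T_{\text{available}}} \ge 1 - f
```

where the time available is taken from the PRA and f is the allowed fraction of that time within which the action must be completed.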

Given that the most limiting times for these IHAs in the PRA are about 30 minutes; that the actions are generally simple; that operators are trained and procedures are available; that the IHAs all occur in the MCR; and that the controls and displays operators need to complete the IHAs are provided in the MCR design, the staff finds that the time margins and the estimate of time required are reasonable.

Because the actions were simulated in an MCR simulator that included the procedures for completing these IHAs, the staff concludes that the time for operators to complete these actions was sufficient to allow for the successful execution of the applicable steps in the procedures.

Therefore, the staff concludes the applicant addressed the topics in Criterion 5.4(9) and analyzed whether the IHAs can be performed reliably and feasibly. Thus, the staff finds that the application conforms to Criterion 5.4(9).

Combined License Information Items

No COL information items are associated with DCA Part 2 Tier 2, Section 18.4.

Conclusion

The staff evaluated the applicant's TA methodology and results and found that all of the criteria in NUREG-0711, Section 5.4, are met. Therefore, the staff concludes that the applicant's TA identifies the specific tasks personnel perform to accomplish their functions, identifies the necessary control room inventory to accomplish those tasks, and provides reasonable assurance that the operator tasks identified can be executed with the available inventory.

Accordingly, the staff finds this treatment to be acceptable.

18.5 Staffing and Qualifications

Introduction

The objective of the staff's review is to verify that the applicant has systematically analyzed the number and necessary qualifications of personnel in concert with task requirements and regulatory requirements.


Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.5, Staffing and Qualifications.

ITAAC: There are no ITAAC associated with this element.

Technical Specifications: The following TS are associated with this element:

  • TS 5.2.2 contains requirements for the minimum number of licensed operators at a NuScale plant.
  • TS 5.1.2 requires that the shift manager shall be responsible for the control room command function and that, during the shift manager's absence from the control room while any unit is in MODE 1, 2, 3, 4, or 5, an individual with an active SRO license shall be designated to assume the control room command function.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section II, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections:

  • NUREG-0711, Revision 3, Chapter 6, Staffing and Qualifications, Section 6.4, Review Criteria

The following documents also provide additional criteria or guidance in support of the SRP acceptance criteria to meet the above requirements:

  • Brookhaven National Laboratory (BNL) TR No. 20918-1-2015, Methodology to Assess the Workload of Challenging Operational Conditions In Support of Minimum Staffing Level Reviews, issued March 2015 (ADAMS Accession No. ML15083A205) (BNL Tech Report)

Technical Evaluation

NUREG-0711, Section 6.4, includes six criteria for the S&Q element. Section 13.1 of this SER addresses Criterion 6.4(1). Criterion 6.4(2) addresses NRC requirements for minimum staffing of licensed operators that are applicable to facility licensees; these requirements are not applicable to DC applicants. The applicant proposed a staffing level for its design that would not allow a facility licensee to meet some requirements in 10 CFR 50.54(m). Therefore, the applicant provided the methodology used to conduct, and the results of, a performance-based test, referred to as the SPV, as technical justification to support a new design-specific staffing requirement that a facility licensee can use in lieu of 10 CFR 50.54(m). The applicant provided this design-specific staffing requirement in DCA Part 7, Section 6, to be added to the DC rule such that a licensee for a NuScale plant can use the design-specific staffing rule in lieu of the requirements in 10 CFR 50.54(m). The staff evaluates the applicant's technical basis supporting the proposed minimum staffing level in Section 18.5.4.2 of this SER.

The remaining review criteria in NUREG-0711 address inputs from the TA to S&Q analyses (Criterion 6.4(3)), staffing for the full range of plant conditions and tasks (Criterion 6.4(4)), iteration (Criterion 6.4(5)), and staffing-related issues (Criterion 6.4(6)). The staff addresses these criteria in Section 18.5.4.3 of this SER.

Before discussing the review criteria, the staff provides relevant background information in Section 18.5.4.1 of this SER.

Rationale for a Design-Specific Staffing Requirement

The requirements in 10 CFR 50.54(k) and 10 CFR 50.54(m) identify the minimum number of licensed operators that must be on site, in the control room, and at the controls. The requirements are conditions in every nuclear power reactor operating license issued under 10 CFR Part 50. The requirements are also conditions in every COL issued under 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants; however, they are only applicable after the Commission makes the finding under 10 CFR 52.103(g) that the acceptance criteria in the COL are met.

In a letter to the NRC dated September 15, 2015 (ADAMS Accession No. ML15258A846), NuScale proposed that six licensed operators will operate up to 12 reactor modules from a single control room. However, the staffing proposal would not allow a facility licensee to meet the requirements in 10 CFR 50.54(m)(1) because the table in 10 CFR 50.54(m)(1) does not address operation of more than two units from a single control room. The proposal also would not allow a facility licensee to meet 10 CFR 50.54(m)(2)(iii) because the regulation requires a licensed operator at the controls for each fueled unit (i.e., 12 licensed operators). Absent alternative staffing requirements, future NuScale licensees would need to request an exemption from these requirements.

In a letter dated January 14, 2016 (ADAMS Accession No. ML15302A516), the staff provided two options for the DCA to address the regulatory requirements of 10 CFR 50.54(m) in order to provide the greatest degree of issue finality and regulatory certainty on the issue of control room staffing. In a letter dated April 8, 2016 (ADAMS Accession No. ML16099A270), the applicant stated that it would pursue the first option given in the January 14, 2016, letter, in which the applicant would propose for certification, as part of the DC rulemaking, an alternative approach to control room staffing that a facility licensee can use in lieu of 10 CFR 50.54(m). Under this approach, the applicant would need to provide as part of the DC the technical basis for rulemaking language that addresses control room staffing in conjunction with control room configuration. A future NuScale licensee would follow the certified NuScale approach and would not need an exemption from 10 CFR 50.54(m) because the DC rule would address the applicability of the regulation.

On June 23, 2016, and August 30, 2016, the staff held public meetings with the applicant to discuss the regulatory process for implementing this approach in the NuScale DC rulemaking. The public meeting summary (ADAMS Accession No. ML16252A258) lists the information the staff stated the applicant should include with the DCA:

  • DCA Part 2 Tier 1: State that the minimum staffing requirements are located in the applicable appendix to 10 CFR Part 52 (DC rule) (i.e., a pointer to the DC rule).
  • DCA Part 2 Tier 2: State in Chapter 18 that the minimum staffing requirements are located in the applicable appendix to 10 CFR Part 52 (DC rule). Provide the technical basis; that is, describe how the HFE program conforms to the guidance in NUREG-0711 and NUREG-1791.
  • DCA Part 7: Include the request for an alternative staffing requirement; a description with justification, including the requirement provisions, staffing table, and appropriate table notes; and a discussion of why an exemption from the table, Minimum Requirements per Shift for On-Site Staffing of Nuclear Power Units by Operator and Senior Operators Licensed under 10 CFR Part 55, in 10 CFR 50.54(m)(2)(i) is not necessary for the applicant.

The staff also stated that paragraph V, Applicable Regulations, of the DC rule in the applicable 10 CFR Part 52 appendix will include the alternative staffing requirement rule language, including the requirement provisions, staffing table, and appropriate table notes.

The staff reviewed the DCA to assess whether it contained the information listed in the meeting summary and determined that the DCA included all of the information except for a statement in DCA Part 2 Tier 1 that the minimum staffing requirements are located in the DC rule and a discussion of why an exemption from 10 CFR 50.54(m)(2)(i) is not necessary for the applicant.

Additionally, the staff compared the staffing table and table notes in DCA Part 7, Section 6, to TS 5.2.2 in DCA Part 4, Technical Specifications, and found inconsistencies. Therefore, the staff issued RAI 8747, Question 18-10 (ADAMS Accession No. ML17307A447). In the responses to RAI 8747, Question 18-10 (ADAMS Accession Nos. ML17354A845 and ML18172A319), the applicant explained why the exemption is not necessary for the applicant and also aligned the staffing table and table notes in DCA Part 7, Section 6, to be consistent with TS 5.2.2. These changes appear as markups in the RAI responses. The applicant also stated that a pointer should not be added to Tier 1 because such information is not typical of the contents of Tier 1, and the requirements in the DC rule and TS 5.2.2 are sufficient to ensure that a licensee is aware that alternative staffing requirements are to be used in lieu of 10 CFR 50.54(m). The staff agrees that the pointer statement does not need to be added to Tier 1 because the requirements of the DC rule and TS 5.2.2 are sufficient to ensure a licensee is aware that alternative staffing requirements may be used in lieu of 10 CFR 50.54(m) and finds the response acceptable. RAI 8747, Question 18-10, is a Confirmatory Item pending verification of the markups in the next revision of the DCA.

Evaluation of the Applicant's Technical Basis (Criterion 6.4(2))

Criterion 6.4(2) states that the staff should assure that the applicant's proposed staffing meets the requirements of 10 CFR 50.54, Conditions of Licenses, and, if not, the NRC's reviewers should use the guidance in NUREG-1791 and NUREG/CR-6838.3 The executive summary of NUREG-1791 states the following:

The purpose of this review is to ensure public health and safety by verifying that the applicant's staffing plan and supporting analyses sufficiently justify the requested exemption. The applicant's submittal should include (1) the description of the request, the concept of operations, and operational conditions considered, (2) supporting analyses and documentation from the operating experience, functional requirement analysis and function allocation, task analysis, job definition, and staffing plan, and (3) data and analysis from validation exercises performed to demonstrate the effectiveness and safety of the proposed staffing plan.

The validation exercise discussed in the quotation above is the SPV.

The abstract of the SPV Results TR states the following:

A staffing plan validation was conducted using guidance in NUREG-0711, NUREG-1791, and NUREG/CR-6838 as well as other industry guidance. The staffing plan validation included performance-based tests using a simulator focused on operator performance, workload, and situation awareness during challenging plant operating conditions which included design basis events, beyond design basis events, multi-module events, and events in series and parallel. The results of the analysis, performed using the methods described above, confirm that up to 12 NuScale power modules and the associated plant facilities may be operated safely and reliably by a minimum staffing contingent of three licensed reactor operators and three licensed senior reactor operators from a single control room during normal, abnormal, and emergency conditions.

The staff used the guidance in NUREG-1791, Appendix A, Review Checklists, which contains 11 review steps, and the guidance in the BNL Tech Report to review the results of the applicant's SPV and evaluate whether the results support the applicant's proposed design-specific minimum staffing level. Because the applicant conducted the SPV using the simulator, which according to NUREG/CR-6838 is the most realistic means of validating the acceptability of minimum staffing levels, the staff focused the review on the evaluation of these results to determine the acceptability of the minimum staffing level, as summarized below.

3 NUREG/CR-6838 contains the technical basis for the staff's guidance in NUREG-1791. The staff used NUREG/CR-6838 as a reference if it needed clarification of the review guidance in NUREG-1791.


Step 1: Review the Request

NUREG-1791, Section 1.1, Discussion, explains that the staff needs to understand the scope of the review and ensure that the applicant has provided the necessary information for the staff to perform the review. As explained above, the applicant does not need to request an exemption because the minimum staffing requirements apply to facility licensees, not DC applicants. However, in order to provide the greatest degree of regulatory certainty and finality for COL applicants, the DCA will address staffing to support the establishment of a design-specific staffing rule for a licensee to use in lieu of 10 CFR 50.54(m).

The SPV Methodology TR, Section 6.1, Operating Staff Assignments, states the following:

The following staff and qualifications are assumed to be available as part of the on-shift operating crew. Six licensed operators in the main control room consisting of the following: one shift manager maintaining an active senior reactor operator license, one control room supervisor maintaining an active senior reactor operator license, one shift technical advisor maintaining an active senior reactor operator license and having a degree in a science or applied science field, and three unit supervisors maintaining active reactor operator licenses.

The staff also reviewed DCA Part 7, Section 6, which contains the applicant's proposed rule to be used in lieu of 10 CFR 50.54(m). The proposed requirements are as follows:

(1) A senior operator licensed pursuant to Part 55 of this chapter shall be present at the facility or readily available on call at all times during its operation, and shall be present at the facility during initial start-up and approach to power, recovery from an unplanned or unscheduled shutdown or significant reduction in power, and refueling, or as otherwise prescribed in the facility license.

(2) Licensees shall meet the following requirements:

a. Each licensee shall meet the minimum licensed operator staffing requirements in the following table:

Table 1: Minimum Requirements Per Shift for On-Site Staffing of NuScale Power Plants by Operators and Senior Operators Licensed Under 10 CFR Part 55
(one to twelve units, one control room)

  Number of units operating*    Position           Number
  None                          Senior operator    1
                                Operator           2
  One to twelve                 Senior operator    3
                                Operator           3

  * A nuclear power unit is considered to be operating when it is in MODE 1, 2, or 3 as defined by the unit's technical specifications.

Source: DCA Part 7, Section 6.1.3, Requested Action.


b. Each licensee shall have at its site a person holding a senior operator license for all fueled units at the site who is assigned responsibility for overall plant operation at all times there is fuel in any unit.
c. When a nuclear power unit is in MODE 1, 2, or 3, as defined by the unit's technical specifications, each licensee shall have a person holding a senior operator license for the nuclear power unit in the control room at all times. In addition to this senior operator, a licensed operator or senior operator shall be present at the controls at all times. In addition to the senior operator and licensed operator or senior operator present at the controls, a licensed operator or senior licensed operator shall be in the control room envelope at all times.
d. Each licensee shall have present, during alteration or movement of the core of a nuclear power unit (including fuel loading, fuel transfer, or movement of a module that contains fuel), a person holding a senior operator license or a senior operator license limited to fuel handling to directly supervise the activity and, during this time, the licensee shall not assign other duties to this person.

The staff found that the proposed requirements included in DCA Part 7, Section 6, meet the intent of the requirements in 10 CFR 50.54(m), and only the requirements in 10 CFR 50.54(m)(2)(i) and 10 CFR 50.54(m)(2)(iii) have been modified to account for the operation of up to 12 reactors from a single control room. The only other requirement related to licensed operator staffing is in 10 CFR 50.54(k). The applicant has not proposed any rule to be used in lieu of 10 CFR 50.54(k) because the applicant's proposed rule includes the requirement in 10 CFR 50.54(k); therefore, a licensee can comply with it.

The staff also reviewed TS 5.2.2 in DCA Part 4 and found that the requirements in TS 5.2.2 are consistent with those proposed in DCA Part 7, Section 6. Also, TS 5.1.2 requires either the shift manager or an SRO to be in the control room when any unit is in MODES 1-5, which is also consistent with the proposed requirements.

The audit reports, dated May 26, 2016 (ADAMS Accession No. ML16137A129), and November 30, 2016 (ADAMS Accession No. ML16259A110), document the results of the staff's audit of the SPV methodology and observations of SPV testing, respectively. The knowledge the staff gained from these audits; DCA Part 2 Tier 2, Chapter 7, Instrumentation and Controls, Chapter 15, Transient and Accident Analyses, and Chapter 19, Probabilistic Risk Assessment; the HFE TRs included with DCA Part 2 Tier 2, Chapter 18, to address the 12 HFE elements in NUREG-0711; the SPV Methodology TR; and the SPV Results TR provided sufficient information to enable the staff's review of the applicant's SPV.

Step 2: Review the Concept of Operations

NUREG-1791, Section 2.1, Discussion, states that the staff performs Step 2 to gain a comprehensive understanding of how the proposed staffing fits into the overall design and operation of the plant. NUREG-1791, Appendix A, lists topics the applicant should address in order to provide a complete concept of operations. The staff reviewed the applicant's ConOps and SPV Results TR and found that together they address these topics. As such, the staff concludes that the applicant's description of the concept of operations is complete and addresses the roles of control personnel (i.e., personnel (control room operators) who will have plant monitoring and operational control responsibilities on each shift), and that the applicant has adequately explained how the proposed staffing relates to the plant's design and operation.

Step 3: Review the Operational Conditions

NUREG-1791, Section 3.1, Discussion, states that the staff performs Step 3 to ensure that the applicant analyzed the operational conditions that present the greatest potential challenges to the effective and safe performance of control personnel and that those conditions support the request. The BNL Tech Report, Section 5.1, Identify Challenging Operating Conditions, states, "The applicant should identify the plant specific operating conditions that are challenging and create high workload. The objective of identifying these conditions is the evaluation of the minimum staffing level needed to address immediate and short-term actions." The BNL Tech Report, Section 5.1, lists plant conditions, personnel tasks, and situational factors applicants should consider when choosing the sample of challenging operating conditions to use for the staffing analysis. The list includes multiunit monitoring, management of off-normal conditions and emergencies, fatigue situations (e.g., repetitive tasks), tasks requiring interaction with other plant personnel, tasks with high cognitive workload, and tasks that are performed to complete IHAs.

The SPV Methodology TR, Section 3.0, Identify Challenging Operational Conditions, explains that the applicant used the TA results as one input to the selection of the sample of challenging operational (i.e., operating) conditions and states the following:

The task analysis contains numerous attributes that have been recorded into a VISION database. The VISION database allows task attributes to be searched and correlated to assess which impact workload the most.

The SPV Methodology TR, Section 3.0, is proprietary and describes in detail the applicant's method and criteria for selecting the challenging operating conditions using the task attributes in the VISION database. The staff reviewed the applicant's method and criteria and found that, to develop the sample of operating conditions (SOC) for the staffing analysis, the applicant considered the plant conditions listed in the BNL Tech Report, Section 5.1, as well as conditions in which operators need to perform actions within time constraints. The S&Q RSR, Section 3.3.2, Staffing Plan Validation Scenario Development, summarizes the selected conditions, including changing conditions on multiple modules, common system interface failures and their effect on multiple modules, high levels of automation, and beyond-design-basis events.
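The attribute-based screening described above can be sketched as follows. Because the actual VISION schema and selection criteria are proprietary, the attribute names, thresholds, and selection logic here are purely hypothetical illustrations of the general approach:

```python
# Hypothetical sketch of screening task records for high-workload conditions.
# Attribute names and thresholds are illustrative, not the proprietary criteria.

def screen_challenging(tasks, max_time_margin_min=10.0):
    """Return IDs of tasks likely to drive high workload: multi-module scope,
    high cognitive demand, or a tight margin between time available and
    time required."""
    selected = []
    for t in tasks:
        margin = (t.get("time_available_min", float("inf"))
                  - t.get("time_required_min", 0.0))
        if (t.get("modules_affected", 1) > 1
                or t.get("cognitive_demand") == "high"
                or margin < max_time_margin_min):
            selected.append(t["task_id"])
    return selected

tasks = [
    {"task_id": "T-01", "modules_affected": 3, "cognitive_demand": "low",
     "time_available_min": 60, "time_required_min": 5},
    {"task_id": "T-02", "modules_affected": 1, "cognitive_demand": "high",
     "time_available_min": 60, "time_required_min": 5},
    {"task_id": "T-03", "modules_affected": 1, "cognitive_demand": "low",
     "time_available_min": 12, "time_required_min": 8},
    {"task_id": "T-04", "modules_affected": 1, "cognitive_demand": "low",
     "time_available_min": 60, "time_required_min": 5},
]
print(screen_challenging(tasks))  # ['T-01', 'T-02', 'T-03']; T-04 screened out
```

Tasks surviving a screen like this would then be combined into candidate scenarios for the staffing analysis.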

The SPV Methodology TR, Section 3.0, contains the detailed information about the SOC and the events included in the sample. The staff reviewed Section 3.0 and found that the SOC included the following tasks and conditions:

  • Multiunit monitoring, management of off-normal conditions and emergencies, fatigue situations (e.g., repetitive tasks), tasks requiring interaction with other plant personnel, tasks with high cognitive workload, and tasks that are performed to complete the risk-important HAs identified in the PRAs. As discussed in the BNL Tech Report, Section 5.1, these kinds of conditions are likely to result in high operator workload.

  • Conditions that require operators to perform actions in a relatively short amount of time (relative to other tasks that operators perform at a NuScale plant) before any other staff can be called in for support. These conditions also can create stress, which may increase the chance of human performance errors, and therefore are relatively more challenging to successfully manage.
  • Conditions that require operators to perform actions under situations that would be unfamiliar because they are infrequently performed, which would likely result in higher cognitive demand. These conditions would also likely be stressful because of the potential for safety-significant consequences if tasks are not performed correctly. As a result, there could be perceived higher levels of workload, which could make these scenarios more challenging to operators and also could increase the likelihood of human performance errors during task performance.
  • Highly dynamic and unusual situations that would be relatively complex to manage, which would likely cause high cognitive workload demands and make these scenarios more challenging for the operators. Such conditions could also increase the likelihood of human performance errors during task performance.

Thus, the staff concludes that the applicant's selected SOC conforms to the guidance in NUREG-1791, Section 3.1, and the BNL Tech Report, Section 5.1, and as such includes challenging operational conditions.

Step 4: Review Operating Experience NUREG-1791, Section 4.1, Discussion, states that the staff performs Step 4 to ensure that the applicant reviewed relevant operating experience to identify and address staffing-related lessons learned. The purpose of the applicant's review of operating experience should be to identify previous staffing-related problems in order to avoid repeating them. It is also used to identify similar staffing practices that have proven to be effective and successful implementations of similar technologies and concepts of operation.

The S&Q RSR, Section 3.1.1, Operating Experience Review, describes how the applicant reviewed relevant operating experience to identify and address staffing-related lessons learned.

The applicant collected and reviewed staffing-related operating experience from a variety of sources. The S&Q RSR, Section 3.1.1, states, Initial staffing goals for the NuScale power plant were developed in consideration of the following factors based on SME knowledge and experience. The factors listed include staffing-related issues documented in NRC information notices and regulatory issues summaries, such as Regulatory Issue Summary 2009-10, Communications Between the NRC and Reactor Licensees During Emergencies and Significant Events, dated June 19, 2009. The OER RSR, Section 4.2, Recognized Industry HFE Issues, states, NuScale's staffing plan for licensed operators was tested to ensure required communications to the NRC could be made in high workload situations. The testing referred to in the OER is the SPV.

Also, the OER RSR, Section 3.4, states that the applicant interviewed plant personnel at various sites, including a nuclear power plant that operates multiple units from a single control room.

Appendix E, List of Sample Sites Visited that Impacted Design, lists the sites visited and the topics discussed, and Appendix F lists the applicant's conclusions and lessons learned from interviews. The staff found that the lessons learned include those related to the operation of more than one unit from a single control room, which is a feature of the NuScale design. The OER RSR, Section 4.4, explains how the applicant used observations of the control personnel working at a Canadian facility that operates four units from one control room to inform the concept of operations at a NuScale plant.

Additionally, the OER RSR, Section 2.2, states that the HFE team consists of previously licensed U.S. operators. These previously licensed operators used their own operating experience to conduct the OER, which includes a review of staffing-related lessons learned.

The S&Q RSR, Section 3.1.1, states the following:

The roles and responsibilities of the three senior reactor operators, specifically the SM [shift manager], CRS [control room supervisor], and STA [shift technical advisor], in existing commercial nuclear plants is considered very effective in establishing and maintaining command and control and technical oversight during normal and off-normal conditions. Therefore, initial staffing goals for the MCR crew levels and qualifications are based, in part, on staffing levels and qualifications from commercial nuclear power plants, while taking into account the passive features and a high degree of automation of the NuScale plant.

Thus, the staff concludes that the applicant reviewed relevant operating experience to identify and address staffing-related problems in order to avoid repeating them and also to identify similar staffing practices that have proven to be effective.

Step 5: Review FRA and FA NUREG-1791, Section 5.1, Discussion, states that the staff performs Step 5 to ensure that the applicant has defined and evaluated the impact of the staffing plan on the plant/system functions that must be performed to satisfy plant safety objectives (i.e., the safety functions).

The second purpose is to ensure that the allocation of functions to humans and systems has resulted in a role for control personnel that uses human strengths, avoids human limitations, and can be performed under the operational conditions evaluated (i.e., the selected SOC).

The applicant provided the FRA/FA RSR with the DCA. (The staff reviews the applicant's FRA/FA results in Section 18.3.4 of this SER.) The FRA/FA RSR, Table 3-1, lists all the plant functions, including the safety functions, that the applicant identified for the NuScale plant. The applicant's SOC described in the SPV Methodology TR, Section 3.0, includes events that the staff determined could potentially impact all the plant safety functions (e.g., beyond-design-basis events (BDBEs)). The staff found that the applicant also specifically addressed monitoring all the safety functions in the selected SOC. Thus, the staff concludes that the applicant evaluated the impact of the staffing plan on the safety functions during the SPV.

The HSI Design RSR, Section 3.1.1, Personnel Task Requirements, explains that the applicant established automation criteria to allocate functions to personnel, automated systems/machine, or a combination of the two. The results of the FA are an input to TA to identify the alarms, displays, and controls (i.e., the HSI) personnel need to perform tasks associated with functions allocated to them. The HSI Design RSR, Figure 3-1, NuScale Main Control Room Simulator Development Venn Diagram, shows that the results of the TA were inputs to the development of the HSI design, and the HSI Design RSR, Section 3.2, states that the HSI design was an input to the development of the control room simulator. The HSI Design RSR, Section 3.2, also states that the NuScale simulator is an evolutionary expression of the MCR interface that is built incrementally and represents the design detail as it emerges. The HSI Design RSR, Section 3.7, Human-System Interface Tests and Evaluation Overview, explains that the applicant verified that the simulator included all of the HSIs personnel would need to perform tasks, and thus the functions allocated to personnel, within the SOC before conducting the SPV.

Because the SPV was conducted using a simulator control room design that the applicant confirmed included the HSIs that personnel needed to perform tasks allocated to them, and because the FA results were inputs to the TA, the staff concludes that the SPV tested whether the allocation of functions to humans and systems resulted in a role for control personnel that uses human strengths, avoids human limitations, and can be performed under the selected SOC.

Step 6: Review the Task Analysis NUREG-1791, Section 6.1, Discussion, states that the purpose of this step is to confirm that the applicant's TA adequately addresses the set of tasks that personnel will be required to perform for the SOC. The BNL Tech Report, Section 5.2, Identify Primary Tasks; Section 5.3, Identify Dependent Tasks; and Section 5.4, Identify Potential Independent Tasks, describe a method that may be used to comprehensively identify the tasks that personnel will be required to perform for the SOC used in the SPV. The BNL Tech Report, Section 5.5, Construct Scenarios and Assign Operator Responsibilities, states that applicants should construct scenarios based on combining the primary, dependent, and independent tasks. These scenarios will be used to conduct the workload analysis (i.e., the SPV). Additionally, NUREG-1791, Section 6.1, states, For each task, the information, control, and task support requirements should be addressed by the applicant's task analysis, as applicable.

The SPV Methodology TR, Section 6.5, Creation of Scenario Guide, explains how the applicant identified the primary, dependent, and independent tasks personnel needed to perform for the SOC and constructed scenarios by combining these tasks. The staff reviewed Section 6.5 and found that the applicant's method of identifying the primary, dependent, and independent tasks that personnel perform for the SOC used in the SPV conformed to the guidance in the BNL Tech Report, Sections 5.2, 5.3, and 5.4.

The SPV Results TR, Appendix D, Scenario 1 Description and Basis; Appendix E, Scenario 2 Description and Basis; and Appendix F, Scenario 3 Description and Basis, describe the scenarios the applicant developed by combining the primary, dependent, and independent tasks that personnel perform during the SOC. The staff reviewed these appendices and found that the scenario descriptions identify the tasks for each event and also identify the task type (i.e., primary, dependent, or independent) for each event included in the scenarios. The staff also found that the scenarios included all of the conditions in the applicant's selected SOC.

Therefore, the staff concludes that the applicant identified the tasks operators will be required to perform for the selected SOC.

Because the NuScale standard plant is a new design, the applicant performed TA as one of the HFE design program activities and provided the TA RSR with the DCA. (The staff evaluates the applicant's methods for TA and the applicant's TA results in Section 18.4.4 of this SER.) The TA RSR, Section 3.5, identifies the task attributes the applicant documented for each task, as applicable, during TA. The TA RSR, Table 3-1, summarizes this information and shows how it conforms to NUREG-0711, Table 5-1. The staff compared the TA RSR, Table 3-1, to NUREG-0711, Table 5-1, and found that the attributes include all of those listed in NUREG-0711, Table 5-1. The task attributes include the alarms, information, controls, and task support needed to accomplish a given task as well as the task performance requirements, including time required (only for IHAs), task time (non-IHAs), and accuracy. The applicant used the results of TA as inputs to the SPV in the following ways:

  • The TA RSR, Section 3.5.5, Inventory of Alarms, Controls, and Displays, explains that the TA results are used to develop the HSI inventory for the NuScale plant. The S&Q RSR, Section 4.5, Simulator HSI Testing for Staffing Plan Validation, describes the method the applicant used to verify that the alarms, information, controls, and task support needed for personnel to perform the tasks included in the SPV scenarios were available in the MCR simulator before the SPV. The staff concludes that the applicant identified the information, control, and task support requirements for tasks operators performed in the scenarios for the SPV and verified that they were available to personnel before the SPV.
  • The S&Q RSR, Section 4.1, Staffing and Qualification Results as Compared to NUREG-0711 Review Criteria, explains how some task characteristics identified by TA, such as the time required to perform a task, were used to develop the acceptance criteria for the SPV.

The staff concludes that the applicant adequately identified the set of tasks that personnel will be required to perform for the SOC and also verified that the information, control, and task support requirements that needed to be available for the personnel to perform these tasks were included in the simulator used for the SPV testing.

Step 7: Review the Job Definitions NUREG-1791, Section 7.1, Discussion, states, The purpose of the job definition review is to confirm that the applicant has established clear and rational job definitions for the personnel who will be responsible for controlling the plant. Section 7.1 also states the following:

A job is defined as the group of tasks and functions that are assigned to a personnel position. A job definition specifies the responsibilities, authorities, knowledge, skills, and abilities that are required to perform the tasks and functions assigned to a job. A job that consists of interrelated responsibilities and authorities that do not conflict would be coherent.

The staff reviewed the ConOps, Section 2.2.1, Operating Crew Composition, and found that it identifies the roles, responsibilities, and qualifications of the operators who form the minimum operating crew. The three SROs fulfill the roles of shift manager, control room supervisor, and shift technical advisor. Based on the descriptions in the ConOps, Section 2.2.1, the staff observed that their roles, responsibilities, and authorities are the same as those of licensed SROs in operating reactors.

The crew complement also includes three licensed ROs, distinguished as RO 1, RO 2, and RO 3. The staff observed that the applicant identified specific roles and responsibilities for RO 1 and that these roles and responsibilities differ from those for RO 2 and RO 3 (RO 2 and RO 3 perform the same roles, but have different responsibilities). The staff observed that the roles, responsibilities, and authorities of the ROs are similar to those of the licensed ROs at operating reactors, and that the applicant has defined some roles and responsibilities differently as a result of the unique nature of the applicant's control room and plant design.

Additionally, when the staff observed the SPV testing, the staff observed that the roles and responsibilities were clearly defined and did not consist of responsibilities and authorities that conflicted with one another. For example, the SRO assigned as the control room supervisor is given supervisory tasks only and does not perform actual control tasks, similar to the role of the control room supervisor at an operating reactor.

The TA RSR, Section 2.1, Task Analysis Process Overview, explains how the results of the FRA/FA are inputs to the TA and states that HAs and actions allocated to automation defined during FRA/FA are decomposed to identify control and monitoring tasks for the operators and the machine (i.e., automation). The S&Q RSR, Section 2.1.2, Task Analysis Inputs, states that personnel tasks are assigned to staffing positions considering task characteristics, such as the knowledge and abilities required, relationships among tasks, time available, and time required to perform the task; the operator's ability to maintain situation awareness within the area of assigned responsibility; teamwork and team processes such as peer checking; and workload associated with each job within the crew. Thus, the staff concludes that the applicant assigned tasks to job positions using the results of the FRA/FA and TA.

NUREG-1791, Section 7.1, also states the following:

An important aspect of the job definition review is to ensure that the qualifications required for each position are delineated. The qualifications required for a plant staff position consist of the knowledge, skills, and abilities/aptitudes (KSAs) an individual must possess to meet the performance criteria established for the tasks assigned to the position. The information derived from the function and task analyses should provide a basis for identifying the required KSAs for each position.

The applicant specified the qualifications required for the staffing positions as either senior operator or operator. Additionally, the TA RSR, Section 3.5.6, explains that as part of TA, each task is analyzed to determine the knowledge and abilities needed to successfully complete the task, and the VISION database contains the results of TA. The TA RSR, Section 4.4, includes examples of the knowledge, skills, and abilities/aptitudes for a given task documented in the VISION database. The S&Q RSR, Section 2.1.2, states the following:

TA results are used to determine the crew roles and responsibilities and are used as input to the initial licensed operator staffing level. Personnel tasks, addressed in TA, are assigned to staffing positions considering: task characteristics, such as the knowledge and abilities required.

Therefore, the staff concludes that the applicant has delineated the qualifications required for each position, including the knowledge, skills, and abilities that are required to perform the tasks and functions assigned to these positions.

Step 8: Review the Staffing Plan

NUREG-1791, Section 8.1, Discussion, states the following:

The purpose of the staffing plan review is to ensure that the applicant has systematically analyzed the requirements for the numbers of qualified personnel that are necessary to operate the plant safely under the operational conditions analyzed. The applicants staffing plan should be supported by the results of the functional requirements analysis and function allocation, task analyses, and the job definitions for each position required under the operational conditions considered. In addition, the applicants submittal should define the proposed shift composition and shift scheduling.

The S&Q RSR, Section 3.1, Establishing the Basis for Staffing and Qualification Levels, explains that the applicant started with an initial staffing goal that was an input to the other HFE program elements. (To complete some analyses, such as TA, it is necessary to start with assumptions about the number and qualifications of personnel who will be available in the control room.) The S&Q RSR, Section 3.3, Evaluation of Staffing Levels and Operator Qualifications, states, The bases for licensed operator personnel staffing are established as described in Section 3.1 using input from other HFE program elements to support the initial staffing goals for the MCR crew (numbers and qualifications baseline) described in Section 3.2.

The S&Q RSR, Section 3.1, also states, The initial staffing goals were subject to revision based on the results of HFE analyses, including operating experience review (OER), FRA/FA, TA, HSI Design, and S&Q. These analyses provide the basis for any changes to the initial staffing levels. The S&Q RSR, Section 4.1, also states, NuScale's staffing analysis methodology is iterative as described in Section 2.1.4. Although staffing levels have not changed from initial goals, they have been continuously evaluated throughout the HFE analysis and design process.

The applicant did not define proposed shift scheduling. The staff concludes that this is not necessary at this time because the COL holder that operates a NuScale plant schedules when personnel will be on shift.

Therefore, the staff concludes that the applicant evaluated whether the initial staffing goal needed to be revised based on the results of the OER, FRA/FA, and TA the applicant performed. Because the results did not require changes to be made, the staff concludes that the applicant determined that the results of the FRA/FA and TA support the staffing level. Thus, the staff concludes that the applicant systematically analyzed the requirements for the numbers of qualified personnel that are necessary to operate the plant safely under the operational conditions analyzed. The staff discusses the results of the applicants validation of the staffing level, which confirmed the minimum staffing level was adequate, under Step 10.

Step 9: Review Additional Data and Analysis The applicant did not provide any additional data as described in NUREG-1791, and the staff determined that no additional data or analyses were necessary to complete the review.

Step 10: Review the Staffing Plan Validation NUREG-1791, Section 10.1, Discussion, states the following:

The purpose of reviewing the validation of the staffing plan is to ensure that the applicant fully considered the dynamic interactions between the plant design, its systems, and control personnel for the operational conditions identified for the exemption request. The applicant should provide data or demonstrations that the control personnel specified in the staffing plan can satisfy the plant and human performance requirements identified in the functional requirements analysis, function allocation, and task analyses. The data or demonstrations should include the full range of operational conditions identified for the request, as well as a reasonable representation of the human performance variability expected in the context of the operational conditions.

NUREG-1791 identifies four components of the SPV review: operational conditions sampling, human performance measures and criteria, data sources or demonstration methods, and outcomes. The staff discusses these four components below.

Operational Conditions Sampling NUREG-1791, Section 10.1.1, Operational Conditions Sampling, states the following:

The applicants submittal should identify the operational conditions included in each scenario. The submittal should identify the key plant and system parameters relevant to the scenario and the state of these parameters at the start of the scenario, during critical transition points in the scenario, at times when action by control personnel is expected, the results of control actions, and the status of the parameters at the end of the scenario.

The submittal should sample a sufficient number of operational conditions such that the personnel and plant performance are challenged.

The submittal should also identify the criteria for determining successful performance of the plant, system, and control personnel within the scenarios.

The SPV Results TR, Appendices D, E, and F, contain the scenario descriptions and the scenario bases that summarize the information in the scenario guides. The staff reviewed Appendices D, E, and F and found that, together, they fully incorporate the SOC and identify the key plant and system parameters relevant to the scenario and the state of these parameters at the start of the scenario, during critical transition points in the scenario, and at times when action by control personnel is expected; the results of control actions; and the status of the parameters at the end of the scenario. Additionally, as discussed in the review of Step 3, the staff found the applicant's SOC to be sufficiently challenging and likely to produce high workload, and therefore that a sufficient number of operational conditions is sampled in the applicant's scenarios.

The SPV Methodology TR, Section 9.0, Conclusions and Acceptance Criteria, identifies the criteria the applicant established to evaluate whether a scenario was successfully performed. It also states that the failure of any scenario to meet the criteria requires the applicant to perform corrective actions and then conduct subsequent retesting.

The staff reviewed the applicant's criteria for determining successful scenario performance and found that the criteria focused on assessing acceptable task performance and workload levels, which the applicant defined and which are discussed below with the human performance measures. The staff found that the applicant's criteria for determining whether task performance was successful were based on ensuring the safe operation of the plant by not violating certain time limits assumed in the PRA, which would help to ensure successful plant performance. Because the goal of the test is to show that workload is acceptable to help ensure that the minimum staffing levels can achieve successful task performance, the staff determined that the criteria were relevant to the purpose of the SPV and therefore were adequate for determining successful performance.
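A criterion of this kind, where task success is judged against time limits assumed in the PRA, can be sketched as a simple pass/fail check. The task names and time limits below are hypothetical, not the applicant's proprietary values:

```python
# Illustrative success check: each observed task completion time must not
# exceed the time limit assumed for that task in the PRA.
# Task names and limits are hypothetical.

def validate_times(observed_min, pra_limits_min):
    """Return the tasks whose observed completion time (minutes) exceeded
    the corresponding PRA-assumed time limit."""
    return [task for task, t in observed_min.items()
            if t > pra_limits_min[task]]

pra_limits = {"isolate_system_a": 20.0, "initiate_cooling": 30.0}
observed = {"isolate_system_a": 14.5, "initiate_cooling": 31.0}
failures = validate_times(observed, pra_limits)
print(failures)  # ['initiate_cooling']
```

Under the methodology described above, a non-empty failure list for any scenario would trigger corrective action and retesting.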

Human Performance Measures and Criteria NUREG-1791, Section 10.1.2, Human Performance Measures and Criteria, states the following:

The applicant needs to identify the measures of human performance used to evaluate individual and crew performance of the control personnel in the scenarios. In addition to defining the measures of human performance used in validating the staffing plan, the applicant should identify the criteria established to determine the acceptability of the results obtained.

The BNL Tech Report, Section 3.1, General Considerations for the Review of Minimum Staffing Exemption Requests, provides additional guidance for the selection of human performance measures:

Successful task performance is the main criterion for evaluating a proposed staffing level. That is, if the crew at the minimal staffing level cannot perform their tasks, the staffing level is not acceptable. However, while task performance is an important acceptance criterion, it's not the only one. High workload, inattention, and poor SA [situation awareness] are examples of the factors that can lead to poor task performance and hence should be considered in staffing evaluations.

Therefore, the staff evaluated whether the applicant at a minimum identified task performance, workload, and situation awareness as human performance measures and also evaluated the applicant's criteria for determining whether the results measured were acceptable.

The staff found that the applicant identified the completion of tasks as one of the SPV human performance measures in the SPV Methodology TR, Section 4.0, Identify Primary and Dependent Tasks, and Appendix A, Scenario Testing Plan, Section A.1, Introduction. The staff found that the SPV Results TR, Section 6.1.1, Time Analysis, explained that the applicant also established additional criteria for determining whether task performance was successful.

The staff reviewed the criteria and found that they were based on assumptions about task performance in the PRA, industry standards, SME judgment, and other regulatory requirements, such as NRC reporting requirements. Therefore, the staff concludes that the applicant's criteria for evaluating whether task performance was successful are reasonable and relevant to the tasks included in the SPV.

The SPV Methodology TR, Appendix C, Situational Awareness Questionnaire, explains the method the applicant used to measure situation awareness and identifies the minimum numerical value the applicant established to determine whether measured situation awareness was acceptable. The staff found the applicant's method of measuring situation awareness to be an explicit situation awareness probe method as described in NUREG/CR-7190, Workload, Situation Awareness, and Teamwork, issued March 2015. The staff considered the applicant's numerical value for determining whether measured situation awareness would be acceptable.

Because situation awareness is context specific, the actual value of situation awareness measured does not provide as much insight as an evaluation of the trends in the operators' responses to the questions. For example, if all operators incorrectly answer a certain question, then the applicant should investigate to determine why it was widely missed. Such a result could indicate a problem with the HSI or a problem with the wording of the question.

In the SPV Methodology TR, Appendix C, the applicant provided examples of the types of situation awareness questions administered to the operators. The staff found that these questions were context specific. In addition, the SPV Methodology TR, Appendix A, explains how trends in the operators' responses would be identified and evaluated and the actions that would be taken if measured situation awareness was below the minimum numerical value. The staff finds the applicant's minimum numerical value a reasonable threshold for triggering a more in-depth evaluation to understand whether there is a problem with the HSI design or the staffing level if measured situation awareness is not acceptable.
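The trend check described above, flagging probe questions that most operators miss, can be sketched as follows. The question identifiers, responses, and accuracy threshold are hypothetical, not the applicant's actual probes or numerical value:

```python
# Illustrative trend analysis of situation awareness probe responses.
# Question IDs, answers, and the 0.5 accuracy threshold are hypothetical.

def question_accuracy(responses):
    """responses: {operator: {question: True if answered correctly}}.
    Return the fraction of operators answering each question correctly."""
    questions = next(iter(responses.values())).keys()
    return {q: sum(r[q] for r in responses.values()) / len(responses)
            for q in questions}

def flag_widely_missed(responses, threshold=0.5):
    """Questions most operators answered incorrectly warrant investigation
    (a possible HSI issue or a poorly worded probe)."""
    return [q for q, acc in question_accuracy(responses).items()
            if acc < threshold]

responses = {
    "RO1": {"Q1": True,  "Q2": False, "Q3": True},
    "RO2": {"Q1": True,  "Q2": False, "Q3": True},
    "RO3": {"Q1": False, "Q2": False, "Q3": True},
}
print(flag_widely_missed(responses))  # ['Q2'] -- every operator missed Q2
```

A flagged question is only a trigger for investigation, consistent with the trend-based evaluation described above; it does not by itself indicate an unacceptable staffing level.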

The SPV Results TR, Section 6.1.2, Task Load Index, explains that the applicant used the National Aeronautics and Space Administration Task Load Index (NASA-TLX) to measure workload. The staff finds the applicants use of the NASA-TLX to measure workload acceptable because, as discussed in NUREG/CR-7190, the method is a commonly used means of measuring workload, and it has been used previously in the nuclear power plant domain.

The executive summary of the SPV Results TR states the following:

the task load index (TLX) data collection methodology and the data analysis approach used were intentionally designed to identify potential high workload by facilitating the examination of deviations in data regardless of the absolute value of the data. This was done so that even small deviations at low workload levels would be identified. In these instances, other tools such as direct questioning, observations, and self-critiques were used to validate or gather further evidentiary information on actual or perceived level of workload and stress and their impact on performance.

The SPV Results TR, Section 6.1.2, further explains the method the applicant used to determine whether measured workload was high. The staff reviewed the alternative method and concluded that it could account for individual biases that might be reflected in the subjective measures of workload, and the method could enable the applicant to identify relative levels of high workload for the individual rater or raters. The staff agrees that the applicant's method helps to account for the individual subjectivity reflected in the workload ratings. The staff also notes that the BNL Tech Report, Section 4.3, Identifying Approaches to Workload Analysis, contains guidance for acceptable values of workload and is relevant to the staff's review of the applicant's SPV results:
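A deviation-based screen of this general kind, which flags a rater's unusually high scenario scores relative to that rater's own baseline rather than against an absolute cutoff, might be sketched as follows. The 1.5-sigma cut and the scores are illustrative assumptions, not the applicant's actual analysis:

```python
# Sketch of deviation-based screening: flag a rater's scenario scores that sit
# well above that rater's own mean, regardless of absolute magnitude.
# The 1.5-sigma cut and the TLX scores below are illustrative.
from statistics import mean, pstdev

def flag_deviations(scores_by_rater, sigmas=1.5):
    """Return (rater, scenario) pairs whose score exceeds the rater's own
    mean by more than `sigmas` population standard deviations."""
    flagged = []
    for rater, scores in scores_by_rater.items():
        mu = mean(scores.values())
        sd = pstdev(scores.values())
        for scenario, s in scores.items():
            if sd > 0 and (s - mu) / sd > sigmas:
                flagged.append((rater, scenario))
    return flagged

scores = {
    "RO1": {"S1": 22.0, "S2": 25.0, "S3": 24.0, "S4": 50.0},  # low rater, spike
    "RO2": {"S1": 60.0, "S2": 62.0, "S3": 61.0, "S4": 63.0},  # high, no spike
}
print(flag_deviations(scores))  # [('RO1', 'S4')]
```

Note how RO2's uniformly higher absolute scores are not flagged, while RO1's relative spike is, which is the point of examining deviations regardless of absolute value.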

The Department of Defense (DoD, 1999) gives the following criteria for workload analysis based on time utilization: In general, workloads over 100 percent are not acceptable, between 75 percent and 100 percent are undesirable, and under 75 percent are acceptable provided that the operator is given sufficient work to remain reasonably busy.
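As a rough illustration only (not part of the applicant's methodology or the staff's review), the DoD time-utilization criteria quoted above can be sketched in Python. Time utilization is the ratio of time required for assigned tasks to time available, expressed as a percentage; the task times below are hypothetical.

```python
def classify_time_utilization(pct: float) -> str:
    """Classify workload per the DoD (1999) time-utilization criteria:
    over 100 percent is not acceptable, 75 to 100 percent is undesirable,
    and under 75 percent is acceptable provided the operator remains
    reasonably busy."""
    if pct > 100:
        return "not acceptable"
    if pct >= 75:
        return "undesirable"
    return "acceptable"

# Time utilization = (time required for tasks / time available) * 100.
# Hypothetical minutes, for illustration only.
for required, available in [(20, 60), (50, 60), (70, 60)]:
    pct = 100 * required / available
    print(f"{pct:.0f} percent -> {classify_time_utilization(pct)}")
```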

For these reasons, the staff concludes that the applicant identified relevant human performance measures (i.e., task performance, situation awareness, and workload) and established appropriate criteria to evaluate whether the results of these performance measures were acceptable. The staff documents its review of the SPV results below under Staffing Plan Validation Outcomes.

Data Sources and Demonstration Methods

NUREG-1791, Section 10.3.3, Data Sources and Demonstration Methods, states that the staff should confirm that the applicant met the following criteria:

  • The selected design of the SPV, the data sources, and the demonstration methods comprehensively address the dynamic aspects of the staffing plan.
  • The data sources and demonstration methods were used appropriately.
  • The appropriate quantitative, objective measures and criteria were defined and captured.

  • The data collection and analysis were conducted appropriately.
  • The scope and data quality were adequate.
  • The outcomes were reasonable and valid.

NUREG/CR-6838, Section 5.2, Validating Staffing Plans, contains further information.

DCA Part 2 Tier 2, Section 18.5.3, states the following:

The staffing plan validation included performance-based tests using a simulator focused on operator performance, workload, and situation awareness during challenging plant operating conditions, which included design basis events, beyond design basis events, multi-module events, and events in series and parallel. Two independent crews were trained and qualified to conduct three challenging and workload-intensive scenarios utilizing conduct of operations guidance that was reflective of the current industry standards with respect to communications and use of human performance tools. A team of trained and qualified observers consisting of operations, management, and HFE personnel observed and analyzed the crew performances utilizing multiple methods of monitoring crew performance, workload, and situation awareness.

Because the applicant used performance-based tests in the NuScale plant simulator to perform the SPV, the staff relied on the guidance in NUREG/CR-6838, Section 5.2.4, Simulator Studies, to evaluate whether the criteria in NUREG-1791, Section 10.3.3, were met.

NUREG/CR-6838, Section 5.2.4, lists the steps applied below for conducting valid simulator testing and states, "Using methods for conducting in-simulator studies for staffing, which are similar to those described in NUREG-0711 Section 11.4.3, Integrated System Verification, leads to effective and robust data collection." Thus, the staff also considered the guidance for ISV testing in NUREG-0711, Section 11.4.3, when performing the following evaluation in accordance with the steps from NUREG/CR-6838, Section 5.2.4:

  • Define the test objectives. The executive summary of the SPV Results TR explains that the objective of the SPV was to demonstrate that the proposed NuScale licensed operator staffing size is sufficient to protect public health and safety while operating a twelve-unit NuScale nuclear power plant from a single control room. The results of the test are necessary to provide a basis for the design-specific minimum staffing requirement that will be included in the DC rule. Thus, the staff concludes that the applicant defined the test objective.
  • Validate the testbed. The S&Q RSR, Section 3.3.4, Simulator Scenario Based Testing, states the following:

Scenario-based testing is performed in accordance with the NuScale Simulator Scenario-Based Testing Procedure described in detail in Reference 6.2.10. The testing is conducted by determining a set of key parameters and ensuring those parameters behave as expected for the developed staffing plan validation scenarios. ANSI/ANS-3.5-2009 Nuclear Power Plant Simulators for Use in Operator Training and Examination (Reference 6.1.21) is referenced to select steady state and transient parameters.

The SPV Results TR, Section 4.2, Simulator Testing and Validation, discusses in more detail the scenario-based testing method the applicant used to validate the fidelity of the simulator to the control room HFE design and the plant systems modeled. The staff finds that this method is similar to the method of scenario-based testing for licensed operator exams discussed in Nuclear Energy Institute 09-09, Nuclear Power Plant-Referenced Simulator Scenario Based Testing Methodology, Revision 1, dated December 8, 2009, which the NRC endorsed in Regulatory Guide (RG) 1.149, Nuclear Power Plant Simulation Facilities for Use in Operator Training and License Examinations, Revision 4, issued April 2011. This is a comprehensive method that demonstrates simulator fidelity to the reference plant for the selected scenarios.

Therefore, the staff concludes that the applicant validated the testbed (i.e., the simulator) before conducting the SPV testing.

  • Select plant personnel. The executive summary of the SPV Results TR states the following:

Two independent participant crews were selected based on having prior nuclear control room operating experience and some experience with the NuScale design. All participants had previous licensed operator experience at nuclear facilities, which allowed the training to be condensed and drew on the operators' experience with nuclear power plant fundamentals and control room etiquette.

The SPV Results TR, Section 3.1, Crew Selection, discusses the test participants (i.e., the operators) and lists the biographical information for the participants. The staff reviewed the biographical information and, given that information and the fact that the participants were all previously licensed operators in the United States, determined that the participants are representative of licensed operators in the U.S. nuclear industry, which are the possible pool of operators that may be licensed at a NuScale plant. Thus, the staff concludes that the applicant selected a pool of representative test personnel.

  • Define scenarios. As discussed under Steps 3 and 6, the staff determined that the applicant defined an adequate set of scenarios for the challenging SOC identified for SPV testing.
  • Define performance measures. As noted above under the discussion of human performance measures and criteria, the staff determined that the applicant defined a sufficient set of human performance measures and acceptance criteria.
  • Design test.

- Couple control room personnel and scenarios. The executive summary of the SPV Results TR states, "each of the two crews performed three challenging high workload scenarios for a total of six tests." NUREG-0711, Section 11.4.3.6.1, Scenario Sequencing, calls for providing each crew with a similar, representative range of scenarios and balancing the order of presentation such that the scenarios are not always given in the same sequence (e.g., the easiest scenarios are not always presented first). Because each of the crews participated in all three scenarios, and because all of the scenarios were designed to be challenging, the staff concludes that the applicant adequately coupled scenarios and personnel.

- Create test procedures. NUREG-0711, Section 11.4.3.6.2, Test Procedures, contains elements that test procedures should include to ensure that testing is conducted in a controlled manner, which is necessary to ensure the validity of the test results. The SPV Methodology TR, Appendix A, describes the test procedures the applicant used to administer the SPV tests. The SPV Methodology TR, Appendix B, Simulator Crew Evaluation Pre-Job Brief, contains a prejob brief sheet used during testing. The staff reviewed Appendix A and Appendix B and found that, together, they include all of the elements in NUREG-0711, Section 11.4.3.6.2.

Additionally, the audit report dated November 30, 2016 (ADAMS Accession No. ML16259A110), states the following:

The NuScale staff followed its testing plan and scenario guidance for interacting with participants during the scenarios. The NRC staff observed the NuScale staff maintaining physical control of the scenario documentation to prevent the operators from viewing it and learning the scenario events before the scenarios commenced. NuScale used video and sound recording to support analysis of the scenarios. During the pre-job briefs for the NuScale staff conducted prior to the start of each scenario, the NuScale staff confirmed that the testing prerequisites were satisfied. They also reviewed the scenario events and discussed when the workload and situational awareness assessments would be performed during the scenarios. In summary, the NRC staff observed the NuScale staff exercised appropriate test controls during performance of the staffing plan validation.

Therefore, the staff concludes that the applicant created adequate test procedures and followed them.

- Train test personnel. The SPV Results TR, Section 3.2, Observer Selection, discusses the number of HFE and operations SMEs who conducted observations during the SPV testing. The SPV Results TR, Section 3.3, Support Staff, discusses the simulator support staff and their roles for setting up the test environment and administering the data collection tools during the scenarios.

Section 3.3 indicates that the support staff had limited interaction with the test participants, as necessary to conduct the testing. Together, the observers and the simulator support staff are the test personnel.

The SPV Results TR, Section 3.2.1, Observer Training, and Appendix I, Training, Section I.3, Observer Lesson Plan, together describe the training the observers received before the SPV testing. NUREG-0711, Section 11.4.3.6.3, Training Test Personnel, lists topics on which test personnel should be trained during validation testing. The staff reviewed the SPV Results TR, Section 3.2.1 and Appendix I.3, and found that the training addressed the topics listed in NUREG-0711, Section 11.4.3.6.3. The SPV Results TR, Section 3.3, also states that simulator support personnel were specifically briefed on their roles and that the test procedures guided their interactions with test participants. The staff concludes that the applicant adequately trained the test personnel in order to prevent them from introducing bias or errors into the data through failures to follow test procedures or to interact with participants properly.

- Train test participants. The executive summary of the SPV Results TR states the following:

Two independent participant crews were selected based on having prior nuclear control room operating experience and some experience with the NuScale design. Information on the testing scenarios was not shared with the crew participants prior to performance of the testing. The crew participants were trained in basic fundamental operation of the safety systems, important to safety systems, and applicable support systems. The participants also received basic human-system interface (HSI) navigation, conduct of operations, and administrative task training. All participants had previous licensed operator experience at nuclear facilities, which allowed the training to be condensed and drew on the operators' experience with nuclear power plant fundamentals and control room etiquette.

The SPV Results TR, Section 3.1.1, Crew Training, describes the training program that test participants received. The staff finds this to be a minimal amount of training when compared to the training that licensed operators complete at operating reactors. For ISV testing, NUREG-0711 states that participants should be trained to near-asymptotic performance as operators in the actual plant. This is in part to help ensure the test participants are representative of the actual users of the HSI in the actual plant. However, given that the scope of this test was limited to operating the plant during a subset of all operating conditions, whereas ISV samples a much broader range of all possible operating conditions, participants for SPV testing do not necessarily need to be trained to the same level as participants for the ISV. The training for this test should at a minimum be sufficient so the participants can understand the information provided by the HSI to diagnose plant conditions and also understand how to use the HSI. The SPV Results TR, Section 3.1.1, states that the participants were required to demonstrate a minimum level of proficiency before the SPV. Thus, the staff concludes that using participants who can rely on their previous experience as licensed operators, have received training on how to use the HSI and how the NuScale plant operates, and have demonstrated an acceptable minimum proficiency level before testing is acceptable.

- Perform pilot test. The SPV Results TR, Section 2.0, Test Design and Assumptions, states the following:

Pilot testing of the scenarios was performed by the members of the simulator design team and observers. Scenario based testing criteria was established and implemented to ensure the testing scenarios would achieve their desired evaluation goals.

Refinements to the scenarios were made to ensure the scenario guides were accurate, the simulator performed as expected, and testing methods were well defined.

Thus, the staff concludes that the applicant conducted a pilot test for the SPV scenarios and made adjustments to the scenarios as needed.

  • Collect, analyze, and interpret data. The SPV Results TR, Section 6.0, Data Collection, explains how the applicant collected data. Test personnel observed the test participants during the SPV scenarios and collected data by direct observation and also by administering questionnaires to the observers and test personnel. The SPV Results TR, Appendix A, Section A.4, Observation Comments (HFE and Operations), lists the observers' comments and the participants' comments from Scenario 1. The SPV Results TR, Appendices B and C contain the same information for Scenarios 2 and 3.

The applicant listed HFEITS item numbers for those items requiring further resolution in accordance with the HFE issue resolution process discussed in the HFE PMP.

The audit report dated November 30, 2016 (ADAMS Accession No. ML16259A110), documents the following observation by the staff of the applicant's data collection methods:

NuScale used multiple types of data collection tools to measure human performance, workload and situation awareness (SA) during the staffing validation testing. The NRC staff observed the NuScale staff use the following data collection tools:

o Audio/video logs - Used to assess timing, accuracy, and completeness of operator actions.

o Expert Observers - Used to assess timing, accuracy, and completeness of operator actions.

o Simulator logs - Used to assess timing, accuracy, and completeness of operator actions.

o Self-report evaluation
  - National Aeronautics and Space Administration Task Load Index (NASA TLX) - used to assess workload.
  - SA questionnaire - administered via real-time probe (i.e., at predetermined times during the scenarios) to collect self-report information about the scenario events from memory (i.e., an explicit SA metric).

o Post-scenario critiques.

o Stopwatch - Used to assess timing of operator actions.

Additionally, the audit report documents the staff's observations of the applicant's methods for collecting data and controlling the SPV testing. For example, the audit report states that the staff observed that the applicant exercised appropriate test controls by following its testing plan and scenario guidance for interacting with participants during the scenarios. The audit report states that the staff observed that the NuScale staff exercised appropriate test controls during performance of the SPV.

  • Validate conclusions. The conclusion as to whether the objective of the SPV was met is determined by reviewing the results and comparing them to the predetermined acceptance criteria. The staff documents these results below, under Staffing Plan Validation Outcomes.

Because the applicant used simulator testing, the operators were able to perform their assigned roles and responsibilities and also interact together and with the HSI to accomplish the tasks in the scenarios. Before testing, the applicant verified that the simulator provided feedback to the operators that was representative of the expected plant response. The test personnel were able to observe the operators' task performance and also measure workload and situation awareness. Thus, the staff concludes that the demonstration method (i.e., simulator testing) addressed the dynamic aspects of the staffing plan, and the simulator was used appropriately.

Also, because the applicant identified acceptable human performance measures and criteria and also collected data during the testing to evaluate whether those criteria were met, the staff concludes that the scope and data quality were adequate and the data collection and analysis were conducted appropriately. Therefore, the staff concludes that the data collection methods were adequate. The staff discusses the applicant's analysis and interpretation of the data below, under Staffing Plan Validation Outcomes.

Staffing Plan Validation Outcomes

NUREG-1791, Section 10.3.4, Staffing Plan Validation Outcomes, states the following:

The reviewer should confirm that the following criteria have been met, as applicable:

  • The results of analyses demonstrate that control personnel, individually and working in crews, if applicable, can accomplish their tasks within performance criteria.
  • The results of analyses demonstrate that the staffing plan does not result in either excessively high or minimal workload demands on control personnel for the operational conditions considered.
  • The results of the analyses demonstrate that the staffing plan does not compromise control personnel situational awareness.
  • The staffing plan effectively addressed any identified environmental conditions or staffing practices that could potentially degrade individual or crew performance.

The SPV Results TR, Appendix A; Appendix B, Scenario 2 Results Report; and Appendix C, Scenario 3 Results Report, contain the results of the applicant's SPV. Specifically, they contain conclusions as to whether task performance, situation awareness, and workload were acceptable based on the predetermined acceptance criteria. The following discusses the staff's review of the SPV results in these appendices:

  • Task performance was successful. Appendix A, Section A.6, Conclusions; Appendix B, Conclusion section; and Appendix C, Section C.7, Conclusion, document that all scenario tasks met established task performance criteria and were completed successfully.

The staff observed that for the SPV, the applicant used a value of time available for the two risk-important HAs that was greater than the time available for those two actions documented in DCA Part 2 Tier 2, Chapter 19. DCA Part 2 Tier 2, Table 19.1-14, Modeled Human Actions (Post-Initiator), states that the minimum completion time (i.e., the time available) for the two risk-important HAs is 30 minutes. It was not clear to the staff why the applicant used a different value of time available for the SPV than what was in DCA Part 2 Tier 2, Chapter 19. Also, the SPV Results TR, Section 6.1.1, discusses a time ratio that was calculated for these risk-important HAs based on the time available. The staff observed that the SPV Results TR, Section 6.2.1, Scenario Completion Acceptability, also discusses a time ratio that was calculated for the risk-important HAs, which was different from the time ratio in Section 6.1.1. The appropriate acceptance criteria for the time ratio were not clear to the staff. Additionally, if the time available specified in DCA Part 2 Tier 2, Chapter 19, was more limiting than that used for the SPV, then the conclusions about the performance of these tasks documented in the SPV Results TR, Appendix A, Section A.1, Results Summary, would not have been valid. Therefore, the staff issued RAI 9409, Questions 18-37, 18-38, 18-39, and 18-40 (ADAMS Accession No. ML18082B397).

In the response to RAI 9409, Questions 18-37, 18-38, 18-39, and 18-40 (ADAMS Accession No. ML18143B532), the applicant explained that the PRA was revised after the SPV was conducted, and, as a result, the time available for these two risk-important HAs documented in the DCA Part 2 is different than the time available that was used to evaluate the performance of these actions during the SPV. Additionally, the applicant explained that the time ratio criteria documented in the SPV Results TR, Sections 6.1.1 and 6.2.1, are separate and distinct criteria, and therefore both criteria were applicable.

[[ ]]. Thus, although the time available in the DCA Part 2 was more limiting than that used for the SPV, it did not change the applicant's conclusions that the SPV scenarios were completed successfully.

The staff reviewed the performance times of these two risk-important HAs documented in the SPV Results TR, Appendix A, Section A.1, and compared them to the time available in DCA Part 2 Tier 2, Chapter 19, and the time ratios in the SPV Results TR, Sections 6.1.1 and 6.2.1. The staff determined that the results still met the applicant's criteria related to successful scenario performance for the SPV, and the change to time available did not invalidate the results of the SPV. The staff finds the applicant's response acceptable, and therefore RAI 9409, Questions 18-37, 18-38, 18-39, and 18-40, are resolved and closed.

  • The staffing plan does not compromise situation awareness. The SPV Results TR, Appendix A, Section A.3, Situation Awareness Results; Appendix B, Section B.4, Situation Awareness Results; and Appendix C, Section C.4, Situation Awareness Results, summarize the results obtained for situation awareness for each of the scenarios. The staff reviewed the results and found that, in general, situation awareness met the acceptance criteria for each scenario trial such that situation awareness was determined to be adequate. Additionally, the staff found that the applicant provided additional analysis when the majority of personnel missed situation awareness questions, which is consistent with the methodology in the SPV Methodology TR. Given these results, the staff concludes that the staffing plan does not compromise situation awareness.
  • The staffing plan does not result in excessively high or minimal workload demands. The SPV Results TR, Appendix A, Section A.2, TLX Results; Appendix B, Section B.3, TLX Results; and Appendix C, Section C.3, TLX Results, summarize the results obtained for workload in each of the scenarios. The applicant provided workload results for the crew as well as for individuals. The SPV Methodology TR, Section 8.0, Analyze Workload, and Appendix D.2, Instructions to Complete the Workload Worksheets, explain the method the applicant used to calculate the values of workload. The applicant determined weighted and nonweighted workload values. The staff considers weighted TLX ratings and nonweighted TLX ratings to both be valid methods; weighting is typically not used because there is evidence that (1) there is no major difference between weighted and nonweighted TLX scores, and (2) using the nonweighted method might increase experimental validity.
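For context, the standard (publicly documented) NASA-TLX scoring schemes referred to above can be sketched as follows. The raw (nonweighted) score is the mean of the six subscale ratings; the weighted score multiplies each rating by the number of times its subscale was selected in the 15 pairwise comparisons and divides by 15. The ratings and weights below are hypothetical illustrations, not data from the SPV, and this sketch does not represent the applicant's deviating weighting method.

```python
# Standard NASA-TLX scoring sketch; ratings and weights are hypothetical.
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings):
    """Raw (nonweighted) TLX: the simple mean of the six 0-100 subscale ratings."""
    return sum(ratings.values()) / len(ratings)

def weighted_tlx(ratings, weights):
    """Weighted TLX: each rating is scaled by the number of times its subscale
    was chosen in the 15 pairwise comparisons (weights sum to 15), and the
    weighted sum is divided by 15."""
    assert sum(weights.values()) == 15, "pairwise-comparison weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

# Hypothetical example values for one participant.
ratings = {"mental": 70, "physical": 20, "temporal": 60,
           "performance": 30, "effort": 55, "frustration": 40}
weights = {"mental": 5, "physical": 0, "temporal": 4,
           "performance": 2, "effort": 3, "frustration": 1}

print(raw_tlx(ratings))
print(weighted_tlx(ratings, weights))
```

With these hypothetical inputs, the two schemes produce different scores because the weighting emphasizes the subscales judged most relevant to workload, which illustrates why the staff asked whether the weighted and nonweighted values differed significantly.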

The staff found that one aspect of the applicant's method for calculating the weighted workload, as discussed in the SPV Methodology TR, Section 8.0, was not consistent with the method described in the NASA-TLX v. 1.0 Manual. The staff did not know whether the deviation could affect the validity of the weighted workload results, and the staff was also not aware of any research verifying the validity of the applicant's weighting methodology. Also, the applicant only provided the weighted workload results in the application, so the staff could not review the nonweighted results, which were not affected by the weighting methodology. Therefore, the staff issued RAI 9392, Question 18-48 (ADAMS Accession No. ML18180A357), to ask the applicant to explain the reason for deviating from the NASA-TLX weighting method and whether the nonweighted workload values were significantly different than the weighted values included in the application.

In the response to RAI 9392, Question 18-48 (ADAMS Accession No. ML18198A521), the applicant provided a reason for the deviation in the methodology for determining weighted workload. The applicant's rationale did not address whether HFE studies supported the deviation from the NASA-TLX weighting methodology, and therefore the staff could not determine whether the way the applicant performed the weighting impacted the weighted workload scores. However, the applicant also stated in the RAI response that there were no significant differences between the weighted and nonweighted workload values. Therefore, RAI 9392, Question 18-48, is resolved and closed.

Because the applicant stated that there are no significant differences between the weighted and nonweighted values, the staff reviewed the weighted workload values for each scenario in the SPV Results TR, Appendices A, B, and C, to evaluate whether measured workload was acceptable. For Scenario 1, the staff found that all measured TLX values for the total crew and each individual participant were well below the threshold for unacceptable levels of workload (i.e., values of 75-100 are considered undesirable, and levels above 100 are considered unacceptable in accordance with the BNL Tech Report, Section 4.3). [[ ]]. When the threshold was exceeded, the applicant further analyzed the reasons for those participants' perceived relatively higher workload levels. In Appendix A, Section A.6, Conclusion, the applicant explained that improvements to the HSI design could be implemented to help reduce workload for those cases in which relatively higher levels of workload were measured. The staff found that the applicant entered these opportunities for improvement in the HFEITS as shown in Appendix A.5, Post Job Critique Comments.

For Scenario 2, the staff found that all measured TLX values for the total crew and each individual participant were well below the threshold for unacceptable levels of workload, [[ ]]. When the threshold was exceeded, the applicant further analyzed the reasons for those participants' perceived relatively higher workload levels. In the SPV Results TR, Appendix B, Conclusion section, the applicant explained that improvements to the HSI design could be implemented to help reduce workload for those cases in which relatively higher levels of workload were measured. The staff found that the applicant entered these opportunities for improvement in the HFEITS as shown in Appendix B.6, Post Job Critique Comments.

For Scenario 3, the staff found that all measured TLX values for the total crew and each individual participant were well below the threshold for unacceptable levels of workload. [[ ]]. When the threshold was exceeded, the applicant further analyzed the reasons for those participants' perceived relatively higher workload levels. In Appendix C, Section C.7, Conclusion, the applicant explained that improvements to the HSI design could be implemented to help reduce workload for those cases in which relatively higher levels of workload were measured. The staff found that the applicant entered opportunities for improvement in the HFEITS as shown in Appendix A.5.

In general, the staff concludes that most of the participants at some point during the scenarios experienced a relatively high amount of workload. The staff would expect this to occur because the purpose of the SPV scenarios is to simulate challenging, high workload situations. Furthermore, because the SPV participants received relatively minimal training compared to the training that will be provided to actual operators of a NuScale plant, the staff believes that a lack of familiarity with operations may have contributed to higher perceived levels of workload. Perceived workload for actual operators, who will receive much more training, may be lower in the plant. In all cases, measured workload did not exceed unacceptable levels. In nearly all cases, measured workload ratings did not exceed undesirable workload levels (i.e., TLX ratings between 75 and 100); in the one scenario in which the rating did exceed the undesirable level for one participant, and in the other cases in which workload thresholds were exceeded, task performance was not affected. When workload was found to exceed the applicant's thresholds for high workload, the applicant conducted additional analysis to understand the reason. The applicant also identified potential improvements to the HSI design to help reduce workload and entered them for evaluation in the HFEITS. Therefore, the staff concludes that the applicant's staffing plan does not result in excessively high workload demands.

With respect to excessively low workload, the BNL Tech Report, Section 4.3, Identifying Approaches to Workload Analysis, states that levels of reported workload under 75 percent are acceptable provided that the operator is given sufficient work to remain reasonably busy. The staff reviewed the observer comments and the participant comments in Appendices A, B, and C of the SPV Results TR and did not find any comments related to participants experiencing workload that was too low. Situation awareness was found to be acceptable during the scenarios, indicating that participants were actively engaged in the scenarios such that they had sufficient awareness of plant status, and task performance was successful. Thus, for the selected SOC for the SPV, the results did not indicate that workload was too low. However, the SPV cannot fully assess whether the applicant's staffing plan results in excessively low workload demands because the SPV focuses only on the most challenging and high workload scenarios.

Unlike the SPV, the ISV samples a much broader range of operating conditions, including scenarios in which it is highly likely operators could experience low workload levels. As such, whether the applicants staffing plan results in workload levels that are excessively low such that safe operation of the plant may be affected will be evaluated when the applicant completes ISV testing and provides the ISV test results.

Open Item 18-2: The staff will review the ISV results to verify they support the staffing plan when the applicant submits the V&V RSR.

  • Environmental conditions were considered. The SOC consisted of events performed in the MCR. The staff reviewed the scenario descriptions in the SPV Results TR, Appendices D, E, and F, to determine whether any of the events in the SPV scenarios might result in changes to the control room environment that might lead to degraded individual or crew performance. Of these scenarios, the staff observed that the SPV Results TR, Appendix E, Section E.2, Detailed Description, included an event that could potentially affect environmental conditions in the control room and thereby impact human performance. However, it was not clear to the staff whether the MCR environment would experience any changes during this event, and whether those conditions were simulated or otherwise evaluated to determine whether there could be an impact to the staffing plan. Therefore, the staff issued RAI 9392, Question 18-49 (ADAMS Accession No. ML18180A357). In the response to RAI 9392, Question 18-49 (ADAMS Accession No. ML18198A521), the applicant stated that for that particular event, it would not expect the control room environment to change because backup power systems would still be available to provide power to plant systems. Thus, the staff concludes there were no environmental conditions in the SOC for the SPV that could degrade performance, and therefore none needed to be simulated in the SPV. Therefore, RAI 9392, Question 18-49, is resolved and closed.

Step 11: Determine Acceptability of the Request

NUREG-1791, Section 11.1, Discussion, states that the staff is to make a final decision as to whether acceptance of the applicant's proposed staffing plan will provide at least the same level of assurance that public health and safety are maintained as the current regulations require. As a result of the staff's review, the staff determined that the task performance, situation awareness, and workload results were satisfactory, and therefore the SPV results support a proposed minimum staffing of six control room operators on shift in the control room, which was the staffing level validated by the SPV testing. However, because of the iterative nature of the HFE design process, the staffing level is subject to change if the ISV identifies any issues indicating that staffing is not acceptable. Therefore, the staff's final determination on the acceptability of the proposed request is contingent in part upon whether the ISV testing identifies any staffing issues that would change the applicant's proposed staffing level. The final determination is Open Item 18-2.

Furthermore, the staff also observed that for the SPV, the assumed contingent of on-shift control room operators was in the MCR simulator for the duration of the testing, with one exception. [[ ]]. The staff concludes that it is acceptable to not include a test participant to fill the role of this operator because, given the operator's roles as defined in the applicant's ConOps, it is reasonable to assume that this operator will not be available to perform primary tasks during abnormal events. However, when the staff reviewed the proposed rule in DCA Part 7, Section 6, the staff observed that the requirement for six licensed operators is an onsite requirement, and DCA Part 2 provides separate requirements for the minimum numbers of licensed operators inside the control room (i.e., one SRO must be in the control room, and one RO must be at the controls). It is possible during any given shift that some control room staff may be on site outside of the control room, and only one RO and one SRO may be in the control room at any given time. Because the applicant did not simulate this level of staffing during the SPV, the staff issued RAI 9392, Question 18-50 (ADAMS Accession No. ML18180A357), to ask the applicant to explain whether this level of staffing would be evaluated during the ISV.


In the response to RAI 9392, Question 18-50 (ADAMS Accession No. ML18198A521), the applicant stated that several ISV scenarios will simulate cases in which one or two control room operators are on site outside the control room for some portion of the scenario. The operators can be recalled to the control room, and it is expected that they will be able to return within 10 minutes or less. The applicant also said that because no operator actions need to be performed during this timeframe to safely operate the plant (in fact, no actions need to be performed within 30 minutes, and those plant conditions that require actions to occur within 30 minutes are BDBEs), the applicant has not added further reductions in the staffing level to the ISV scenarios because simulating such reduced staffing would only verify that a plant page can be made and control can be maintained for at least 10 minutes. The staff considered the applicant's response and concluded that it is not necessary to simulate scenarios in which only one SRO and one RO are in the control room to verify that the plant can be safely operated at that staffing level for the following reasons:

  • Because at least one SRO and at least one RO must be in the control room at all times, these operators will be present at the start of any design-basis accident that occurs. As discussed in DCA Part 2 Tier 2, Chapter 15, the accident analyses do not credit operator actions to mitigate the consequences of design-basis accidents. Any actions operators would take as directed by their procedures mitigate the consequences of these events; however, performance of these actions is not required to meet the acceptance criteria for these events in DCA Part 2 Tier 2, Chapter 15.
  • Because at least one SRO and at least one RO must be in the control room at all times, these operators will be present at the start of any beyond-design-basis accident that occurs. As discussed in DCA Part 2 Tier 2, Chapter 19, operators are assumed to perform two actions in certain BDBEs that occur as a result of multiple failures of the plant safety systems. Operators can perform these two actions from the control room, and, as the staff observed during the SPV, one operator can perform these relatively simple actions. In the unlikely event that either of these two actions needs to be accomplished and only one RO and one SRO are in the control room to perform them, the staff concludes that there is reasonable assurance that performing these actions is well within the capabilities of one RO and one SRO.

The staff finds that the applicant's proposal to simulate that one or two control room operators are outside the control room during the ISV scenarios incorporates realism into these scenarios.

Therefore, RAI 9392, Question 18-50, is resolved and closed.

Other Review Criteria (Criteria 6.4(3)-(6))

Inputs from TA to S&Q Analyses (Criterion 6.4(3))

Criterion 6.4(3) states that the applicant should use the results of the TA as input to the S&Q analyses. It also states that personnel tasks should be assigned to staffing positions to ensure that jobs are defined, considering task characteristics, team processes, and the person's ability to maintain situation awareness. The TA RSR, Section 2.1, states the following:

Output from TA to other HFE program elements includes the following:

Tasks are arranged into specific job categories and assigned to staff positions (e.g., licensed operators, non-licensed operators). This provides input to the control room staffing plan validation and operator training (Reference 6.2.3) and is analyzed in the staffing and qualifications (S&Q) HFE element.

Tasks are assigned knowledge and abilities (KA) required to perform the tasks.

These KA requirements provide the foundation for the operator training program development.

Additionally, the S&Q RSR, Section 2.1.2, states the following:

As described in the Human Factors Engineering Task Analysis RSR (Reference 6.2.4), TA results are used to determine the crew roles and responsibilities and are used as input to the initial licensed operator staffing level.

Personnel tasks, addressed in TA, are assigned to staffing positions considering:

  • task characteristics, such as the knowledge and abilities required, relationships among tasks, time available, and time required to perform the task;
  • the operator's ability to maintain situation awareness within the area of assigned responsibility;
  • teamwork and team processes such as peer checking; and
  • workload associated with each job within the crew.

The staff concludes that the applicant used the results of TA as an input to the S&Q analyses and assigned tasks to jobs considering the task characteristics, impact on the ability to maintain situation awareness, and teamwork and team processes. Accordingly, the staff finds that the application conforms to this criterion.

Staffing for the Full Range of Plant Conditions and Tasks (Criterion 6.4(4))

Criterion 6.4(4) states that the applicant's staffing analysis should determine the number and qualifications of operations personnel for the full range of plant conditions and tasks (including operational tasks conducted under normal, abnormal, and emergency conditions; plant maintenance; plant surveillance; and testing) and should address how interactions with plant personnel working outside of the control room interface with the operators in the control room.

As discussed in the staff's evaluation of Criterion 6.4(2), the applicant conducted the SPV to determine the minimum number of licensed operators needed in the MCR by simulating challenging, high-workload conditions and evaluating task performance, workload, and situation awareness under those conditions. The SPV simulated normal, abnormal, and emergency conditions and also included tasks related to maintenance, surveillance, and testing. The applicant also simulated interactions with plant personnel outside the control room during the SPV. The S&Q RSR, Section 2.1.1, Initial Staffing Levels, states the following:

The results of the analysis, performed using the methods described above, confirm that up to 12 NuScale power modules and the associated plant facilities may be operated safely and reliably by a minimum staffing contingent of three licensed reactor operators and three licensed senior reactor operators from a single control room during normal, abnormal, and emergency conditions.


The operations personnel are qualified as either licensed RO or licensed senior operator.

When challenging conditions are used to create high workload, and the resulting task performance, situation awareness, and workload measures are found to be acceptable, it is logical to conclude that these measures will also be acceptable under less challenging conditions. However, when workload levels are too low, operators may lose some degree of situation awareness (e.g., operators may shift focus to performing other administrative tasks and may not promptly notice changes in plant status), which could impact task performance (e.g., the time to determine which actions need to be taken may increase, which could be important if any task needs to be performed within a relatively short period of time to ensure the safe operation of the plant). For the following reasons, the staff concludes there is reasonable assurance that, even when underload (i.e., low workload) conditions occur, the NuScale plant can still be safely operated:

  • The applicant's proposed staffing level includes, and the ConOps describes, an operator whose main responsibility is to monitor plant conditions. Therefore, at least one member of the control room team is continuously responsible for monitoring the status of the plant.
  • The applicant's control room design includes an alarm system to notify operators of changes in plant conditions.
  • There are no actions that operators need to take to mitigate the consequences of a DBE, and the few actions that operators do need to take to mitigate the consequences of a BDBE do not need to be taken until a relatively long period of time after event initiation.

Therefore, the staff concludes that the applicant's staffing analysis determined the number and qualifications of operations personnel for the full range of plant conditions and tasks.

Accordingly, the staff finds that the application conforms to this criterion.

Iteration (Criterion 6.4(5))

Criterion 6.4(5) states that the applicant's staffing analysis should be iterative; that is, the initial staffing goals should be modified as information from the HFE analyses from other elements becomes available. The S&Q RSR, Section 2.1.4, Iterative Nature of Staffing Analysis, states the following:

Initial staffing level goals and staffing roles and responsibilities are evaluated and modified, as required, in an iterative fashion through NuScale design change control procedures, through the use of the human engineering discrepancy (HED) process, and as information from other HFE elements and S&Q analyses, evaluations, and tests becomes available.

HEDs are generated during human factors verification and validation (V&V) activities within the NuScale HFE program as described in the Human Factors Engineering Program Management Plan (Reference 6.2.1). Design discrepancies identified during HFE design development activities are resolved as part of the NuScale design process, whenever possible. Those HFE issues that cannot be immediately resolved or that potentially change the initial staffing goals for the MCR or potentially impact their roles and responsibilities are captured in the Human Factors Engineering Issues Tracking System (HFEITS) for evaluation and resolution during staffing plan validation (SPV) or integrated system validation (ISV), as appropriate.

Because the SPV results validated the applicant's initial staffing goal, the applicant did not need to modify that goal following the SPV. If the ISV results indicate that the staffing level needs to be modified, the staff concludes that the applicant has a method of making the necessary changes to the staffing level.

Therefore, the staff concludes that the applicant's staffing analysis is iterative such that the initial staffing goals will be modified as information from the HFE analyses from other elements becomes available. Accordingly, the staff finds that the application conforms to this criterion.

Staffing-Related Issues (Criterion 6.4(6))

Criterion 6.4(6) states that the applicant should address the basis for S&Q levels and lists topics to be considered. The topics are associated with the following HFE elements: OER, FRA/FA, TA, TIHA, procedure development, and training program development.

As discussed in the evaluation of Criterion 6.4(2), the applicant submitted the SPV results in the SPV Results TR as the basis for its proposed MCR operator staffing levels and qualifications.

The applicant also used the results of the OER, the FRA/FA, and the TA as the basis for the staffing level that was validated during the SPV.

The S&Q RSR, Section 3.1.4, Treatment of Important Human Actions, states the following:

The staffing plan validation conducted as part of the S&Q element includes all of the IHAs and confirms the assumptions that IHAs can be conducted within the time available by the minimum licensed MCR staff for all applicable plant operating modes and conditions.

Thus the staff concludes that the applicant considered the effect of the staffing level on the performance of the IHAs by including the IHAs in the SPV, which demonstrated that the crew could perform these IHAs within time constraints.

The S&Q RSR, Section 3.1.5, Procedure Development, states the following:

S&Q analyses use task sequencing from TA as preliminary procedures and assume specific personnel numbers, and a certain level of secondary tasks such as communication. S&Q analyses also consider when task sequencing suggests the concurrent use of multiple procedures. Computer-based procedures are utilized during scenario-based testing of operator and crew performance tests, workload analysis, and situation awareness assessment.

S&Q related HEDs identified during procedure development are entered into the HED database. Procedure development related HEDs that affect human factors V&V scenarios (Reference 6.2.7) are resolved prior to ISV. Other procedure development related HEDs may be resolved prior to completion of the design implementation HFE element (see Reference 6.2.8).


During the SPV, which is discussed under the staff's evaluation of Criterion 6.4(2), the applicant included a scenario that required the crew to use multiple procedures concurrently. The results of the scenario indicated that the staffing level was sufficient to meet demands resulting from the concurrent use of multiple procedures. Thus, the staff concludes that the applicant considered the demands that concurrent procedure use places on the staffing level.

The S&Q RSR, Section 3.1.6, Training Program Development, states the following:

S&Q analyses provide input to the training program development related to knowledge, skills, and abilities to be attained and maintained. As S&Q analyses encompass licensed operator personnel, they provide input essential to coordinating actions between individuals inside and outside the MCR. The training program includes this set of knowledge, skills, and abilities.

S&Q related HEDs identified during training program development are entered into the HED database. Training program development related HEDs are resolved during human factors V&V (Reference 6.2.7) or design implementation (Reference 6.2.8), as applicable.

The criterion specifically addresses concerns with coordinating personnel that are identified in the development of training. The development of training programs is an operational program, which is the responsibility of the COL holder. The applicant explained that any staffing concerns identified during the development of training may be documented as an HED, and the HED can be addressed during design implementation, which is an HFE activity performed by the COL holder. As discussed in Section 18.11.4.4.2 of this SER, the DI IP explains that all HEDs will be closed as part of design implementation activities. Thus, the applicant has identified a means by which staffing concerns may be addressed by a COL holder, and the staff concludes that the applicant considered how concerns with coordinating personnel identified during training development will be addressed.

Accordingly, the staff finds that the application conforms to this criterion.

Combined License Information Items

Table 18.5-1 lists one COL information item related to staffing from DCA Part 2 Tier 2, Section 18.5.1, Objectives and Scope.

Table 18.5-1 NuScale COL Information Items for DCA Part 2 Tier 2, Section 18.5

Item No.: 18.5-1
Description: A COL applicant that references the NuScale Power Plant design certification will address the staffing and qualifications of non-licensed operators.
DCA Part 2 Tier 2 Section: 18.5


The staff concludes that the applicant appropriately assigned the determination of the number of nonlicensed operators to the COL holder because the number will depend in part on the number of units constructed on site. For example, nonlicensed operators will likely have more tasks to perform at a NuScale plant that consists of 12 units than at a NuScale plant that consists of six units.

Conclusion

The staff evaluated the applicant's S&Q analysis method and finds that it conforms to the criteria in NUREG-0711, Section 6.4. Therefore, the staff concludes that the applicant's S&Q analysis method provides for systematically analyzing the required number and necessary qualifications of personnel, in concert with regulatory requirements and task requirements.

Additionally, the staff finds that the results of the applicant's SPV test provide a substantial part of the technical basis for the proposed staffing requirements in DCA Part 7, Section 6, that a COL holder may use in lieu of the requirements in 10 CFR 50.54(m). The results of the ISV test could indicate that changes to the staffing plan are necessary before DC, and therefore the staff's conclusion on the acceptability of the staffing requirements depends in part on confirmation that the ISV test results do not indicate the need for changes to the proposed staffing requirements.

18.6 Treatment of Important Human Actions

Introduction

The TIHA program element identifies the HAs that are most important to safety and considers those HAs in the HFE design of the plant. The design should minimize the likelihood of personnel error and help ensure that personnel can detect and recover from any errors that occur.

Probabilistic and deterministic analyses are used to identify IHAs. The PRA, which includes HRA, identifies risk-important HAs. Deterministic engineering analyses identify important HAs that are credited with the prevention or mitigation of accidents and transients.

Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.6, Treatment of Important Human Actions.

ITAAC: There are no ITAAC associated with this element.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.


Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)

  • NUREG-0711, Revision 3, Human Factors Engineering Program Review Model, Chapter 7, Treatment of Important Human Actions, Section 7.4, Review Criteria

The following documents also provide additional guidance in support of the SRP acceptance criteria to meet the above requirements:

  • NUREG/CR-7202, NRC Reviewer Aid for Evaluating the Human-Performance Aspects Related to the Design and Operation of Small Modular Reactors

Technical Evaluation

The staff used the criteria in NUREG-0711, Section 7.4, to evaluate the methodology applied to generate the results as described in the TIHA RSR. Section 7.4 includes four criteria for this topic. The fourth criterion addresses plant modifications and is not applicable to new reactors; therefore, the staff evaluated the first three criteria as discussed below.

The staff supplemented its review with NUREG/CR-7202, which identifies relevant issues related to SMRs.

Identification of Important Human Actions (Criteria 7.4(1)-(2))

Staff Assessment

The staff reviewed the TIHA RSR, Section 4.1, Identification of Risk Important Human Actions from the PRA/HRA, which identifies two risk-important HAs and briefly summarizes the actions and conditions needed to successfully complete each IHA. The scope of the HFE review is limited to confirming that the HFE process includes those actions identified in the NRC review under SRP Chapter 19, Severe Accidents. The staff confirmed that the HFE process covers the actions currently included in the DCA Part 2 Tier 2, Chapter 19 submittal. However, the staff's review of Chapter 19 has identified potential issues that may or may not change the risk-important HAs. The reviewers of Chapter 19 issued RAI 9128, Question 30706 (ADAMS Accession No. ML17340A626), to request clarification with regard to refueling module drop accidents, which appear to be credible accidents with significant PRA impacts. The Chapter 19 reviewers are currently evaluating the response to RAI 9128, Question 30706 (ADAMS Accession No. ML17340A626). If the response to this RAI yields new IHAs, the applicant will need to include them in the HFE process. Therefore, the staff will need to track the resolution of RAI 9128 before making a determination about the acceptability of the use of risk-important HAs in the HFE process.

Similarly, the scope of the HFE review is limited to confirming that the HFE process includes those actions identified in the NRC reviews under SRP Chapter 7, Instrumentation and Controls, and SRP Chapter 15, Transient and Accident Analysis. The TIHA RSR, Section 4.2.1, Transient and Accident Analysis, indicates that the NuScale design will not credit any operator actions to mitigate anticipated operational occurrences, infrequent events, accidents, or special events associated with the DCA Part 2 Tier 2, Chapter 15 analysis.

Automatic actions are used to place the reactor in a safe state without operator actions. The TIHA RSR, Section 4.2.2, Diversity and Defense-in-Depth Coping Analysis, indicates that the diversity and defense-in-depth coping analysis did not identify any IHAs. The fact that no deterministic actions are credited is a strength of the design, assuming the analyses used to justify this determination are sound. The staff finds that this treatment is consistent with NUREG-0711, Criterion 7.4(2), which indicates that the applicant should identify deterministic actions.

Human factors reviews do not independently confirm the quality or accuracy of the reviews under SRP Chapters 7, 15, and 19. The scope of the HFE review is limited to ensuring that the results of these other reviews are appropriately applied to the human factors process. Because the results of the Chapter 7, 15, and 19 reviews may change during the course of the DCA review, the staff will need to confirm that any changes to IHAs that may occur during the DCA review are also appropriately incorporated.

Conclusion

The staff finds that the treatment currently described in the TIHA RSR is consistent with NUREG-0711, Criteria 7.4(1)-(2). However, the iterative nature of this interaction with the SRP Chapter 19 review would make it inappropriate to draw a conclusion until RAI 9128 and any other RAIs that may be issued related to IHAs in DCA Part 2 Tier 2, Chapters 7, 15, or 19 are adequately resolved.

Therefore, the staff will confirm that NUREG-0711, Criteria 7.4(1)-(2), are still met when the DCA Part 2 Tier 2, Chapter 7, 15, and 19 reviews are complete. The staff will confirm that the HFE process has treated all IHAs appropriately, including any new IHAs that may be identified during the DCA Chapter 7, 15, or 19 reviews. This is Open Item 18-23.

The design implementation activities (described in NUREG-0711, Chapter 12, Design Implementation) can be relied upon to verify that any newly identified IHAs will be properly validated after integrated system validation is complete. Section 18.11.4.1, NUREG-0711, Section 12.3, Applicant Products and Submittals, of this SER discusses Open Item 18-22 to ensure the COL holder adequately performs design implementation activities, including validation of new IHAs.


Treatment of Important Human Actions in the Human Factors Engineering Process (Criterion 7.4(3))

The TIHA RSR, Section 3.3, Addressing Important Human Actions in Other Human Factors Engineering Program Elements, describes the process used to ensure that IHAs are appropriately considered within other HFE program elements. The TIHA RSR, Sections 1.1, Purpose, and 1.2, Scope, include a high-level summary.

The TIHA RSR, Sections 3.3.1-3.3.8, provide additional details on individual HFE program elements, including the following:

  • IHAs are entered into the OER database and assessed to determine whether relevant operating experience may impact them; if so, the IHA is flagged for further analysis.
  • FAs are revisited to reconsider whether initial allocations are appropriate for IHAs.
  • Tasks involving IHAs receive a detailed TA. Evaluations of the S&Q analysis consider the impact of IHAs.
  • Special HSI design considerations are applied to HSIs used to conduct IHAs.
  • Additional evaluations of procedures used to conduct IHAs are conducted, and these procedures are used during the ISV process. The COL applicant is responsible for the final procedures. The TIHA RSR, Section 3.3.6, Addressing Important Human Actions During Procedure Development, indicates that the design implementation process will be used to ensure that COL applicants include IHAs in the final procedures.
  • NuScale will consider IHAs when training ISV participants. This will help to ensure valid ISV results. However, the COL applicant is responsible for the final training program.

The TIHA RSR, Section 3.3.7, Addressing Important Human Actions During Training Program Development, indicates that SRP Section 13.2.1, Reactor Operator Requalification Program; Reactor Operator Training, ensures that COL training programs include training for normal, abnormal, and emergency operating procedures.

Because these procedures include the IHAs, the training program will ultimately address all IHAs.

  • The ISV process uses scenarios designed to challenge operators while performing IHAs.

ISV acceptance criteria for operator performance of IHAs with respect to timing and errors are used to assess performance.

The staff reviewed the TIHA RSR and found that the processes described in the TIHA RSR, Section 3.3 and Sections 3.3.1-3.3.8, use IHAs as inputs to the various applicable human factors program elements. This treatment is consistent with the criterion. However, the RSRs for other human factors program elements have specifically excluded refueling operations from the scope of analysis. The justification for this limitation of scope was unclear, given the potential PRA impact of a refueling module drop (see RAI 9128, Question 19-37 (ADAMS Accession No. ML17340A626)). Section A.5, Treatment of Important Human Actions, of Appendix A, Questions for SMR Applicants Organized by NUREG-0711 Element, to NUREG/CR-7202 identifies specific questions about the IHA element of NUREG-0711 and how they apply to the analysis. Section A.5 considers refueling actions as well as other IHAs that may apply to SMRs. Therefore, the staff issued RAI 9360, Question 18-42 (ADAMS Accession No. ML18180A359), to clarify the limitations of scope for the HFE program. The response to RAI 9360, Question 18-42 (ADAMS Accession No. ML18172A226), draws distinctions between NuScale-designed local control stations (which are typically derivatives of MCR displays and controls) and vendor-supplied local control stations.

The staff considered the capability of the design implementation element of NUREG-0711 to address any emergent issues that may arise after the ISV. The design implementation activities (described in Section 18.11 of this SER) should ultimately identify any changes to risk-important human actions (such as those that may arise while resolving RAI 9128) or any changes resulting from the selection of vendor-designed local control stations. Therefore, the staff found that reliance on the design implementation element to address these concerns is an adequate strategy to resolve RAI 9360.

The staff found that the applicant adequately described how the human factors program addresses risk-important human actions, as discussed above. Because this treatment is consistent with this criterion, the staff finds it acceptable.

Combined License Information Items

No COL information items are associated with Section 18.6 of the DCA Part 2. However, if new IHAs are identified during the course of the NRC review, a new ITAAC or COL information item may be necessary to address them.

Conclusion

The information described in the TIHA RSR adequately considers the important actions currently identified in the Chapter 7, 15, and 19 DCA Part 2 submittals and addresses them in the human factors program. This treatment is consistent with the applicable criteria as described above.

All three criteria in this section depend on other chapter reviews, some of which currently have unresolved RAIs. Because none of these reviews is complete, new IHAs may be identified, or significant changes may occur to those already identified. Therefore, no final determination should be made until the RAIs in those reviews are resolved. The staff will continue to track the progress of the Chapter 7, 15, and 19 reviews as Open Item 18-23. If changes to DCA Part 2 Tier 2, Chapters 7, 15, or 19 occur, the staff will reevaluate the impact on this review element.

18.7 Human-System Interface Design

Introduction

The HSI design element represents the translation of function and task requirements into HSI design specifications. The objective of this review is to evaluate how HSI designs are identified and refined. The review verifies that the applicant has a process to translate functional and task requirements to the detailed design of alarms, displays, controls, and other aspects of the HSI through the systematic application of HFE principles and criteria.


Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.7, Human-System Interface Design.

ITAAC: There are no ITAAC associated with this element.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8), as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts
  • 10 CFR 50.34(f)(2)(xix), with regard to instrumentation for monitoring postaccident conditions that include core damage

SRP Chapter 18, Section III, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections.

  • NUREG-0711, Revision 3, Chapter 8, Section 8.4, Review Criteria
  • NUREG-0700, Revision 2

The following documents also provide additional criteria or guidance in support of the SRP acceptance criteria to meet the above requirements:

  • NUREG-0737, "Clarification of TMI Action Plan Requirements," Supplement 1, "Requirements for Emergency Response Capability," issued January 1983
  • NUREG/CR-7202, NRC Reviewer Aid for Evaluating the Human-Performance Aspects Related to the Design and Operation of Small Modular Reactors, issued June 2015
  • RG 1.97, Criteria for Accident Monitoring Instrumentation for Nuclear Power Plants, with regard to instrumentation for light-water-cooled nuclear power plants to assess plant and environmental conditions during and following an accident
  • NUREG-0696, Functional Criteria for Emergency Response Facilities, with regard to functional criteria for emergency response facilities

Technical Evaluation

The staff reviewed the applicant's HSI Design RSR and Style Guide, which discuss the following aspects of the applicant's HSI design:
  • inputs to the HSI design process
  • the concept of how HSIs are used and an overview of the HSI design
  • the guidance used for the detailed HSI design
  • the detailed HSI description of the MCR, TSC, EOF, RSS, and LCSs, covering their form, function, and performance characteristics
  • how the design minimizes the effects of degraded I&C and HSI conditions on personnel performance
  • the outcomes of tests and evaluations undertaken to support the HSI design

Additionally, NUREG-0711 states the following:

The HSIs with which personnel interact should be designed through a structured methodology guiding designers in identifying and selecting candidate HSI approaches, defining the detailed design, and performing HSI tests and evaluations. The methodology should cover the development and use of HFE guidelines tailored to the unique aspects of the applicant's design, including a style guide to define the design-specific conventions (e.g., colors, symbols) that will be used in the HSI design.

The staff has organized its technical evaluation of the applicant's HSI design into the 11 subsections below that align with those in NUREG-0711, Section 8.4. Additionally, the staff relied on the guidance in NUREG/CR-7202, which identifies potential human performance issues for the staff to consider when reviewing an SMR applicant's HFE design process. Thus, the staff evaluated how NuScale selected and applied HFE guidelines to address potential human performance issues that are unique to the NuScale plant.

Human-System Interface Design Inputs (Criteria 8.4.1(1)-(4))

NUREG-0711, Criterion 8.4.1(1), lists analyses and inputs that the applicant should use early in the design process to identify requirements for HSI design, including operational experience, FRA, FA, TA, and S&Q analysis. Criteria 8.4.1(2)-(4) state that the applicant should identify constraints on HSI design from system requirements, regulatory requirements, and other sources, such as customer requirements, as inputs to the HSI design.

The staff reviewed the OER RSR, Section 3.0, OER Methodology, which describes the applicant's process to identify and screen operational experience for applicability to the NuScale HSI design. The applicant reviewed operational experience from predecessor and related plants and systems, recognized industry HFE issues, related HSI technology, issues identified by nuclear and nonnuclear plant personnel, and results from the treatment of IHAs. The applicant entered findings from this review into the OER database. Members of the HFE design team screened operating experience in the OER database for applicability to the NuScale HFE design. Designers also reviewed OER issues to identify design features to support or enhance human performance for the NuScale design. For each relevant and applicable operating experience item in the database, the HFE design team determined whether the design addresses the item. If not addressed, it became an HFE item tracked in the HFEITS database for resolution by the HFE design team. Therefore, the staff concludes that the applicant used the results of the OER as inputs into the NuScale HSI design, which is consistent with Criterion 8.4.1(1).

The staff reviewed the FRA/FA RSR, which describes how the NuScale plant system functions and safety functions were defined and analyzed to determine tasks, how tasks are performed (manual, automated, or both), and the role of the operator. Safety functions were used as inputs to the design of overview screens in the MCR. Automation criteria established during FA defined the levels of automation for the HSI design. Therefore, the staff concludes that the results of the FRA/FA were used as inputs into the NuScale HSI design, which is consistent with Criterion 8.4.1(1).

The staff reviewed the HFE TA RSR, which describes how TA results establish HSI inventory requirements, including alarms, controls, displays, procedures, and training programs, to support the accomplishment of tasks across a range of plant operating conditions. TA generates the HSI inventory and its characteristics, such as alarm conditions, indication range and resolution, and control function modes and accuracy. Detailed results from TA are used during HSI design to establish alarm logic, display and control designs, procedure step acceptance criteria, and a grouping of HSI inventory. HSI inventory grouping leads to HSIs designed for specific tasks. Task support requirements that were defined during TA are either implemented during HSI design or tracked as an issue for resolution by other engineering processes.

The IHAs determined during the TIHA were analyzed to determine whether HSI characteristics such as reduced screen navigation time and the development of dedicated HSI displays and alarms for the IHA should be implemented during HSI design. Therefore, the staff concludes that the results of the TA and IHA analyses were used as inputs in the NuScale HSI design, which is consistent with Criterion 8.4.1(1).

DCA Part 2 Tier 2, Section 18.7.2.4.1, General Considerations (of HSI Design), explains how S&Q activities were used as an input to the early stages of HSI design. For example, the initial number of video display units and their location in the MCR layout and the hierarchy of individual HSI screens for each workstation were based on job analysis, frequency and sequence of use, and operator roles defined during S&Q analysis. The S&Q RSR, Section 4.5, describes the HSI testing performed during the SPV, and the S&Q RSR, Section 5.0, Analysis Conclusions, describes the results of this HSI testing. Following the SPV, NuScale started recording HEDs, and the HSI used for the SPV established the baseline HSI for HED documentation. The HFE PMP describes the HED process. HEDs related to the HSI design are sent to the HFE design team for resolution and may result in changes to the NuScale HSI design. Therefore, the staff concludes that the results of S&Q analyses were used as inputs in the NuScale HSI design, which is consistent with Criterion 8.4.1(1).

The HSI Design RSR, Section 3.3, Human-System Interface Design Overview, states that the HFE team presents findings and solicits input from the I&C and computer systems design disciplines in order to consider whether the HFE design concepts are technically feasible, with a special emphasis on performance requirements. The HSI Design RSR, Section 4.1.2.1, states that there are no known I&C system constraints related to the MCR layout or HSI design for monitoring and controlling multiple units. Therefore, the staff concludes that the applicant considered I&C system constraints as an input to the HSI design, which is consistent with Criterion 8.4.1(2).

In DCA Part 2 Tier 2, Section 18.7.2.1.3, Regulatory and Other Requirements, the applicant stated that the NuScale HSI design incorporates the regulatory requirements and guidance listed in the applicable elements of NUREG-0711 and NUREG-0700.

The staff found that the applicant's list of regulatory requirements matches the list of regulatory requirements in SRP Chapter 18, Section II.A.7, Human-System Interface, and concluded that the applicant has identified the applicable regulatory requirements as inputs to the HSI design process, which is consistent with Criterion 8.4.1(3). The staff evaluates how the applicant addressed each applicable regulatory requirement in Section 18.7.4.5 of this SER.

The applicant identified other requirements, such as inputs from vendor-supplied LCSs and COL-generated procedures, as inputs to the HSI design, which the staff concludes is consistent with Criterion 8.4.1(4).

Accordingly, the staff finds that the application conforms to NUREG-0711, Criteria 8.4.1(1)-(4).


Concept of Use and Human-System Interface Design Overview (Criteria 8.4.2(1)-(2))

The ConOps describes the roles and responsibilities of operations personnel based on an anticipated control room staff level of six licensed operators and provides an overview of how control room personnel will work with the HSI resources. Section 18.6 of this SER gives a detailed assessment of the control room staffing.

The staff reviewed the ConOps, Sections 2.2.1 and 2.3, and found that these sections contained a detailed description of the roles and responsibilities of the six control room operators. The staff reviewed the ConOps, Section 3.2, Workstations, Displays, and Working Positions, and found that the applicant provided a detailed description of the HSI resources available in the MCR, which includes the sitdown operator workstations, standup unit workstations, a standup common systems panel, and safety display and indication system panels. The applicant also described how personnel will work with these HSI resources and the type of information displayed by these resources. For example, the ConOps, Section 3.2, states that six sitdown operator workstations provide each control room operator with access to displays and controls located on the plant control system (PCS) and module control system (MCS) networks for oversight and plant control activities. Each sitdown workstation includes four video display units, a keyboard, and a mouse. The displays are navigable and contain the alarms, controls, indications, and procedures necessary to monitor and manage any unit chosen by the operator during normal, abnormal, emergency, shutdown, and refueling operations.

The MCR operators interface with other licensed and nonlicensed members of the plant organization for a variety of activities, including maintenance and technical support. According to the ConOps, Section 2.3.2, Operations Crew Interaction, the MCR is also equipped with communication systems to allow for the coordination of plant activities with personnel outside the MCR. For example, communications between the MCR and locations outside the MCR are normally by secure telephone or radio. Within the MCR, operators communicate with teammates to share information, confirm receipt, recommend actions, and give direction. As members of an integrated multiunit team, operators perform differing tasks, and therefore each operator has unique situational information. An operator performing tasks on a specific unit will typically respond to off-normal conditions on that unit depending on the nature of the condition.

The Control Room Supervisor will ensure the appropriate operator response based on the current resource loading. Therefore, the staff concludes that the applicant developed a detailed concept of use that addresses the topics in Criterion 8.4.2(1).

Additionally, the staff reviewed the HSI Design RSR, Section 4.3, HSI Design Overview, and its overview of the NuScale HSI design, including the following topics:

  • the facility layouts and technologies to support teamwork and communication within the MCR and between the MCR and the RSS, the TSC, and the EOF
  • the key HSI resources, including the alarms, displays, controls, and computer-based procedures, and their functionality
  • the responsibilities of the crew for interacting with automatic systems
  • the HFE guidelines and standards that will be applied to the NuScale HSI design

The staff also reviewed the HSI Design RSR, Section 2.2.2, HSI Development Responsibility, which states that the NuScale HSI design incorporates the results of the OER, literature reviews, informal tradeoff evaluations, and consideration of alternatives, tests, and evaluations, which provide the technical basis for demonstrating that the design is state-of-the-art and supports personnel performance. The NuScale simulator was a major part of the applicant's iterative HSI design process. MCR interfaces were built incrementally into the simulator software. Thus, the simulator became the emergent design and was available for rapid prototype testing, as explained in the HSI Design RSR, Section 3.3.4. The HSI Design RSR, Section 3.3.3, Conceptual Sketches, states that conceptual screen sketches of displays were developed, and prototype screens were integrated with the simulator software and tested to solicit feedback from users. Additionally, the HSI design was tested during the SPV, which provided the opportunity for the applicant to collect feedback from operator users, HFE test observers, and operations test observers and make any necessary improvements to the HSI design before the final validation testing (i.e., the ISV). Therefore, the staff concludes that the applicant's HSI design process is iterative, such that the applicant has evaluated alternative HSI designs using feedback from HSI users and from the results of tests and evaluations.

The staff concludes that the applicant has provided an overview of the HSI and described the technical bases. The overview describes the facility layouts, key HSI resources, technologies to support teamwork and communication, and responsibilities of the operators for interacting with automatic systems. The staff also concludes that the NuScale HSI design is an acceptable starting point for a state-of-the-art HFE design for two reasons. First, the HSI design evolved from an iterative process that incorporated operating experience from other designs as well as the results of HSI tests and evaluations. Second, the applicant developed a style guide for the NuScale HSI design based on NUREG-0700, which contains acceptable industry standards for HFE design. Section 18.7.4.3 of this SER includes a detailed review of the applicant's Style Guide.

The staff also finds that the HSI evaluations that will be performed in accordance with the HSI V&V IP provide an additional opportunity to implement technology improvements. Therefore, the staff concludes that the application conforms to Criteria 8.4.2(1)-(2).

Human Factors Engineering Design Guidance for Human-System Interfaces (Criteria 8.4.3(1)-(5))

NUREG-0711, Section 8.4.3, HFE Design Guidance for HSIs, includes five criteria that the staff used to address design-specific HFE design guidance that an applicant should develop and use for HSI features, layout, and environment. NUREG-0711 refers to this type of design guidance as a style guide. For HSI design, the style guide should do the following:

  • Address the scope of HSIs and their form, function, operation, and environmental conditions that are relevant to human performance (Criterion 8.4.3(1)).
  • Contain guidance derived from generic HFE guidance and HSI design-related analyses and reflect the applicants decisions in addressing specific goals of the HSI design (Criterion 8.4.3(2)).
  • Contain precisely expressed individual guidelines and observable HSI characteristics and details for design personnel to use for the purpose of design consistency and verifiability (Criterion 8.4.3(3)).


  • Contain procedures, supplemented with graphical examples, figures, and tables to facilitate comprehension, for determining where and how HFE guidance will be used in the overall design process (Criterion 8.4.3(4)).
  • Be readily accessible and usable by designers, with references to source documents included, and be updated as the design matures (Criterion 8.4.3(5)).

The staff used these five criteria in conjunction with NUREG-0700 to review NuScale's HSI Style Guide.

The staff found that the Style Guide, Section 1.0, Introduction, states that the guidance in the Style Guide is for work location and workstation design at the NuScale plant and that the scope of the standard covers all aspects of plant design. However, the majority of the guidelines listed in the Style Guide, Section 3.7, Workplace Design, are applicable to either the MCR or LCS only. The staff issued RAI 9411, Question 18-45 (ADAMS Accession No. ML18204A188), to ask NuScale to clarify whether the guidance in the Style Guide, Section 3.7, is also applicable to the design of the RSS and TSC. In the response to RAI 9411, Question 18-45 (ADAMS Accession No. ML18164A394), the applicant explained that the Style Guide is applicable to the design of the RSS. However, the Style Guide is not applicable to the design of the TSC because the TSC is designed to comply with NUREG-0696. The staff finds this acceptable because NUREG-0696 contains guidance for the design of emergency response facilities that may be used in lieu of NUREG-0700. RAI 9411, Question 18-45, is a Confirmatory Item pending update to the scope section of the HSI Style Guide.

The staff finds that the Style Guide contains generic HFE guidance derived from NUREG-0700 as well as design-specific HFE standards and guidance that address the form, function, and operation of the HSI resources. The staff found that the Style Guide contained most of the HFE guidance identified in NUREG-0700. The Style Guide did not address some NUREG-0700 criteria because they did not apply to the design or the applicant addressed the criteria in other design documents.

The staff reviewed the Style Guide to determine whether it addressed the environmental conditions relevant to human performance for HSIs. The staff found that the Style Guide, Section 3.7, addressed environmental conditions for HSIs inside the MCR and at LCSs outside the MCR.

With regard to design consistency and verifiability criteria, the staff found that the Style Guide contains design-specific guidance that the applicant has designated as either an HSI requirement or an HSI guideline. The applicant defined HSI requirements as specifications that must be implemented unless the HFE design team approves a deviation. The applicant defined HSI guidelines as best practices for user interfaces; deviations from guidelines do not require approval. With each guideline or requirement, the applicant included HSI Design Criteria, which it defined as HSI characteristics for use when reviewing the acceptability of the HSI.

Volume II of the Style Guide contains a common set of system-dependent HSI requirements and guidelines for visual displays, plant notifications, CSFs and safety parameter displays, computer-based procedures, communication systems, workstation and workplace designs, hardware, and automation. The NuScale HFE design team uses the Style Guide during the cyclical development of individual display pages. As display pages mature during iterative cycles of development, the design team enters information in the Style Guide for each display page. Thus, the staff finds that the Style Guide reflects the applicant's decisions in addressing specific goals of the HSI design. This process creates common user guidance documents based on the applicant's HFE guidance in Volume II for the various HSIs. Volume III of the Style Guide provides HSI descriptions and display page examples for specific HSIs. Style Guide appendices address concepts or techniques that are employed on all systems, such as display page navigation and operator notification. For example, the Style Guide, Appendix C, User Interfaces, contains specific icons for use in NuScale HSI design to ensure consistency across HSIs.

According to the HSI Design RSR, Section 4.5.5, Volume III System HSI Description and Display Page Examples, Volume III of the HSI Style Guide contains chapters with specific information about a system, location, or concept in a NuScale plant. Chapter topics include work location-specific information for the MCR, TSC, RSS, EOF, and LCS. However, the staff reviewed Volume III of the Style Guide and did not find work location-specific information or chapters containing specific information about the TSC, RSS, EOF, and LCS. As a result, the staff issued RAI 9411, Question 18-45 (ADAMS Accession No. ML18204A188), on Volume III of the Style Guide. In the response to RAI 9411, Question 18-45 (ADAMS Accession No. ML18164A394), the applicant stated that Volume III of the HSI Style Guide is intended to contain an HSI library as an example of how the design criteria of Volume II are applied to support maintaining design consistency over time. The applicant stated that the final bullet in the HSI Design RSR, Section 4.5.4, incorrectly states that Volume III of the HSI Style Guide will contain work location-specific information, which is intended to be contained in Appendix G, HFE Design, to the HSI Style Guide. RAI 9411, Question 18-45, is a Confirmatory Item pending revision to the Style Guide.

The staff found that the HSI Style Guide addresses the scope of HSIs and their form, function, operation, and environmental conditions that are relevant to human performance. The HSI Style Guide contains guidance derived from generic HFE guidance and HSI design-related analyses and reflects the applicant's decisions in addressing specific goals of the HSI design.

As such, the applicants style guide conforms to Criteria 8.4.3(1)-(2).

Criterion 8.4.3(3) states that the guidelines in the style guide should be expressed precisely and describe easily observable characteristics. The staff reviewed the applicant's Style Guide and observed that some guidelines are not expressed precisely and that some guidelines do not describe observable HSI characteristics and details. DCA Part 2 Tier 2, Section 18.7.2.3.3, HSI Style Guide, explains that as the HSI design progresses, the guidelines will become more precise and detailed and that easily observable guidance statements will be incorporated into subsequent revisions. The version of the Style Guide submitted with the application and reviewed by the staff provides a snapshot of the HSI design up to a certain point in the design.

The design will continue to evolve as the applicant finalizes the design before conducting V&V activities. The Style Guide, Section 1.1, Purpose, states that the guide is considered a living document, meaning that the applicant will update it continuously as needed because of the iterative nature of the design process. HEDs are used to document and resolve issues identified during the development of the HFE design. HEDs are tracked using the HFEITS, which is available to all HFE team members to track specific needs. The staff concludes that as the design matures, design personnel can use HEDs to document any changes that need to be incorporated in the HFE documents, including the Style Guide. As such, the staff finds that the applicant updates its Style Guide as the design matures.


As explained in the criteria from NUREG-0711 listed above, the applicant should update the Style Guide as the design matures. Thus, the staff finds it acceptable that, as the applicant continues to make design decisions, it will add detailed guidelines to the Style Guide. However, as discussed in Section 18.10.4.2 of this SER, the applicant will perform HFE design verification to confirm that the HSIs conform to HFE guidelines. The applicant explained in DCA Part 2 Tier 2, Section 18.10.2.2.3, Human Factors Engineering Design Verification, that the Style Guide contains the acceptance criteria for design verification. When HSIs do not meet the acceptance criteria, the applicant will document an HED. Thus, the individual guidelines in the Style Guide need to be expressed precisely and need to describe easily observable HSI characteristics by the time the applicant conducts design verification so that the design personnel have verifiable acceptance criteria. Therefore, the staff plans to review the final version of the Style Guide to confirm that the applicant updated it to contain precisely written individual guidelines that are the acceptance criteria for design verification. Additionally, the Style Guide was not incorporated by reference into the DCD as a Tier 2 document. Because the staff has relied on information in the Style Guide to support findings about the HFE design, and because the Style Guide contains design information about the HFE control room design, the Style Guide needs to be part of the DCA Part 2. Given that the applicant will submit the V&V RSR prior to Phase 4 of the DC review, and that the V&V RSR will also need to be added to Table 1.6-2, the staff will ensure these two HFE technical reports are added to Table 1.6-2 when the applicant submits the V&V RSR.

Open Item 18-3: Staff will review the updated Style Guide after the applicant submits the V&V RSR to verify it conforms to Criterion 8.4.3(3), and the staff will ensure the Style Guide and V&V RSR are incorporated by reference in DCA Part 2 Tier 2.

The HSI Design RSR, Section 4.2.1, describes how HFE guidance is used in conjunction with the NuScale simulator software. The staff reviewed this section and found that NuScale developed proprietary software to help ensure that the HFE design guidelines in the Style Guide are applied consistently to the applicable HSIs. The HFE design team uses this software for designing MCR HSIs. Furthermore, instructions in the Style Guide are written so that designers can readily understand them, and the text is supplemented with graphical examples, figures, and tables to facilitate ease of use. Therefore, the staff concludes that the Style Guide is readily accessible and usable by designers.

The HFE PMP, Section 4.2, describes various HFE tools and techniques used to support each of the HFE elements. DCA Part 2 Tier 2, Section 18.1.3, Human Factors Engineering Process and Procedures, states that HFE activities are conducted, and design documents and control processes are retained, in accordance with NuScale's quality assurance program. The staff evaluates the applicant's quality assurance program in Chapter 17 of this SER. As such, the staff finds that the Style Guide is readily accessible by designers.

In summary, the staff finds that the Style Guide contains usable information about the form, function, and operation of HSIs in the NuScale plant and allows for a consistent and verifiable design. The Style Guide lists environmental conditions relevant to human performance for HSIs. The applicant derived the guidelines in the Style Guide from generic HFE guidance in NUREG-0711 and tailored them based on the applicant's design decisions. The guidelines are detailed and contain information about when to use them. Graphical examples and figures supplement the text of the guidelines. The staff observed that each guideline and requirement in the applicant's Style Guide includes a source reference. Because the applicant will update the Style Guide as the design matures, the staff will verify the final version of the Style Guide used as the acceptance criteria for design verification during Phase 4. Accordingly, the staff concludes that the applicant's Style Guide conforms to Criteria 8.4.3(1), 8.4.3(2), 8.4.3(4), and 8.4.3(5).

General Human-System Interface Design and Integration (Criteria 8.4.4.1(1)-(8))

NUREG-0711, Section 8.4.4.1, General, includes eight criteria for this topic that address general attributes of the HSI design. The eighth criterion addresses plant modifications and is not applicable to new reactors; thus, the staff evaluated the first seven criteria as follows:

(1) HSI design supports IHAs (Criterion 8.4.4.1(1))

The staff reviewed the HSI Design RSR to determine how the HSIs in the NuScale MCR minimize the probability that errors will occur and maximize the probability of error detection for IHAs. In general, the [[ ]]. For the two IHAs that NuScale has identified, several deliberate operator actions are required at the standup unit workstation, which is clearly labeled. Procedures have to be used when performing IHAs, which helps reduce human performance errors, and the HSI design provides valve position indication and other parameters to inform the operator that the action has been taken. Additionally, for the two IHAs, [[ ]]. The staff also reviewed the TA RSR, Section 3.3.5, for this criterion.

The staff found that [[ ]]. If an LCS is required for conducting an IHA, that LCS HSI is designed using the same Style Guide as the MCR HSIs. This ensures HSI design consistency, training efficiency, clear labeling, easy accessibility, and avoidance of hazardous locations. For maximizing the probability of error detection, [[ ]]. Because the applicant has designed controls and displays to minimize error for the execution of IHAs and maximize opportunities for error detection, the staff finds that this criterion is met.

(2) HSI layout based on job analysis (Criterion 8.4.4.1(2))

The staff reviewed the HSI Design RSR to determine the basis for the HSI layout and found that the number and location of displays in the MCR, the hierarchy of the individual HSI screens for each workstation, and the arrangement of the workstations within the MCR are based on job analysis, an understanding of the frequency and sequence of use (e.g., startup, shutdown, normal operating, abnormal operating, and accident situations), and the roles defined for operators during S&Q analysis.

The HSI layout in the MCR is specifically designed to support minimum staffing during all operating plant modes. Shared system displays and unit and plant overview displays can be observed from multiple locations within the MCR. Unit workstations are spaced so that side-by-side operation at adjoining units allows sufficient elbow room.

Additionally, an operator at any sitdown workstation can access and display the HSI for all units on any of the screens at the workstation. Dedicated screens continuously display safety parameters. NuScale tested the MCR HSI layout during the SPV and used ISV to validate the MCR HSI layout. Because NuScale used several analyses to determine and then validate the HSI layout in the MCR, the staff finds that this criterion is met.

(3) HSI design supports inspection, maintenance, and testing activities (Criterion 8.4.4.1(3))


The staff reviewed the HSI Design RSR, Section 4.6.1.3, Human-System Interface Support for Inspection, Maintenance, and Testing, to determine whether the HSI design supports inspection, maintenance, testing, and repair of both plant equipment and the HSIs. The staff also verified that HSI maintenance would not interfere with plant control activities. The NuScale Information and Records Management (IRM) system is used to review technical documents, reports, test results, and other work documents to confirm the readiness of structures, systems, and components. Operators use the IRM system to control work and manage component tagging for out-of-service conditions. IRM information is used (directly or indirectly) to communicate status information to the HSI, which uses a shading and color scheme to alert the operators of those conditions on the system displays. This obviates the need for physical tags in the MCR. Because of the capabilities of the IRM system, the staff concludes that the operators can manage maintenance activities to prevent interference with other plant-control activities, and the HSI design supports maintenance and testing of both plant equipment and the HSIs.

Therefore, the staff concludes that this criterion is met.

(4) HSI design supports task performance under conditions of minimal, typical, and maximum staffing (Criterion 8.4.4.1(4))

The staff reviewed the HSI Design RSR, Section 4.6.1.4, Support for Staffing Conditions, to determine how the applicant's HSI design supports personnel task performance under various staffing levels.

Minimum staffing is supported through the NuScale plant's passive features, modular design, and high degree of automation, which reduce the number of alarms, controls, indications, and procedures. The automation, along with the reduced task burden of managing the HSI, enhances the ability of operators to maintain situational awareness of overall plant conditions. The use of minimum staffing to operate the plant safely was confirmed through the analysis of IHAs, TA, and S&Q and preliminarily validated during the control room SPV.

The HSI Style Guide, Section 3.7.2.1, Control Room Configuration, includes a guideline for designers to ensure coverage of controls, displays, and other equipment for all modes of operations. Maximum staffing, including augmented staff during an accident scenario, is accounted for in several aspects of the NuScale design. The ConOps, Section 3.2.5, states that the HSI layout in the MCR is specifically designed to support minimum, nominal, and enhanced staffing during all operating plant modes. A concave MCR layout provides control room personnel with a panoramic view of each of the unit overview displays and the common systems overview display. Large overview displays in the MCR allow any observer or operator in the control room to determine plant status and safety functions. The spatially dedicated, continuously visible safety display indication system (SDIS) panels provide easily accessible information about accident monitoring parameters to anyone in the MCR. The NuScale design includes provisions for a TSC in close proximity to the MCR where additional personnel can congregate to support emergency response functions. A work control center supports operations and maintenance personnel by providing a location outside of the MCR for maintenance-related tasks. Because the HSI design supports task performance for all modes of operations, the staff finds that this criterion is met.

(5) HSI design process accounts for fatigue (Criterion 8.4.4.1(5))


The staff reviewed the HSI Design RSR, Section 4.6.1.5, Human Performance/Fatigue, to determine how the NuScale HSI is designed to enhance human performance by reducing fatigue. Specifically, the automation of plant functions, including many routine tasks, reduces repetitive tasks required of operators. The NuScale design automates most functions to aid the operators in managing the workload for 12 units. SER Section 18.3 discusses automation in detail. HSI displays at each workstation allow the operator to monitor automation activities. In addition, navigation between individual screens at workstations is reduced because of a simplified plant design and through overview displays of plant status. Task-based displays are incorporated to reduce navigation steps during procedure use. The arrangement or hierarchy of individual HSI screens is based on job analysis, the frequency and sequence of use, and operator roles to increase the simplicity of navigation.

In addition, MCR facility attributes that are known to affect fatigue, such as lighting, ergonomics, and overall physical layout, were considered during HSI design and incorporated into the Style Guide.

With regard to the concern for decreased operator vigilance associated with a highly automated, low-workload environment, the staff concludes there is reasonable assurance that even when underload (i.e., low levels of workload) conditions occur, the NuScale plant can still be safely operated for the following reasons:

  • The applicant's proposed staffing level includes, and the ConOps describes, an operator whose main responsibility is to monitor plant conditions. Therefore, at least one member of the control room team is continuously responsible for monitoring plant status. When a nuclear power unit is in MODE 1, 2, or 3, as defined by the unit's TS, each licensee shall have a person holding a senior operator license for the nuclear power unit in the control room at all times. In addition to this senior operator, a licensed operator or senior operator shall be present at the controls at all times. Therefore, the operator assigned to plant monitoring has an additional operator present in the control room with him or her to ensure that the operator remains attentive.
  • The applicant's control room design includes an alarm system to notify operators of changes in plant conditions.
  • The operators do not need to take any actions to mitigate the consequences of a DBE, and the few actions that operators do need to take to mitigate the consequences of a BDBE do not need to be taken until a relatively long period of time after event initiation.

Because the applicant incorporated methods such as automation, task-based displays, and reduced screen navigation into its design, the staff finds that the HSI design process accounts for decrements in human performance as a result of fatigue over the duration of a shift. Therefore, this part of the criterion is met.

(6) HSI characteristics support human performance under a full range of environmental conditions (Criterion 8.4.4.1(6))


The staff reviewed the Style Guide, Section 3.7, which includes guidance for general workplace considerations, including thermal comfort, illumination, auditory environment, and facility layout. The Style Guide, Section 3.7.2, Main Control Room Requirements and Guidelines, identifies design-specific guidelines for the MCR that conform to the guidelines in NUREG-0700 and help ensure that the MCR design supports human performance under the full range of environmental conditions by providing adequate lighting, ventilation, and noise reduction during both normal conditions and credible extreme conditions, such as a loss of lighting and ventilation. The Style Guide, Section 3.7.3, Local Control Stations Requirements and Guidelines, identifies design-specific guidelines for the design of LCSs that conform to the guidelines in NUREG-0700 to ensure that LCS design supports human performance under the full range of environmental conditions. However, for the RSS design, the staff could not locate guidance for how HSI characteristics support human performance under the full range of environmental conditions and how the NuScale application considered the ambient environment and the need for and type of protective clothing. Therefore, the staff issued RAI 9411, Question 18-45 (ADAMS Accession No. ML18204A188), to address this issue. The response to RAI 9411, Question 18-45 (ADAMS Accession No. ML18164A394), clarified that the design of the RSS conforms to the Style Guide requirements and guidelines to support human performance under a full range of environmental conditions. It also noted that operator actions from the RSS are not required or anticipated. Because the design of the RSS conforms to the NuScale Style Guide, which has guidelines that support human performance under a range of environmental conditions, the staff finds this response acceptable. The applicant will update the Style Guide with the information from the response.
Therefore, RAI 9411, Question 18-45, is a Confirmatory Item.

In addition, the HSI Design RSR, Section 4.6.1.6, Environmental Conditions, states that the MCR environmental conditions conform to RG 1.196, Control Room Habitability at Light-Water Nuclear Power Reactors, with regard to air quality and radiation protection. RG 1.196 contains guidance and criteria that the staff considers acceptable for the design of a control room that meets GDC 19. GDC 19 requires that a control room be provided from which actions can be taken to operate the nuclear reactor safely under normal conditions and to maintain the reactor in a safe condition under accident conditions, including a loss-of-coolant accident. GDC 19 also specifies that adequate radiation protection is to be provided to permit access to and occupancy of the control room under accident conditions without personnel receiving radiation exposures in excess of specified values. In Chapter 6 of this SER, the staff reviews the application with regard to the satisfaction of GDC 19 for MCR habitability (radiation protection) under accident conditions.

The staff finds that the applicant considered a full range of environmental conditions, such as temperature, humidity, lighting, noise levels, and radiation, during HSI design and thus meets this criterion.

(7) The applicant has a change process for HSIs in the operating plant (Criterion 8.4.4.1(7))

The staff reviewed DCA Part 2 Tier 2, Section 18.7.2.4.1, and found that HSI modifications in the operating plant are a COL responsibility to be addressed by the COL holder's design change control processes, which are governed by requirements included in the DC rule.


With regard to temporary HSI changes such as setpoints, the staff reviewed the HSI Design RSR and found that the applicant described how the design allows operators to make temporary changes to the HSI, such as modifying setpoints. Additionally, within the primary system process control HSI feature, operators can also modify recommended actions for dilution, boration, and letdown actions within limits. The staff finds that the applicant has identified a method for temporary HSI changes in an operating plant and therefore meets this criterion.

The staff reviewed the HSI Design RSR and found that the design allows MCR operators to create displays through the NuScale HSIs that will allow them to monitor specific parameters as needed. Thus, the staff finds that the applicant meets this criterion for temporary displays that allow monitoring of a specific situation.

Main Control Room Design (Includes Three Mile Island Requirements)

The staff reviewed the HSI Design RSR and the DCA Part 2 to determine how the MCR HSI design meets the 15 criteria outlined in NUREG-0711, Section 8.4.4.2, Main Control Room, which includes the requirements in 10 CFR 50.34(f)(2) related to lessons learned from the accident at the Three Mile Island (TMI) reactors. The staff did not evaluate NUREG-0711, Criteria 8.4.4.2(8)-(9), because they are applicable to boiling-water reactors only, and the NuScale design is a pressurized-water reactor.

(1) 10 CFR 50.34(f)(2)(iv), SPDS

Criterion 8.4.4.2(1) contains guidance the staff uses to determine whether a design complies with the SPDS requirements. The criterion states that applicants should describe the SPDS, addressing the identification of CSFs, the parameters plant personnel will use to monitor each CSF, and how the SPDS conforms to HFE guidelines.

The staff reviewed the HSI Design RSR, Section 4.6.2, Main Control Room; DCA Part 2 Tier 2, Chapter 7, Instrumentation and Controls; and the Style Guide to verify that the design complies with SPDS requirements.

Identification of CSFs

The SDIS (i.e., the NuScale equivalent of the SPDS) includes a spatially dedicated, continuously visible display panel for each unit in the MCR. The applicant identified three CSFs for the NuScale design (reactivity control, remove fuel assembly heat, and containment integrity) that can be monitored for each unit on the SDIS consoles. These CSFs differ from the CSFs defined for other pressurized-water reactor designs in that the NuScale plant CSFs do not include separate functions for reactor coolant system (RCS) integrity, inventory control, and radioactive effluent control. DCA Part 2 Tier 2, Section 7.1.1.2.2, Post-Accident Monitoring, states that the containment integrity CSF includes aspects of the radioactive effluent control function, and the remove fuel assembly heat CSF includes aspects of the RCS integrity function. The staff issued RAI 9435, Question 13.05.02.01-21 (ADAMS Accession No. ML18116A000), to understand more about the CSFs for the NuScale SMR design.

In response to RAI 9435, Question 13.05.02.01-21 (ADAMS Accession No. ML18176A256), the applicant explained how the NuScale staff determined the CSFs and provided additional information about the CSFs. The NuScale design does not include a separate function for RCS inventory control because it is considered an integral part of the reactor core cooling safety function. RCS integrity and radioactive effluent control are addressed as plant safety functions, but they were not identified as safety functions in the PRA. In the NuScale design, the emergency core cooling system (ECCS) provides passive core cooling by retaining primary coolant inside the containment vessel, which facilitates the transfer of heat from the fuel to the ultimate heat sink.

Specifically, the applicant stated the following with regard to the absence of an RCS integrity CSF:

RCS integrity is addressed under the core heat removal CSF by ensuring that the low temperature overpressure (LTOP) system automatically actuates when required. The LTOP system fully actuates ECCS when pressure is above the temperature dependent pressure setpoint. When ECCS actuates, an intentional hydraulic connection between the RCS and CNV [containment vessel] occurs which establishes a natural circulation heat removal path outside of the RCS, but within the containment. A pressurized thermal shock event is not credible at NuScale because of the following factors:

1. All sources of makeup are isolated by the containment isolation system.
2. Actuation of the ECCS system precludes pressurization of the RCS system.
3. The NuScale reactor pressure vessel is designed to withstand the maximum passive system cooldown rate.

The staff finds that NuScale adequately defined and provided the methodology for determining design-specific CSFs, and this RAI response is acceptable.

The HSI Design RSR, Section 4.6.2, states that the NuScale PRA, safety analysis, and plant operations groups used guidance in NUREG-1342 to identify these CSFs for the NuScale design. Additionally, the applicant's response to RAI 9435, Question 13.05.02.01-21, explains that NuScale desired a minimum set of CSFs in order to simplify the diagnostic activities required of the control room operators during potentially complex, confusing conditions. The HFE FRA/FA RSR documents the incorporation of the PRA results into the HFE design.

The staff's evaluation verified the following goals:

  • The group of CSFs should be a straightforward diagnostic tool for the operator.
  • CSFs should account for passive safety systems. CSFs should not rely on automation or alternating or direct current power.
  • The HAs credited by the NuScale plant PRA should align with the CSFs that the HAs are designed to protect.


The staff finds that the applicant considered the unique characteristics of the plant's design in conjunction with guidance in NUREG-1342 and identified CSFs applicable to the NuScale design.

Identification of Parameters to Monitor Each CSF

Institute of Electrical and Electronics Engineers (IEEE) Standard (Std.) 497-2002, IEEE Standard Criteria for Accident Monitoring Instrumentation for Nuclear Power Generating Stations, which is endorsed by RG 1.97, Revision 4, issued June 2006, defines Type B variables as those variables that provide primary information to the control room operators to assess the CSFs. IEEE Std. 497-2002 also describes a method for identifying Type B variables. In DCA Part 2 Tier 2, Section 7.1.1.2.2, the applicant stated that it selected the postaccident monitoring (PAM) variables, including the Type B variables, using the guidance provided in IEEE Std. 497-2002 as modified by RG 1.97, Revision 4.

The staff evaluates the applicant's selection of the PAM variables in Section 7.5 of this SER.

DCA Part 2 Tier 2, Section 7.1.1.2.2, lists the Type B variables that the applicant has identified for the purpose of monitoring the CSFs. The Type B PAM variables displayed on the SDIS are also displayed on the MCS or PCS. DCA Part 2 Tier 2, Section 7.0.4.4, Safety Display Indication System, explains that the module protection system (MPS) and plant protection system (PPS) provide the SDIS data to operators in the MCR. Data from the MPS and PPS are displayed on dedicated monitors, with one monitor per division. Both SDIS divisional displays show both divisions of MPS and PPS data. The spatially dedicated, continuously visible SDIS panels indicate the CSF status and PAM variables for each unit. DCA Part 2 Tier 2, Section 7.2.12, Displays and Monitoring, states that each SDIS display panel presents data derived from both divisions of the MPS or PPS. Each unit has two separate SDIS displays, for a total of 24, allowing for two independent displays of the same parameters for each unit. This gives MCR operators the ability to cross-check data from independent divisions, sensors, and displays while increasing the amount of information and reducing ambiguity. During the July-August 2018 ISV audit (ADAMS Accession No. ML18298A189), the staff observed the applicant's validation of safety function monitoring using the SDIS and found that the displays provide appropriate parameters for monitoring the CSFs. Accordingly, the staff concludes the applicant identified parameters to monitor the CSFs.

Evaluation of SPDS HSIs

The staff also uses the guidance in NUREG-0700, Section 5, Safety Function and Parameter Monitoring System, to verify that the applicant's SDIS HSI conforms to acceptable HFE practices. The staff reviewed the applicant's HFE guidelines for the SDIS in the Style Guide, Section 3.3.2, "Requirements and Guidelines," and compared them to the SPDS HFE guidance in NUREG-0700, Section 5. The staff found that the applicant's Style Guide contained the applicable guidance from NUREG-0700, Section 5. However, for Items 5.1-7, 5.1-8, 5.1-9, 5.1-10, 5.2-1, 5.2-2, 5.3-2, 5.3-3, and 5.4-3, the applicant did not address how it incorporated this guidance into the SDIS design. As a result, the staff issued RAI 8847, Questions 18-6 and 18-7 (ADAMS Accession No. ML17287A002). In the response to RAI 8847, Questions 18-6 and 18-7 (ADAMS Accession No. ML17347B713), NuScale provided additional information on the SDIS, including how (1) it will allow users to comprehend changes in status, (2) the sampling rate for each critical variable will be consistent with user needs, (3) critical variables will be displayed with sufficient accuracy for the user, (4) critical plant variable magnitudes and trends will be displayed, (5) it will assist users in monitoring critical parameters and alerting them when values are out of range, (6) the system provides cues to alert personnel to abnormal conditions that may warrant corrective actions, (7) data are validated in real time, (8) data validation status is displayed to the operator, and (9) display devices are labeled and distinguished from other devices. The staff finds this acceptable because the Style Guide, once updated, will describe how the applicant incorporated guidance from NUREG-0700, Section 5, into the design of the SDIS. RAI 8847, Questions 18-6 and 18-7, are Confirmatory Items. Accordingly, the staff finds that the application conforms to this criterion.

Because the applicant described how the SDIS conforms to the HFE guidelines for the SPDS in NUREG-0700, the staff finds that the applicant's SDIS conforms to acceptable HFE practices.

(2) 10 CFR 50.34(f)(2)(v), bypassed and inoperable status indication (BISI)

The staff reviewed the HSI Design RSR, Section 4.6.2.2, Bypassed and Inoperable Status Indication, to identify how the applicant's HSI assures the automatic indication of the bypassed and operable status of safety systems, which is an indication required by 10 CFR 50.34(f)(2)(v). NUREG-0711, Criterion 8.4.4.2(2), contains guidance the staff uses to determine whether the applicant complied with the BISI requirements, including the location of indications, administrative procedures for bypassing, confirmation that a bypassed safety function is properly returned to service, annunciation capabilities, decision points for reactor shutdown, and information about the operability of the bypass system. Because the NuScale design includes a common control room for up to 12 NuScale power modules (NPM) with HSI for both unit systems and common plant systems visible or accessible to all operators in the MCR, the staff finds that the applicant has complied with the NUREG-0711 guidance for the control room of all affected units to receive an indication of the bypass for their shared system safety functions.

The HSI Design RSR, Section 4.6.2.2, states that the HSI continuously monitors the operability and position status of components that support the plant safety-related functions and updates this information on the appropriate safety system display pages and at the spatially dedicated, continuously visible locations in the MCR.

DCA Part 2 Tier 2, Section 7.2.13.4, Indication of Bypasses, describes how the HSI provides automatic indication of bypassed or deliberately rendered inoperable safety systems. Equipment status information is automatically sent from the MPS to the MCS and SDIS. The MCS provides continuous indication of MPS actions that are bypassed or deliberately rendered inoperable. MCS displays provide the operator with continuous indications of bypass, trip, and out-of-service status; thus, operators can identify the operability of safety functions and equipment in order to determine whether the TS permit continued operation of the reactor. In addition to status indication, the MCS sounds an alarm in the MCR if more than one MPS bypass is attempted for a given 18-101

protection function and allows operators to manually activate bypass indications in the control room.

DCA Part 2 Tier 2, Section 7.2.13.4, states that NuScale evaluated BISI functions as part of the MPS failure modes and effects analysis. DCA Part 2 Tier 2, Section 7.2.13.5, Annunciator Systems, states that an independent monitoring system monitors the status of the MCS and PCS to detect and alert the operator to a loss of the overall I&C system. The Style Guide, Section 3.1.2.2.8, Indication of Proper System Operation, contains an HSI requirement that each NuScale HSI display page contain a heartbeat indication to quickly alert operators when the data on the HSI page have stopped updating. The Style Guide, Section 3.1.2.2.9, Indication of Information Failure, contains an HSI requirement for color coding a device white upon a loss of indication or communications failure. Therefore, the staff concludes that the BISI includes the ability for operators to ensure it is functioning properly.

Contrary to the guidance in NUREG-0711, Criterion 8.4.4.2(2), the staff could not find in the application (1) a description of the provisions for allowing the operations staff to confirm that a bypassed safety function was properly returned to service or (2) information about the arrangement of the BISI so that personnel can determine whether it is permissible to continue operating the reactor. Therefore, the staff issued RAI 9318, Question 18-26 (ADAMS Accession No. ML18079A126), to address these two issues. In the response to RAI 9318, Question 18-26 (ADAMS Accession No. ML18141A866), the applicant explained how the individual unit overview display changes when bypassed safety functions are returned to service and how status indication provided by the HSI, in conjunction with operator use of the Technical Specifications, allows personnel to determine whether it is permissible to continue operation. Because the applicant provided specific information as to how operators can determine when a function has been returned to service and whether continued operation is permitted, the staff found the response acceptable. RAI 9318, Question 18-26, is a Confirmatory Item pending changes to the HSI Design RSR.

The staff finds that the BISI is on displays in the MCR and provides automatic indication of safety system status, safety subsystem status (operable and inoperable), and the deliberate bypass of a safety function and associated systems. Because MCR operators are alerted in the event of a failure of the digital computer-based I&C system, and the HSI has designated indications for the loss of indication or communications failure for a component, the staff finds that the BISI alerts operators about BISI system failures and that the operators can verify BISI status. Operators can use the TS in conjunction with bypass status information in the MCR to determine whether it is permissible to continue operating the reactor. Accordingly, the staff finds that NuScale's BISI system meets the criteria in NUREG-0711 and the requirements for BISI in 10 CFR 50.34(f)(2)(v).

(3) 10 CFR 50.34(f)(2)(xi), relief and safety valve indication

The staff reviewed the NuScale DCA Part 2 and the HSI Design RSR to determine how the applicant's HSI assures the direct indication of relief and safety valve position (open or closed) in the control room, which is an indication required by 10 CFR 50.34(f)(2)(xi) and addressed in NUREG-0711, Criterion 8.4.4.2(3).

DCA Part 2 Tier 2, Section 6.3.1, Design Basis, states that valve position indication is provided in the MCR for the five ECCS valves, the trip and reset actuator valves, and the reactor safety valves (RSV). Solenoid power indication for the ECCS trip and reset valves is also provided in the MCR.

The staff reviewed the HSI Design RSR, Section 4.6.2(3), Relief and Safety Valve Position Monitoring, which describes how the HSI provides position indication of these valves in the MCR. The staff found that the applicant identified the specific control room displays that convey this information to operators and how the I&C system provides the indication to the displays. The staff reviewed the HSI Design RSR, Section 7.0, HSI Display Page Examples, which contains display page examples, but the staff did not find all of the required valve indications. Therefore, the staff issued RAI 9318, Question 18-27 (ADAMS Accession No. ML18079A126). In the response to RAI 9318, Question 18-27 (ADAMS Accession No. ML18141A866), the applicant explained that the reactor safety valves and reactor pressure vessel relief valves are synonymous. The applicant will update the HSI Design RSR with the term reactor safety valve for consistency. During the July-August 2018 ISV audit (ADAMS Accession No. ML18298A189), the staff observed that position indication for the ECCS valves and RSVs is available for each unit on the SDIS displays. The staff concludes that the applicant's HSI allows for the direct indication of relief and safety valve positions in the control room and thus that this requirement is met. RAI 9318, Question 18-27, is a Confirmatory Item pending changes to the HSI Design RSR.

(4) 10 CFR 50.34(f)(2)(xii), manual feedwater control

NUREG-0711, Criterion 8.4.4.2(4), for the applicant to describe how the control room HSI provides automatic and manual initiation of the auxiliary feedwater system and indication of flow, does not apply because the NuScale design does not include an auxiliary feedwater system. The staff confirmed that the requirements in 10 CFR 50.34(f)(2)(xii) for automatic and manual auxiliary feedwater initiation and system flow indication in the MCR are not technically relevant to the NuScale SMR design. As stated in NuScale DCA Part 7, in which the applicant requested an exemption from the portion of 10 CFR 50.62(c)(1) requiring diverse equipment to initiate a turbine trip under conditions indicative of an anticipated transient without scram (ATWS), "The NuScale Power Plant design does not include an auxiliary or emergency feedwater system."

(5) 10 CFR 50.34(f)(2)(xvii), containment monitoring

The staff reviewed the HSI Design RSR and DCA Part 2 Tier 2, Chapter 7, for information about how the control room alarms and displays inform personnel about (A) containment pressure, (B) containment water level, (C) containment hydrogen concentration, (D) containment radiation intensity (high level), and (E) noble gas effluents for all potential accident release points. NUREG-0711, Criterion 8.4.4.2(5), contains guidance the staff uses to verify the requirements to provide instrumentation to measure, record, and display these same parameters (A-E) in the control room.

(A) containment pressure and (B) containment water level

The staff reviewed the HSI Design RSR, Section 4.6.2, and found that the applicant identified the HSIs in the MCR that provide indication of containment pressure and containment water level. In DCA Part 2 Tier 2, Section 7.2.13.5, the applicant stated that alarms alert the operators about deviations from setpoints, excessive rates of change, high or low process values, and contact changes of state from normal.

Therefore, the staff concludes that the applicant described how the HSIs provide information to the operators about containment pressure and water level.

(C) containment hydrogen concentration

In the HSI Design RSR, Section 4.6.2, the applicant stated that it seeks an exemption from supplying containment hydrogen concentration parameters. However, DCA Part 7 does not contain a request for an exemption from either 10 CFR 50.44(c)(4) or 10 CFR 50.34(f)(2)(xvii)(C). Furthermore, DCA Part 2 Tier 2, Section 7.2.13, Displays and Monitoring, states that, consistent with 10 CFR 50.34(f)(2)(xvii)(C) and 10 CFR 50.44(c)(4), the containment process sampling system includes nonsafety-related oxygen and hydrogen analyzers to continuously monitor the concentrations of these gases in the containment environment during operation and beyond-design-basis conditions. The analyzers are designed to be functional and reliable and to meet the design criteria discussed in Regulatory Position C.2 of RG 1.7, Control of Combustible Gas Concentrations in Containment. The hydrogen analyzer output signal is sent to the MCS, which can provide readout in the MCR. Additionally, local indication serves as a backup display or indication if information from the MCS cannot be displayed in the control room after an accident.

Because of this conflicting information, the staff issued RAI 9318, Question 18-28 (ADAMS Accession No. ML18079A126). In the response to RAI 9318, Question 18-28 (ADAMS Accession No. ML18141A866), the applicant committed to correcting the information in the HSI Design RSR so that it is consistent with DCA Part 2 Tier 2, Chapter 7 and DCA Part 7. The applicant clarified that containment hydrogen concentration parameters will be displayed in the MCR and stated that continuous monitoring of the containment hydrogen level for combustible gas control will be unavailable when the containment is isolated during an accident scenario. The applicant explained that containment hydrogen monitoring is not necessary immediately after an event because hydrogen combustion scenarios that occur within 72 hours8.333333e-4 days <br />0.02 hours <br />1.190476e-4 weeks <br />2.7396e-5 months <br /> following event initiation have no adverse effect on containment integrity or plant safety functions. The applicant stated that its analysis of combustion events demonstrates that no compensatory measures or mitigating actions are required for any scenario within the first 72 hours8.333333e-4 days <br />0.02 hours <br />1.190476e-4 weeks <br />2.7396e-5 months <br />. For the period after 72 hours8.333333e-4 days <br />0.02 hours <br />1.190476e-4 weeks <br />2.7396e-5 months <br />, the applicant stated that containment gas sampling and monitoring can be reestablished to support postaccident sampling. In DCA Part 2 Tier 2, Chapter 6, the applicant states that containment hydrogen monitoring equipment is capable of functioning following a significant beyond design-basis accident. In Chapter 6 of this SER, the staff evaluates combustible gas concentrations during accidents and how NuScale meets the requirements in 10 CFR 50.44, Combustible Gas Control for Nuclear Power Reactors. 
As stated in Chapter 6 of this SER, the staff finds that a combustion event in the first 72 hours after an accident does not threaten containment integrity. The staff concludes that because containment integrity is not threatened, it is not necessary to monitor containment hydrogen concentration for the first 72 hours following an event. The staff finds that reestablishing containment hydrogen concentration monitoring in the period after 72 hours is acceptable for meeting the requirements associated with sampling in 10 CFR 50.44(c).

Therefore, the staff concludes that the applicant described how the HSIs provide information to the operators about containment hydrogen concentration.

(D) containment radiation intensity (high level)

The HSI Design RSR, Section 4.6.2, explained why the MCR indication of containment radiation levels is not necessary. However, this contradicts information provided in DCA Part 2 Tier 2, Section 12.3.4, Area Radiation and Airborne Radioactivity Monitoring Instrumentation, which states that the area and airborne radiological monitoring equipment is designed to monitor containment radiation levels, conforming to 10 CFR 50.34(f)(2)(xvii). Therefore, the staff issued RAI 9318, Question 18-28 (ADAMS Accession No. ML18079A126). In the response to RAI 9318, Question 18-28 (ADAMS Accession No. ML18141A866), the applicant deleted this statement in the HSI Design RSR about containment radiation monitoring and stated that it credits the radiation monitor under the bioshield for this requirement. RAI 9318, Question 18-28, is a Confirmatory Item pending changes to the HSI Design RSR.

DCA Part 2 Tier 2, Section 12.3.4.2, Fixed Area Radiation Monitoring Instrumentation, provides information about the radiation monitors under the bioshield, which are used to detect fuel damage under accident conditions and are considered PAM system B and C variables. Two monitors are located at the top of each NPM beneath the bioshield. The radiation monitors under the bioshield are environmentally qualified to survive an accident and perform their design functions. In Chapter 12 of this SER, the staff evaluated the radiation monitors under the bioshield and found that they meet the requirement for the applicant to provide instrumentation to monitor containment radiation. NuScale area radiation monitors provide both indication and alarm functions to the local plant area, the MCR, and, for selected areas, the waste management control room. This ensures operator and worker awareness of changing radiological conditions that could indicate system leakage or component malfunction and provides a warning to plant personnel before they enter the affected areas. Where appropriate, local visual alarms are provided outside of the monitored area to ensure worker awareness before they enter the affected area. Thus, the staff concludes that visual and audible indications of containment abnormal conditions, and thus high radioactivity, are provided to operators in the MCR.

(E) noble gas effluents for all potential accident release points

The HSI Design RSR, Section 4.6.2, explained why an HSI for noble gas effluent monitoring is not necessary. However, DCA Part 2 Tier 2, Section 11.5, Process and Effluent Radiation Monitoring Instrumentation and Sampling System, states that monitoring and sampling equipment has been designed to provide monitoring and sampling instrumentation for measuring and recording noble gas radiological data at release points and that the system also provides continuous monitoring and sampling of radioactive iodine and particulates in gaseous effluents. Because of this contradictory information, the staff issued RAI 9318, Question 18-28 (ADAMS Accession No. ML18079A126). In the response to RAI 9318, Question 18-28 (ADAMS Accession No. ML18141A866), the applicant clarified that noble gas effluent monitoring will be available from the MCR for each potential effluent location. The applicant also provided a proposed revision to the HSI Design RSR to explain that the HSI includes alarms for specific emergency plan-related noble gas release rates and rates below emergency declaration setpoints that notify the operators of changing radiological conditions.

Therefore, the staff concludes that the applicant described how the HSIs provide information to the operators about noble gas effluents. RAI 9318, Question 18-28, is a Confirmatory Item pending revision of the HSI Design RSR.

Thus, the staff finds that the NuScale MCR HSIs inform personnel about containment conditions and meet the requirements for containment monitoring in 10 CFR 50.34(f)(2)(xvii). The applicant described how the HSIs inform operators about containment conditions.

(6) 10 CFR 50.34(f)(2)(xviii) - core cooling

The staff reviewed the NuScale DCA Part 2 and the HSI Design RSR using the guidance in NUREG-0711, Criterion 8.4.4.2(6), to determine how the applicant's HSI provides unambiguous indication of inadequate core cooling (ICC) in the control room, an indication required by 10 CFR 50.34(f)(2)(xviii).

In DCA Part 2 Tier 2, Section 7.2.13.6, Three Mile Island Action Items, the applicant stated that the following parameters are used to monitor ICC in the control room to satisfy the requirements of 10 CFR 50.34(f)(2)(xviii): core exit temperature, wide-range reactor coolant pressure, degrees of subcooling, wide-range reactor coolant hot temperature, reactor pressure vessel water level, and containment water level. The staff reviewed the HSI Design RSR, Section 4.6.2, and found that the applicant identified the displays in the control room that provide the operators with these indications of ICC.

However, the staff did not find all of these indications on the examples of HSIs provided in the HSI Design RSR. As a result, the staff issued RAI 9318, Question 18-29 (ADAMS Accession No. ML18079A126). In the response to RAI 9318, Question 18-29 (ADAMS Accession No. ML18141A866), the applicant clarified that the SDIS will provide the core cooling information to MCR operators. The staff compared the list of parameters provided on the SDIS as described in DCA Part 2 Tier 2, Chapter 7, to the list of ICC variables and found that the SDIS does include the ICC variables. The staff finds that the parameters available on the SDIS that provide indications of ICC are a suitable combination of information that indicates primary coolant saturation and coolant levels in both the reactor vessel and containment vessel. RAI 9318, Question 18-29, is resolved and closed.

(7) 10 CFR 50.34(f)(2)(xix) - PAM

The staff reviewed the NuScale DCA Part 2 and the HSI Design RSR using NUREG-0711, Criterion 8.4.4.2(7), to determine how the applicant's HSI assures the monitoring of plant and environmental conditions following an accident, including core damage, as required by 10 CFR 50.34(f)(2)(xix).

The NuScale design allows for the monitoring of plant and environmental conditions following an accident through the display of PAM variables on the SDIS and through MCS and PCS interfaces.

DCA Part 2 Tier 2, Section 7.1.1.2.2, Post-Accident Monitoring, states that the NuScale design has no PAM Type A variables because no operator actions are credited in any SRP Chapter 15 anticipated operational occurrence, infrequent event, or accident, or in the station blackout (SBO) or ATWS analyses. The staff verified that DCA Part 2 Tier 2, Chapters 15 and 7, did not credit operator actions, and therefore the staff agrees that the NuScale design has no Type A variables. In Section 7.2.13 of this SER, the staff verifies that Type B, C, D, and E variables conform to the performance, design, and qualification criteria in Sections 5 through 9 of IEEE Std. 497-2002, as modified by RG 1.97, Revision 4.

In the discussion below, the staff verifies that Type B, C, D, and E variables conform to the display criteria in Sections 5 through 9 of IEEE Std. 497-2002, as modified by RG 1.97, Revision 4.

Using the display criteria in IEEE Std. 497-2002, the staff reviewed the SDIS screenshots in the HSI Design RSR, Figure 7-3, SDI Page, and the information in DCA Part 2 Tier 2, Section 7.2.13.2, System Status Indication. The display criteria in IEEE Std. 497-2002 for the PAM variables include precision, format, units, response time, ease of use, and how PAM variables are uniquely identified. The PAM variables are displayed in the MCR on the SDIS, MCS, and PCS. Type E PAM variables are only displayed on the MCS and PCS. The staff issued RAI 9318, Question 18-30 (ADAMS Accession No. ML18079A126), to resolve inconsistencies about the display of Type D variables and to understand how Type E variables are displayed in the MCR. The response to RAI 9318, Question 18-30 (ADAMS Accession No. ML18141A866), included a revised figure of the SDIS display to show how the NuScale SDIS displays the PAM variables listed in DCA Part 2 Tier 2, Chapter 7.

The staff compared the PAM variables listed in Revision 1 of DCA Part 2 Tier 2, Chapter 7, Table 7.1-7, Summary of Type A, B, C, D and E Variables, with the SDIS sample page provided in the RAI 9318 response as the revised HSI Design RSR, Figure 7-3, and found that the SDIS displays contain the Type B, C, and D PAM variables as stated by the applicant in the RAI response.

With regard to the 15 PAM Type E variables identified by the applicant, the applicant stated in response to RAI 9318, Question 18-30, that the Type E variables are displayed through the MCS and PCS interfaces. Next, the staff reviewed how the Type B, C, D, and E PAM variable displays meet the display criteria in IEEE Std. 497-2002. The applicant addressed HFE design display characteristics, such as the precision, format, units, and response time of the PAM variables, through the NuScale HSI Style Guide.

The staff found that the Style Guide, Section 3.3, Safety Display and Indication System, as revised by the applicant's response to RAI 8847, contains criteria for PAM variable display on the SDIS. The staff found that Type B and C variables are uniquely identified as accident monitoring variables because of the spatially dedicated, continuously visible SDIS displays in the MCR, which constantly display the PAM variables. RAI 9318, Question 18-30, is a Confirmatory Item pending changes to the HSI Design RSR. Once the applicant has made the changes to both the HSI Design RSR and the HSI Style Guide for the SDIS, the staff can conclude that the applicant meets this criterion for determining how the applicant's HSIs display the PAM variables.

(8) 10 CFR 50.34(f)(2)(xxvi) - leakage control

Regulations in 10 CFR 50.34(f)(2)(xxvi) require an applicant to provide for leakage control and detection in the design of systems outside containment that contain or might contain radioactive materials. NUREG-0711, Criterion 8.4.4.2(10), states that an applicant should describe how the HSI provides for leakage control and detection in the design of systems outside containment that contain (or might contain) accident source term radioactive materials after an accident.

Using this criterion, the staff reviewed the HSI Design RSR, Figure 7-3, and observed that it does not include leakage control and detection parameters for systems outside containment, as stated in the HSI Design RSR, Section 4.6.2(8), Leakage Control. As a result, the staff issued RAI 9318, Question 18-31 (ADAMS Accession No. ML18079A126). The response to RAI 9318, Question 18-31 (ADAMS Accession No. ML18141A866), stated that the applicant revised the HSI Design RSR, Section 4.6.2, Main Control Room, and Figure 7-3, to provide this information.

NuScale stated that leakage control and detection parameters for systems outside containment are provided on display pages available at the MCR sitdown operator workstation VDUs. Parameters for leakage control and detection include system flows, pressures, tank levels, radiation levels, and alarms. The staff finds this response acceptable because it included proposed revisions that the staff reviewed and found to address how leakage control and detection are provided. RAI 9318, Question 18-31, is a Confirmatory Item pending revision of the HSI Design RSR with these changes.

(9) 10 CFR 50.34(f)(2)(xxvii) - radiation monitoring

Using NUREG-0711, Criterion 8.4.4.2(11), and 10 CFR 50.34(f)(2)(xxvii), which requires an applicant to provide monitoring of in-plant radiation and airborne radioactivity as appropriate for a broad range of routine and accident conditions, the staff reviewed DCA Part 2 Tier 2 and the HSI Design RSR to determine how NuScale's MCR HSI provides appropriate monitoring of in-plant radiation and airborne radioactivity under a broad range of routine and accident conditions.

In DCA Part 2 Tier 2, Section 12.3.4, the applicant stated that both fixed area radiation monitors and continuous airborne radiation monitors placed in selected plant locations provide MCR indication of radiation levels and MCR alarms when predetermined thresholds are exceeded. DCA Part 2 Tier 2, Tables 12.3-10, 12.3-11, and 12.3-12, list the area radiation monitors and continuous airborne radiation monitors. The applicant stated that these radiation monitors provide plant area and airborne radiation level monitoring for a broad range of routine and accident conditions, thus conforming to 10 CFR 50.34(f)(2)(xxvii).

The HSI Design RSR, Section 4.6.2, states that radiation monitoring is a shared system for all units and that in-plant radiation and airborne radioactivity for the range of routine and accident conditions are displayed on the common systems panel VDU in the MCR.

Because MCR HSIs allow for the monitoring of in-plant radiation and airborne radioactivity under routine and accident conditions, the staff finds that the application conforms to this NUREG-0711 criterion and complies with 10 CFR 50.34(f)(2)(xxvii).

(10) manual initiation of protective actions

The staff reviewed DCA Part 2 Tier 2, Chapter 7, and the HSI Design RSR to determine how the NuScale HSI design meets NUREG-0711, Criterion 8.4.4.2(12), for supporting the manual initiation of protective actions at the system level for safety systems otherwise initiated automatically.

DCA Part 2 Tier 2, Section 7.2.12.2, Manual Control, states that the MPS conforms to RG 1.62, Manual Initiation of Protective Actions, Revision 1, issued June 2010.

Division I and Division II manual actuation switches are provided in the MCR for each of the following protective actions for the MPS:

  • containment isolation
  • chemical and volume control system isolation
  • pressurizer heater trip
  • low-temperature overpressurization protection

The staff found that the HSI provided for manual actuation of protective actions in the MCR as shown in the HSI Design RSR does not match the applicant's list in DCA Part 2 Tier 2, Chapter 7. As a result, the staff issued RAI 9318, Question 18-32 (ADAMS Accession No. ML18079A126). The response to RAI 9318, Question 18-32 (ADAMS Accession No. ML18141A866), explained that, although some functions were not modeled in the simulator, they are included in the plant design as discussed in DCA Part 2 Tier 2, Chapter 7. The staff finds this acceptable because the applicant included the manually actuated protective actions in the application and documented a simulator deficiency to track two of the manual actions that were not modeled in the simulator.

The staff will track this item as part of the V&V element of the review in Section 18.10 of this SER.

The staff also used the guidance in RG 1.62, Revision 1, to review the HSI design of the applicants manual initiation switches. The staff found that the control interfaces for these manual initiations are located in the control room on a division-level basis. They are easily accessible to the operator so that action can be taken in an expeditious manner during plant conditions for which the protective actions of the safety systems need to be initiated. The HSI supports acknowledgement of safety function operation.

RAI 9318, Question 18-32, is a Confirmatory Item pending the addition of this information to the HSI Design RSR.

Because NuScale described how the HSI supports the manual initiation of protective actions at the system level for safety systems otherwise initiated automatically, and the HSI features for manual initiation meet the guidance in RG 1.62, the staff finds that NuScale meets this criterion.

(11) diversity and defense in depth

The staff reviewed DCA Part 2 Tier 2, Chapter 7, and the HSI Design RSR to determine how the NuScale HSI provides displays and controls in the MCR for manual, system-level actuation of CSFs and for monitoring those parameters that support them. These displays and controls should be independent of, and different from, the normal I&C. This criterion corresponds to NUREG-0711, Criterion 8.4.4.2(13), and Point 4 in SRP Branch Technical Position 7-19, Guidance for Evaluation of Diversity and Defense-in-Depth in Digital Computer-Based Instrumentation and Control Systems.

DCA Part 2 Tier 2, Section 7.2.12.2, states that MCR operators can use the safety-related enable nonsafety control switch for manual component-level control of engineered safety feature (ESF) equipment. This control is overridden by any automatic or manual safety-related signal within the actuation priority logic. DCA Part 2 Tier 2, Section 7.1.5.3, Diversity and Defense-in-Depth Assessment Regulatory Conformance, states that the SDIS provides independent and diverse display of CSFs. Chapter 7 of this SER documents the staff's evaluation of the independence and diversity of the display and manual controls.

The staff finds that NuScale meets this NUREG-0711 criterion because it describes how the HSI provides displays and controls in the MCR for the manual actuation of CSFs and for monitoring those parameters that support them. In addition, the displays and controls are different from the normal I&C.

(12) IHAs

NUREG-0711, Criterion 8.4.4.2(14), states that the applicant should describe how the HSI provides the controls, displays, and alarms that ensure the reliable performance of identified IHAs.

The staff reviewed the TIHA RSR to understand how the applicant identified IHAs. The applicant has identified two IHAs associated with the NuScale design, listed in the TIHA RSR, Section 4.0, Summary of Results. No IHAs are required to be performed outside the MCR and RSS during normal, abnormal, and emergency operating conditions. The TIHA RSR, Section 4.3.5, HSI Design, and the HSI Design RSR, Sections 4.6.2 and 4.4.5.5, Alarm Definition and Criteria, describe HSI design features intended to reduce the human error probability for the IHAs. When reviewing these sections of the RSRs, the staff found that the operator actions needed to complete the two IHAs are relatively simple and require few steps. MCR displays and alarms inform the operator of the status of the plant, and the required actions and procedures direct the operators when to take these actions based on plant status.

Because the applicant has described how the HSI provides the controls, displays, and alarms to ensure the operators can perform IHAs, the staff finds that the application conforms to this criterion.

(13) computer-based procedure platform

The staff reviewed the application using NUREG-0711, Criterion 8.4.4.2(15), to determine whether NuScale computer-based procedures are consistent with the design review guidance in NUREG-0700, Section 8, Computer-Based Procedure System, and in Section 1, Computer-Based Procedures, of DI&C-ISG-05, Task Working Group #5: Highly-Integrated Control Rooms - Human Factors Issues (HICRHF), Revision 1, dated November 3, 2008.

The staff found that the design guidance for computer-based procedures in the Style Guide is consistent with the guidance in NUREG-0700, Section 8, except that the Style Guide did not address NUREG-0700, Criteria 8.2.2-10 and 8.3.1-1. The HSI Design RSR states that NuScale computer-based procedures are designed in accordance with the guidance in Section 1 of DI&C-ISG-05, Revision 1, but that neither the computer-based procedures nor the paper-based procedures are part of the HSI Design RSR. The staff could not find information about how the applicant's computer-based procedures are consistent with the design guidance in Section 1 of DI&C-ISG-05, Revision 1. Therefore, the staff issued RAI 9318, Question 18-33 (ADAMS Accession No. ML18079A126), to address these issues. In response to RAI 9318, Question 18-33 (ADAMS Accession No. ML18141A866), the applicant explained how NuScale computer-based procedures meet NUREG-0700, Criteria 8.2.2-10 and 8.3.1-1. The applicant also provided a detailed explanation of how the computer-based procedures meet each of the 30 general review criteria of DI&C-ISG-05, Section 1, with the exception of general review criterion 20, which is not applicable to new designs. The staff reviewed the RAI response and compared it to the criteria in DI&C-ISG-05, Revision 1. During the July-August 2018 ISV audit, the staff sampled NuScale computer-based procedures (CBPs) available in the Process Library against a sample of the HFE design guidelines and found that the applicable criteria were met or that the applicant had sufficient justification for not meeting the guidance. For example, operators are unable to take notes in the CBPs (NUREG-0700, Criterion 8.3.3-3, Note Taking). The applicant agreed that this capability is useful but stated that it is unable to add it to the CBP platform at this time. The applicant explained that operators are expected to make written notes on paper or make an electronic log entry for procedure-related issues. The staff finds this justification acceptable because the intent of the guidance for note taking is met through other methods of capturing the information. The staff finds that NuScale computer-based procedures conform to the applicable design guidance in the ISG and NUREG-0700, and therefore this criterion is met.

Technical Support Center Design

As discussed in SER Section 18.1.4.1, the scope of the DC applicant's activities includes identifying displays and alarms for the TSC. The HFE PMP, Section 2.2.3, states that HSIs in the TSC are derivatives of the MCR HSI. The HSIs in the TSC will comply with the guidance of NUREG-0696 and with the Style Guide; these HSIs are for information display only.

DCA Part 2 Tier 2, Section 13.3, Emergency Planning, states that a TSC is provided in the plant design, and it conforms to NUREG-0696. SRP Chapter 18, Revision 3, states the following:

NUREG-0696, Functional Criteria for Emergency Response Facilities, also includes general HFE criteria for these facilities and the staff has accepted a commitment to implement these criteria as an alternative to the NUREG-0711 criteria. As a result, the staff used the NUREG-0696 criteria to review the NuScale DCA Part 2 for the TSC HSI design.

The staff reviewed DCA Part 2 Tier 2, Revision 1, Section 13.3, for the applicant's description of how the HSIs in the TSC provide personnel the information needed for analyzing the plant's steady-state and dynamic behavior before and during an accident, including environmental and radiological conditions, and communication capabilities for the purpose of understanding the accident sequence, deciding mitigation actions, and evaluating the extent of damage for recovery operations. Additionally, NUREG-0696, Section 2.9, Technical Data and Data System, states that at a minimum, the set of Type A, B, C, D, and E variables specified in RG 1.97 and the information displayed by the SPDS should be displayed in the TSC. As stated in DCA Part 2 Tier 2, Section 7.2.13.2, the NuScale design does not include Type A PAM variables.

NUREG-0696, Section 2.8, Instrumentation, Data System Equipment, and Power Supplies, states that the design of the TSC data system equipment shall incorporate HFE with consideration for both operating and maintenance personnel. NUREG-0696, Section 2.9, states that TSC displays shall be designed so that call-up, manipulation, and presentation of data can be performed easily. The TSC data display formats shall present information so that it can be easily understood by the TSC personnel.

DCA Part 2 Tier 2, Section 13.3, states that the TSC includes engineering workstations as described in DCA Part 2 Tier 2, Section 7.2.13.7, Other Information Systems. In RAI 8925, Question 13.03-3 (ADAMS Accession No. ML17206A098), the staff asked NuScale to explain the design features and characteristics of systems related to ensuring that appropriate displays and instrumentation are in place to display and receive parameters during accident conditions within the TSC. In the response to RAI 8925, Question 13.03-3 (ADAMS Accession No. ML17264B172), the applicant explained that the TSC engineering workstations, which are part of the MCS and PCS, will display the PAM variables in the TSC. The TSC engineering workstations have data recording, trending, and historical retention capabilities. The applicant clarified that the PAM variables that are on the MCR SDIS displays are also available on MCS and PCS displays. In the response to RAI 9318, Question 18-30, the applicant stated that Type E variables are only displayed on MCS and PCS displays. Thus, the staff concludes that all Type B, C, D, and E PAM variables are available on TSC engineering workstation displays.

According to DCA Part 2 Tier 2, Section 18.7.2.4.3, TSC, EOF and RSS, the HSIs in the TSC are derivatives of the MCR HSIs and comply with the HSI Style Guide.

NUREG-0711, Criterion 8.4.4.3(1), and NUREG-0696 contain criteria for reliable voice communication facilities both on and off site from the TSC. DCA Part 2 Tier 2, Section 13.3, states that the TSC is equipped with voice communications systems that provide communications between the TSC and plant, local, and offsite emergency response facilities; the NRC; and local and state operations centers.

Because HSIs in the TSC are designed using HFE design criteria contained in the applicant's Style Guide, which is based on accepted HFE principles, and the TSC contains communication equipment for voice communication between the TSC and onsite and offsite locations, the staff finds that the TSC HSI design complies with the general HFE design criteria in NUREG-0696.

Emergency Operations Facility Design

As discussed in SER Section 18.1.4.1, the scope of the DC applicant's activities includes identifying displays and alarms for the EOF. The HFE PMP, Section 2.2.3, states that the EOF HSI will comply with the guidance in NUREG-0696.

NuScale has included the EOF design as a COL item. DCA Part 2 Tier 2, Section 13.3, Emergency Planning, COL Item No. 13.3-2, states that a COL applicant that references the NuScale power plant DC will provide a description of an EOF for the management of the overall licensee emergency response that conforms with the guidance in NUREG-0696; NUREG-0737, Supplement 1; and NSIR/DPR-ISG-01, Interim Staff Guidance - Emergency Planning for Nuclear Power Plants, issued November 2011. SRP Chapter 18, Revision 3, states that NUREG-0696 includes general HFE criteria for these facilities, and the staff has accepted a commitment to implement these criteria as an alternative to the NUREG-0711 criteria.

Thus, the application includes a commitment for the COL applicant to design the EOF in accordance with NUREG-0696. Accordingly, the staff finds that the application conforms to these NUREG-0711 criteria.

Remote Shutdown Facility Design (Criteria 8.4.4.5(1)-(2))

NUREG-0711, Section 8.4.4.5, Remote Shutdown Facility, includes two criteria for the staff to use during an HFE review of the remote shutdown facility. First, the applicant should describe how the HSI provides a design capability for remote shutdown of the reactor outside the MCR, which is required by 10 CFR Part 50, Appendix A, GDC 19 (Criterion 8.4.4.5(1)). Second, the applicant should describe how the HSIs at the remote shutdown facility are consistent with those in the MCR (Criterion 8.4.4.5(2)). The staff reviewed NuScale DCA Part 2 Tier 2, Chapter 7, and the HSI Design RSR to determine how the NuScale RSS conforms to these criteria.

GDC 19 states, in part, "Equipment at appropriate locations outside the control room shall be provided: (1) with a design capability for prompt hot shutdown of the reactor, including necessary instrumentation and controls to maintain the unit in a safe condition during hot shutdown, and (2) with a potential capability for subsequent cold shutdown of the reactor through the use of suitable procedures." DCA Part 2 Tier 2, Section 7.1.1.2.3, Remote Shutdown Station, and Section 7.2.12, explain the indications and controls outside the MCR to permit remote shutdown of the units from outside the MCR. The staff evaluates the adequacy of these indications and controls to achieve safe shutdown in Section 7.1.1.4.1.2 of this SER.

DCA Part 2 Tier 2, Section 7.1.1.2.3, states that the RSS provides an alternate location to monitor the NPM status and to operate the MCS and PCS during an MCR evacuation. The RSS contains the controls necessary to monitor plant status during an immediate hot shutdown of the reactor, maintain the unit in a safe condition during hot shutdown, and perform subsequent cold shutdown of the unit. The MCS equipment in the RSS provides an independent alternative shutdown capability that is physically and electrically separate from the controls in the MCR. The RSS displays contain the process variables necessary to monitor the safe shutdown of each NPM.

The staff could not find information in the application about the number of workstations for monitoring all 12 units in the RSS, and the staff found inconsistencies between the information in DCA Part 2 Tier 2, Chapter 7, and in the HSI Design RSR about controls available in the RSS. As a result, the staff issued RAI 9401, Question 18-34 (ADAMS Accession No. ML18079A127). In the response to RAI 9401, Question 18-34 (ADAMS Accession No. ML18141A661), the applicant clarified that the RSS contains a set of MCS and PCS displays identical to the MCS and PCS displays in the MCR. The RSS provides an alternate location to monitor all 12 NPMs and control each module via the MCS and PCS. The staff finds the RAI response acceptable because it is consistent with other parts of the application and includes specific information about HSIs in the RSS. The staff finds that the applicant meets Criterion 8.4.4.5(2) because all 12 units can be monitored and controlled at the RSS and the MCS and PCS controls and displays in the RSS are identical to those in the MCR. Thus, the staff concludes that they will be consistent with the MCR HSI. RAI 9401, Question 18-34, is a Confirmatory Item pending changes to the HSI Design RSR.

In RAI 9612 (ADAMS Accession No. ML18288A257), the staff asked the applicant to provide more specific information about how the NuScale design provides a redundant means of shutting down the plant in the event that the main control room becomes uninhabitable. The staff issued this RAI on October 15, 2018. This is being tracked as Open Item 18-4: the staff will evaluate the applicant's response to RAI 9612 to determine whether it impacts the staff's evaluation of the adequacy of these indications and controls to achieve safe shutdown in Section 7.1.1.4.1.2 of this SER.

The staff cannot yet determine whether the applicant meets Criterion 8.4.4.5(1) for the remote shutdown facility HSI because it is not clear to the staff whether a shutdown can be performed at the RSS.

Local Control Station Design (Criteria 8.4.4.6(1)-(2))

NUREG-0711, Section 8.4.4.6, Local Control Stations, includes two criteria for the HSI design of an applicant's LCSs. The applicant should describe the basis for deciding which HSIs to include in the MCR and which HSIs to provide locally (Criterion 8.4.4.6(1)). The applicant should then describe how HFE is used in the HSI design of the LCSs to ensure that the HSIs for the LCSs are consistent with those in the MCR for ease of understanding (Criterion 8.4.4.6(2)).

The staff could not find in the application a basis for determining which HSIs NuScale included in the MCR and which it provided locally. Therefore, the staff issued RAI 9402, Question 18-20 (ADAMS Accession No. ML18069A001). In the response to RAI 9402, Question 18-20 (ADAMS Accession No. ML18113A641), the applicant provided a basis for this determination. The applicant stated that plant operations SMEs followed the TA methodology discussed in the TA RSR to determine which HSIs to include in the MCR and which to provide locally. The staff reviewed the methodology in the TA RSR and found that an SME assigns task attributes to each task during the TA process. The TA RSR, Section 4.3.1, Task Attributes, states that these task attributes include supporting information for the task, such as personal protective equipment, tools needed, workspace, physical position, and primary operator. During the TA phase, the SMEs also determined which displays to include in an inventory of control room HSIs based on the need for operators to perform tasks in the control room. Section 18.4 of this SER includes the staff's review of the TA methodologies as described in the TA RSR. Additionally, in the response to RAI 9402, Question 18-20, the applicant listed the plant systems that it excluded from the MCR HSI. These systems did not receive further HFE evaluation unless a subsequent HFE phase identified a need to include them. The staff finds that the applicant's response provides a basis for determining which HSIs are included in the control room; therefore, this RAI is closed.

The TIHA RSR, Section 3.3.5, states the following:

When a local control station (LCS) is required for conducting an IHA that LCS HSI is designed using the same style guide as the MCR HSIs. This ensures HSI design consistency, training efficiency, clear labeling, easy accessibility, and avoidance of hazardous locations.

Section 18.1.4.1 of this SER discusses the staff's evaluation of the applicant's graded approach to applying HFE to the design of LCSs. As such, the applicant's plan to design LCSs used for IHAs according to the guidelines in the Style Guide is acceptable because the guidelines will help to ensure that errors are minimized during the performance of operator actions at these LCSs. Therefore, the staff concludes that the application conforms to Criteria 8.4.4.6(1)-(2).

Degraded Instrumentation and Control and Human-System Interface Conditions (Criteria 8.4.5(1)-(4))

NUREG-0711, Section 8.4.5, Degraded I&C and HSI Conditions, includes four criteria for this topic. The staff evaluates how NuScale met each of the criteria in the sections below.

18.7.4.10.1 Automation Failures and Instrumentation and Control Degradations

NUREG-0711, Criterion 8.4.5(1), states that the applicant should identify the effects of automation failures on personnel and plant performance and HFE-significant I&C degradations that might adversely affect HSIs used to accomplish IHAs.

NuScale identified the effects of degraded conditions and automation failures of the nonsafety control systems on plant performance in DCA Part 2 Tier 2, Table 7.7-1, Control Groups for the NSSS Control Functions, and the control system common-cause failure analysis.

The applicant identified the effects of degraded conditions and automation failures of the safety-related systems by performing a failure modes and effects analysis documented in DCA Part 2 Tier 2, Section 7.1.1.2, Additional Design Considerations (for the MPS and neutron monitoring system).

Additionally, the applicant identified the effects on plant performance of a common-cause failure of the safety-related and nonsafety-related digital I&C systems during transients, abnormal operating occurrences, and accident conditions, as described in DCA Part 2 Tier 2, Section 7.1.5, Diversity and Defense-in-Depth. The applicant also performed a hazards analysis of the NuScale I&C systems for the neutron monitoring system, MPS, PPS, and SDIS, as described in DCA Part 2 Tier 2, Section 7.1.8, Hazards Analysis. The hazard analysis included internal and external hazards. Chapter 7 of this SER documents the staffs evaluation of these analyses.

The V&V IP, Section 2.1, Sampling Dimensions, describes the types of operational conditions that the ISV will sample, with an emphasis on I&C and HSI failures and degraded conditions, because of the increased use of digital technology in the NuScale MCR. Additionally, all IHAs will be tested during the ISV.

The staff also reviewed 12 ISV scenario basis documents, made available in the NuScale electronic reading room, during an audit from July 25, 2017, through February 14, 2018 (ADAMS Accession No. ML18135A049). NuScale explained that these documents would be used to develop the detailed scenario guides. The staff observed that the scenario basis documents identified the sampling dimensions associated with each event included in the scenarios and that the total SOC was commensurate with the conditions listed in Criteria 11.4.1.1(1)-(3), including I&C and HSI failures and degraded conditions. As part of a June 2018 audit (ADAMS Accession No. ML18208A370), the staff reviewed a sample of the finalized scenario guides and observed that they contain I&C and HSI failures and degraded conditions.

By including these types of degraded I&C conditions and automation failures in the ISV scenarios and collecting performance measurements described in the V&V IP, Section 4.5, Performance Measurement, the applicant will identify effects on personnel performance. If satisfactory personnel performance, as described in the V&V IP, Section 4.7, Data Analysis and HED Identification, cannot be demonstrated during the ISV scenarios, the applicant will document the issue as an HED and resolve the HED in accordance with the V&V IP, Section 5.0, HED Resolution.

Therefore, the staff finds that the applicant has identified the effects of automation failures and degraded conditions, including HFE-significant I&C degradations, on plant performance. Also, the applicant will identify the effects of automation and degraded I&C and HSI conditions on personnel performance by including a sample of these conditions in the ISV. Accordingly, the staff finds that the application conforms to this NUREG-0711 criterion.

18.7.4.10.2 Alarms and Notifications

NUREG-0711, Criterion 8.4.5(2), states that the applicant should specify the alarms and information that personnel need to detect degraded I&C and HSI conditions in a timely manner and to identify their extent and significance.

DCA Part 2 Tier 2, Section 7.2.15.3, Fault detection and self-diagnostics, describes the fault detection and alarming functions of the MPS, which includes an indication to operators of the impact of the failure to help determine the overall status of the system. DCA Part 2 Tier 2, Section 7.2.13.2, describes the fault detection and alarming functions of the SDIS.

DCA Part 2 Tier 2, Section 7.2.13.5, states that an independent monitoring system monitors the mutual status of the MCS and PCS to detect and alert the operator to a loss of the overall I&C system. The HSI Design RSR, Section 4.7.1.1, Common Cause Software Failures, states that alarms notify MCR operators upon failure of the PCS or MCS (or both). The HSI Design RSR, Section 4.7, Degraded I&C and HSI Conditions, states that failures of automation sequences are alarmed in the MCR. These indications allow the operators to verify that the HSI at their workstations is capable of communicating information. Because the applicant specified the alarms and information that personnel need to detect degraded I&C and HSI conditions in a timely manner, the staff finds that the application conforms to this NUREG-0711 criterion.
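The mutual-status monitoring concept described above can be illustrated with a minimal heartbeat-style watchdog. This is a generic sketch only, not the NuScale implementation: the system names, the timeout value, and the `StatusWatchdog` class are all invented for illustration. It shows the idea that an independent monitor tracks whether each control system keeps reporting and flags any system that falls silent so an alarm can be raised.

```python
import time

class StatusWatchdog:
    """Flag any monitored system whose periodic heartbeat is overdue."""

    def __init__(self, systems, timeout_s):
        self.timeout_s = timeout_s
        now = time.monotonic()
        # Treat startup time as the last heartbeat for every system.
        self.last_seen = {name: now for name in systems}

    def heartbeat(self, name):
        # Called by each monitored system while it is healthy.
        self.last_seen[name] = time.monotonic()

    def check(self):
        # Return the systems whose heartbeat is overdue; a real design
        # would annunciate these to the operators in the control room.
        now = time.monotonic()
        return [name for name, t in self.last_seen.items()
                if now - t > self.timeout_s]

# Hypothetical usage: "PCS" stops reporting while "MCS" stays healthy.
watchdog = StatusWatchdog(["MCS", "PCS"], timeout_s=0.05)
time.sleep(0.1)
watchdog.heartbeat("MCS")   # MCS reports in; PCS does not
print(watchdog.check())     # -> ['PCS']
```

The design choice worth noting is that the watchdog is independent of both monitored systems, so a common failure of the MCS and PCS still produces an alert.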

18.7.4.10.3 Backup Systems and Compensatory Actions

NUREG-0711, Criterion 8.4.5(3), states that the applicant should determine whether backup systems are necessary to ensure that important personnel tasks can be completed under degraded I&C and HSI conditions. NUREG-0711, Criterion 8.4.5(4), states that the applicant should determine the necessary compensatory actions and supporting procedures to ensure that personnel effectively manage degraded I&C and HSI conditions and transition to backup systems.

The applicant accounted for I&C failures and degradations in the diversity and defense-in-depth coping analysis in DCA Part 2 Tier 2, Chapter 7. The staff evaluates this coping analysis in Chapter 7 of the SER. In the HSI Design RSR, Section 4.7, the applicant stated that failures of the MCR HSI VDUs are accommodated by the use of other VDUs at the same workstation, or by use of another workstation or the standup unit workstations. Failures of hardware that lead to a loss of all VDUs at a workstation are accommodated by redundant VDUs in the RSS.

NuScale procedures govern operator response to the various I&C and HSI failure modes.

Because the applicant has determined backup systems and compensatory actions for degraded I&C and HSI conditions, the staff finds that the application conforms to this NUREG-0711 criterion.

Human-System Interface Tests and Evaluations (Criteria 8.4.6(1)-(2))

NUREG-0711, Section 8.4.6, HSI Tests and Evaluations, has two criteria:

(1) Tradeoff evaluations are comparisons between design options, based on aspects of human performance that are important to successful task performance and to other design considerations.

(2) Performance-based tests involve assessing personnel performance, including subjective opinions, to evaluate design options and design acceptability.

The staff reviewed the HSI Design RSR, Section 3.7.2, Testing and Evaluation of Design, and found that the applicant incorporated tradeoff evaluations and assessments of personnel performance across all stages of HSI design. In DCA Part 2 Tier 2, Section 18.7.3, Results, the applicant stated that HSI test and evaluation activities were part of the HSI design analysis and include HSI inventory and characterization, HSI task support verification, and HSI design verification. The staff reviews the applicant's methodology for task support verification and design verification in Section 18.10 of this SER. NuScale performed an SPV that meets the performance-based test description in this criterion. The applicant's SPV Results TR details the general approach to testing, including information about the participants, testbed, scenarios, performance measures, test procedures, data analyses, and conclusions from the SPV.

The staff finds that NuScale conducted both tradeoff evaluations and performance-based tests during iterative HSI design stages and thus meets these criteria in NUREG-0711.

NUREG/CR-7202 Topics

NUREG/CR-7202 identifies potential human performance issues for the staff to consider when reviewing an SMR applicant's HFE design process. The staff evaluated how NuScale selected and applied HFE guidelines to address the following potential human performance issues that are applicable to the NuScale plant:

  • Multiple modules are operated by one crew of six personnel in one control room. One design feature is that an operator has the ability to operate any one of the 12 modules from his or her sitdown operator console. Therefore, the HSI design should help to minimize the possibility that actions intended to be taken for one unit are taken on a different unit (i.e., wrong unit errors) (see NUREG/CR-7202, Section 2.17, HSI Design for Multi-unit Monitoring and Control). The design of the sitdown operator workstations in the NuScale MCR currently allows for operation of any module at any workstation.

The staff reviewed the ConOps, which states that the six sitdown operator workstations in the MCR provide each control room operator with access to displays and software controls located on the PCS and MCS networks for oversight and plant control activities.

The HSI Design RSR, Section 4.4, HSI Concept of Use, includes details about the HSI design that support the oversight of plant operations and minimize operator errors. The staff reviewed this section and found that the MCR workstation HSI design includes consistent and clear schemes for unit labeling on display pages used for monitoring and control. Thus, the staff finds that the NuScale design satisfies this criterion.

  • If multiple alarms are received at once for more than one unit, the HSI should help operators identify the high-priority alarms to determine what actions, if any, must be taken (NUREG/CR-7202, Section 2.17). Using this criterion, the staff reviewed the HSI Design RSR, Section 4.4.5, Plant Notifications, and finds that the NuScale plant notification system is designed with multiple features that allow operators to identify high-priority alarms and determine how to respond. During the July-August 2018 ISV audit, the staff observed alarm prioritization and did not observe any cascading alarm conditions that impacted operator performance. Thus, the staff finds that the NuScale HSI design satisfies this criterion.
  • NuScale has increased the role of automation relative to U.S. operating reactors in order to keep workload within acceptable limits and therefore minimize human performance errors. The HSI should enable the operators to determine whether the automation is functioning properly (see NUREG/CR-7202, Section 2.4, High Levels of Automation for All Operations and its Implementation). Furthermore, the HSI design should allow operators to detect when automation is degraded (see NUREG/CR-7202, Section 2.11, Operational Impact of Control Systems for Shared Aspects of SMRs). The staff reviewed the HSI Design RSR to determine how HSI features allow the operators to determine proper automation and how operators are made aware of automation degradation. The HSI Design RSR, Section 4.7, states that automation failures are alarmed in the MCR. Operators are also expected to monitor most automated sequences and the subsequent plant response, and to detect automation failures. The staff reviewed NuScale's ConOps and found that operators are expected to either directly monitor automation while performing a sequence or rely on limits incorporated with the automation. Within the automation is a feature that terminates, pauses, or alerts the operator to the condition if process parameters reach specific limits. During the July-August 2018 ISV audit, the staff observed (scripted) automation failures and found that the HSI design enables operators to determine whether the automation is functioning properly; thus, this criterion is met.
  • Modules may be in different states of operation. The HSI design should ensure that operators can maintain awareness of each unit's state of operation (see NUREG/CR-7202, Section 2.9, Different Unit States of Operation). The staff reviewed the example displays in the HSI Design RSR, Section 7.0, that show information about each unit's operating status on a one-page display. Additionally, the individual unit overview displays constantly show information about each unit's state of operation. The staff observed this feature during the July-August 2018 ISV audit and found clear indication of unit status available in the main control room. Accordingly, the staff finds that the NuScale HSI design allows the operator to maintain awareness of the operating state of each unit.
  • Unit design differences may exist over time as new units are added. The HSI should indicate the difference to the operator (see NUREG/CR-7202, Section 2.10, Unit Design Differences). At the time of this DCA, the staff finds that there are no identified unit differences; thus, this criterion is not applicable.

  • Certain HSIs or control stations are needed for refueling. These should be integrated into the overall control room design and concept of operations similar to other control room HSI designs (see NUREG/CR-7202, Section 2.15, Novel Refueling Methods).

The NuScale control room design does not include HSIs dedicated to refueling. The scope of NuScale's ConOps, HSI Design RSR, and HSI Style Guide does not include refueling control station design. Because of the risk significance of the reactor building crane, and because the applicant identified that errors in its operation can significantly raise core damage frequency, the staff wanted to understand whether HFE guidelines have been or will be applied to the HSIs used during module movement to help prevent significant operator errors. Therefore, the staff issued RAI 9360, Question 18-42 (ADAMS Accession No. ML18180A359). In the response to RAI 9360, Question 18-42 (ADAMS Accession No. ML18172A227), the applicant stated the following:

The LCS HSI used for module movement are vendor-supplied. The HFE design for these controls will be developed by the vendor because the controls must reflect the specialized nature of crane operation. The NuScale HFE design team is working with engineering to develop procurement specifications that characterize the crane control function requirements.

Implementation of the Style Guide standards will be included in the purchase specification to establish as much consistency with NuScale HFE design as possible but on a not to interfere basis with establishing the safety and control standards required by crane design. Since this effort is at an early stage of development and beyond the scope of the current MCR verification and validation (V&V) process, specific details on the scope of HFE related direction in the procurement specification cannot be addressed at this time.

Thus, the staff concludes that the applicant intends to integrate refueling controls and information in the MCR with the existing MCR HSI design through application of the HSI Style Guide, making them similar to the other HSIs in the MCR. The staff acknowledges that the reactor building crane design has not yet been completed; therefore, detailed information about its HSI design does not exist at this time, nor will it exist when the applicant completes the V&V activities. As discussed in Section 18.6.4.1 of this SER, the staff conducting the review of DCA Part 2 Tier 2, Chapter 19, has identified issues that may or may not change the risk-important human actions and issued RAI 9128, Question 19-37 (ADAMS Accession No. ML17340A626), to resolve these issues. The issues addressed by RAI 9128 are not yet resolved, and the staff is tracking resolution of RAI 9128 to determine whether there are any human actions related to refueling that may be significant enough to be considered IHAs. As discussed in more detail in Section 18.11.4.5.2 of this SER, the staff issued RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), in part to request that the applicant clarify how any IHAs identified after completion of the V&V activities will be addressed during the design implementation activities to be performed by a licensee prior to fuel load. Resolution of RAI 9415 is Open Item 18-22.

  • The HSI should allow operators to detect and monitor the unplanned shutdowns or degraded conditions of one unit while monitoring multiple units (see NUREG/CR-7202, Section 2.20, Potential Impacts of Unplanned Shutdowns or Degraded Conditions of One Unit on Other Units). As observed by the staff during the audit of the SPV and the ISV audit, when there is an unplanned trip of a unit, specific shutdown-related displays appear for that unit, along with an audible alert. The staff reviewed the HSI Design RSR, Section 4.4.5.13, Safety Function Monitoring Page, and found that it contains information that alerts operators to unit-specific degradations in safety functions and defense-in-depth capabilities.

During the ISV audit, the staff observed that the applicants HSI design effectively provides operators information about unplanned shutdowns and degraded conditions. Thus, the staff finds that this criterion is met.

  • The HSI should provide operators with indications to allow them to detect and handle off-normal conditions in multiple units (see NUREG/CR-7202, Section 2.21, Handling Off-normal Conditions at Multiple Units). The staff finds that the integration of multiple NuScale HSI design features, such as the notification system, embedded procedures, and the workstation MCS and PCS displays, allows operators to detect and handle off-normal conditions in multiple units. During the SPV and ISV audits, the staff observed that operators could successfully meet specific performance criteria when faced with off-normal conditions at multiple units. Thus, the staff finds that this criterion is met.
  • The HSI should provide operators with indications to allow them to monitor the status and verify the success of passive systems (see NUREG/CR-7202, Section 2.24, Passive Safety Systems). The staff reviewed sample HSI displays in the HSI Design RSR to determine whether operators have indications for the status and success of passive systems. The staff found that several display pages in the MCR include status information and information about the success or failure of a safety system actuation. Because the MCR HSI has several aspects that help operators determine the status and success of passive systems, the staff finds that this criterion is met.
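The automation limit-checking behavior discussed in the bullets above, in which reaching a process-parameter limit terminates or pauses an automated sequence or alerts the operator, can be sketched generically. This is an illustrative sketch only, not the NuScale implementation: the parameter names, limit values, and the `evaluate_limits` helper are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Limit:
    name: str        # hypothetical process parameter
    high: float      # limit value (invented for illustration)
    action: str      # "terminate", "pause", or "alert"

def evaluate_limits(readings, limits):
    """Return the actions an automated sequence should take this cycle."""
    actions = []
    for limit in limits:
        # If a parameter reaches its limit, record the configured response.
        if readings.get(limit.name, 0.0) >= limit.high:
            actions.append((limit.name, limit.action))
    return actions

# Hypothetical limits and readings; one parameter has reached its limit.
limits = [
    Limit("pressurizer_level_pct", 80.0, "pause"),
    Limit("rcs_pressure_psia", 2000.0, "terminate"),
]
readings = {"pressurizer_level_pct": 82.5, "rcs_pressure_psia": 1850.0}
print(evaluate_limits(readings, limits))
# -> [('pressurizer_level_pct', 'pause')]
```

In this sketch the limit check runs every cycle of the automation, which matches the concept of operators relying on limits incorporated with the automation rather than continuously monitoring every sequence themselves.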

Combined License Information Items

No COL information items are associated with Section 18.7 of the NuScale DCA Part 2.

Conclusion

The staff evaluated the applicant's HSI design against the criteria in NUREG-0711, Section 8.4.

This section of the review has two open items (Open Item 18-4 and Open Item 18-22). More information is needed regarding the HSI at the RSS for remote shutdown purposes to verify that the applicant conforms to Criterion 8.4.4.5(1). Updates to the Style Guide are necessary to support the design verification acceptance criteria. The staff will review the updated Style Guide after the applicant submits the V&V RSR to verify that it conforms to Criterion 8.4.3(3).

Because of these open items, the staff cannot reach a conclusion on the HSI design at this point in the review.

18.8 Procedure Development

NUREG-0711 includes procedure development because HFE attributes are associated with the procedures. However, because procedure development is an operating program, the staff reviews it and documents its conclusions in Section 13.5 of this SER.

18.9 Training Program Development

NUREG-0711 includes training program development because of the interfaces between the HFE design, procedures, and training. However, because training is an operating program, the staff reviews training program development and documents its conclusions in Section 13.2 of this SER.

18.10 Verification and Validation

Introduction

The staff used the criteria in NUREG-0711, Section 11.4, Review Criteria, to evaluate the applicant's V&V IP, which defines the methodologies that will be used for the various activities associated with V&V. The applicant plans to submit a V&V results summary report (RSR) before Phase 4 of the DCA review. For those criteria that address methodology and for which the V&V IP includes adequate information, the staff is able to review the planned method and determine whether it meets the applicable criterion. However, for those criteria that address results, the staff will determine upon receipt of the V&V RSR whether the criteria are met.

Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a Tier 2 description of this HFE element in DCA Part 2 Tier 2, Section 18.10, Human Factors Verification and Validation (ADAMS Accession No. ML17013A289).

ITAAC: There are no ITAAC associated with this element.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this report.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)

  • NUREG-0711, Revision 3, Human Factors Engineering Program Review Model, Chapter 11, Human Factors Verification and Validation, Section 11.4, Review Criteria

The following documents also provide additional guidance in support of the SRP acceptance criteria to meet the above requirements:

  • NUREG/CR-6393, Integrated System Validation: Methodology and Review Criteria, issued 1995

Technical Evaluation

Sampling of Operational Conditions

18.10.4.1.1 Sampling Dimensions (Criteria 11.4.1.1(1)-(3))

NUREG-0711, Section 11.4.1.1, Sampling Dimensions, includes three criteria. These criteria list the plant conditions, personnel tasks, and situational factors/error-forcing contexts that should be included in the SOC used by the applicant to combine and identify a set of V&V scenarios to guide subsequent analyses.

The V&V IP, Section 2.1, states the following:

A range of plant conditions, personnel tasks, and situational factors is considered within the sampling dimensions included in Section 11.4.1 of Human Factors Engineering Review Model, NUREG-0711, Rev. 3 (Reference 8.1.1) as applicable to the NuScale design.

The staff also reviewed all of the ISV scenario basis documents during an audit from July 25, 2017, through February 14, 2018 (ADAMS Accession No. ML18135A049). NuScale explained that these documents would be used to develop the detailed scenario guides. The staff observed that the scenario basis documents identified the sampling dimensions associated with each event included in the scenarios, and the total SOC was consistent with those conditions listed in NUREG-0711, Section 11.4.1.1. As part of an audit in June 2018 (ADAMS Accession No. ML18208A370), the staff reviewed a sample of the finalized scenario guides and observed that they conform to the criteria in NUREG-0711, Section 11.4.1.1.

18.10.4.1.2 Identification of Scenarios

NUREG-0711, Section 11.4.1.2, Identification of Scenarios, includes two criteria that state that the applicant (1) should combine the results of the sampling to identify a set of V&V scenarios to guide subsequent analyses and (2) should not bias the scenarios by overly representing those in which only positive outcomes are expected, those that are relatively easy to conduct, and those that are familiar and well structured.

The V&V IP, Section 2.2, Identification of Scenarios, states that members of the NuScale HFE Team develop the ISV scenarios using multiple sampling dimensions to accomplish the goals and set the conditions to be included in each scenario based on the SOC. The V&V IP, Section 2.2, also states the following:


Biases for individual dimensions are possible, but collectively, the scenarios avoid bias by representing scenarios that:

  • Have both positive and negative outcomes.
  • Require varying degrees of administrative burden to run (test bed set-up, instructor input).
  • Minimize the use of well-known and well-structured sequences (i.e., textbook design-basis accident mitigation).

As part of an audit in June 2018 (ADAMS Accession No. ML18208A370), the staff reviewed a sample of the finalized scenario guides and observed that the scenarios conform to Criteria 11.4.1.2(1)-(2).

18.10.4.1.3 Scenario Definition (Criteria 11.4.1.3(1)-(3))

NUREG-0711, Section 11.4.1.3, Scenario Definition, includes three criteria.

The staff reviewed the V&V IP, Section 2.3, Scenario Definition, which describes the development of the applicant's V&V scenarios and the information included for each scenario.

The staff compared this information to Criterion 11.4.1.3(1), which lists the information that should be specified for each V&V scenario, and found the lists to be consistent. As part of an audit in June 2018 (ADAMS Accession No. ML18208A370), the staff reviewed a sample of the finalized scenario guides and observed that the scenario guides specifically define the information in Criterion 11.4.1.3(1), with the exception of a precise definition of workplace factors and staffing levels. The applicant explained that these factors are specified only when they are other than nominal. The applicant identified the nominal workplace factors and staffing level in the ConOps. The staff finds that the application conforms to Criterion 11.4.1.3(1).

Criteria 11.4.1.3(2)-(3) state that the applicant's scenarios should (1) realistically replicate operator tasks in the tests and (2) realistically simulate the effects of potentially harsh environments on personnel performance when the applicant's scenarios include work associated with operations remote from the MCR. The V&V IP, Section 2.3, states that "The ISV scenarios are developed to be representative of the range of events that could be encountered during the plant's operation, determined by SOC as described in Section 2.1." The V&V IP, Section 2.3, also states the following:

Tasks performed by operators remote from the MCR are modeled in the ISV scenario to realistically simulate effects on personnel performance due to potentially harsh environments. Effects such as additional time to don protective clothing, set up of radiological access control areas, and employment of damage control, emergency, or temporary equipment are described in scenarios by use of time constraints/additions.

As part of an audit in June 2018 (ADAMS Accession No. ML18208A370), the staff reviewed a sample of the finalized scenario guides and observed that they conform to Criteria 11.4.1.3(2)-(3).

Design Verification

This section has several criteria that address either the methodology used by the applicant for design verification activities or the results of those activities. For those criteria that address methodology and for which the V&V IP includes adequate information, the staff is able to review the planned method and determine whether it meets the applicable criterion. However, for those criteria that address results, the staff will determine upon receipt of the V&V RSR whether the criteria are met.

18.10.4.2.1 Human-System Interface Inventory and Characterization (Criteria 11.4.2.1(1)-(3))

NUREG-0711, Section 11.4.2.1, HSI Inventory and Characterization, includes three criteria for this topic.

18.10.4.2.1.1 Scope and Inventory (Criteria 11.4.2.1(1) and (3))

Criterion 11.4.2.1(1) states that the applicant should develop an inventory of all HSIs that personnel require to complete tasks identified in its SOC, including aspects of the HSI used for managing the interface and those that control the plant. Criterion 11.4.2.1(3) states that the applicant should verify the inventory description of HSIs to ensure that it accurately reflects their current state.

The V&V IP, Section 3.1.1, Human-System Interface Inventory, describes how an inventory is generated during TA. The TA results define the inventory and characterization for alarms, controls, indications, and procedures needed to execute all operator tasks. This inventory is then compared to the HSIs needed for the tasks included in the applicant's SOC, which are a subset of all operator tasks. The staff finds that this scope conforms to Criterion 11.4.2.1(1).

Additionally, Appendix A to the applicant's HSI Design RSR states the following:

Table A-1 shows the form used by the HFE design team to perform preliminary Inventory and Characterization testing. The purpose of this effort was to formalize a process for the testing and verification of the HSI inventory. Only the elements needed to successfully complete the SPV testing were evaluated.

The same form and process will be followed during ISV testing on every element on the page.

The staff understands that the applicant plans to update and verify the HSI inventory before the ISV, which will ensure that the inventory reflects the HSIs to be used during ISV testing (i.e., the current state). Thus, the staff finds that this methodology conforms to Criterion 11.4.2.1(3).

18.10.4.2.1.2 Human-System Interface Characterization (Criterion 11.4.2.1(2))

The staff reviewed the V&V IP, Section 3.1.2, in which the applicant listed the minimum set of information provided for HSI characterization. The staff compared the list in the V&V IP, Section 3.1.2, to the list in Criterion 11.4.2.1(2) and found the lists to be consistent. In addition, the staff reviewed the HSI Design RSR, Section 7.0, HSI Design Page Examples, which contains samples of the HSI inventory and characterization. The samples are consistent with the description of the characterization in the V&V IP, Section 3.1.2. Thus, the staff finds that this methodology conforms to Criterion 11.4.2.1(2).

18.10.4.2.2 Human-System Interface Task Support Verification (Criteria 11.4.2.2(1)-(5))

NUREG-0711, Section 11.4.2.2, HSI Task Support Verification, includes five criteria for this topic. The fifth criterion addresses plant modifications and is not applicable to new reactors; thus, the staff evaluated only the first four criteria, as discussed below.

18.10.4.2.2.1 Verification Criteria (Criterion 11.4.2.2(1))

The staff reviewed DCA Part 2 Tier 2, Section 18.10.2.2.2, Design Verification, which describes the applicant's HSI task support verification criteria. The staff compared DCA Part 2 Tier 2, Section 18.10.2.2.2, to Criterion 11.4.2.2(1), which states that the HSI task support verification criteria should be based on the HSI identified by the applicant's TA. DCA Part 2 Tier 2, Section 18.10.2.2.2, states that the verification criteria are based on the TA results that define the inventory and characterization for the alarms, controls, indications, procedures, automation, and task support needed to execute operator tasks, including manual tasks, automation support tasks, and automation monitoring tasks.

In May 2017, the staff audited the TA results (ADAMS Accession No. ML17181A415) and confirmed that the TA defines the HSI (e.g., alarms, controls, displays) necessary for personnel to complete their tasks. The staff therefore concludes that the HSI task support verification criteria are based on the HSIs identified in the TA and finds that this methodology conforms to Criterion 11.4.2.2(1).

18.10.4.2.2.2 Methodology (Criterion 11.4.2.2(2))

Criterion 11.4.2.2(2) states that the applicant should compare the HSIs and their characteristics to the needs of personnel identified in the TA for the defined sampling of operational conditions.

The V&V IP, Section 3.2.2, HSI Task Support Evaluation Methodology, states that the HFE design team conducts task support verification using the personnel task requirements identified by the most recent TA and compares them with the HSI inventory and characterization. The staff also reviewed the applicant's HSI Design Verification Test Plan during an audit from July 25, 2017, through February 14, 2018 (ADAMS Accession No. ML18135A049). The HSI Design Verification Test Plan provides detailed methodological information on the verification process and the products that result from that process, which helped the staff to understand how the comparison between the TA and the HSI inventory and characterization was carried out.

The staff found that the methodology is acceptable and that the application conforms to Criterion 11.4.2.2(2).
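As a rough illustration of this kind of comparison, the sketch below matches TA-derived task requirements against an HSI inventory and flags mismatches as candidate HEDs. All task and HSI identifiers are hypothetical; the actual process works from the TA database and the full HSI characterization, not simple identifier sets.

```python
# Minimal sketch (assumed data, not NuScale records): compare the HSIs
# each task requires (from task analysis) against the HSI inventory.

def task_support_verification(task_requirements, hsi_inventory):
    """Return candidate human engineering discrepancies (HEDs).

    task_requirements: dict mapping task id -> set of required HSI ids
    hsi_inventory: set of HSI ids present in the design
    """
    needed = set().union(*task_requirements.values()) if task_requirements else set()
    heds = []
    # An HSI needed for a task is not identified or not available.
    for task, hsis in task_requirements.items():
        for hsi in sorted(hsis - hsi_inventory):
            heds.append(f"HED: {hsi} needed for task '{task}' but not available")
    # An HSI is available but is not needed for any task.
    for hsi in sorted(hsi_inventory - needed):
        heds.append(f"HED: {hsi} available but not needed for any task")
    return heds

tasks = {"isolate-cvcs": {"valve-ctrl-101", "flow-ind-7"},
         "trip-module": {"trip-switch-1"}}
inventory = {"valve-ctrl-101", "trip-switch-1", "spare-ind-9"}
for hed in task_support_verification(tasks, inventory):
    print(hed)
```

In this toy example the comparison would flag two discrepancies: a required indication missing from the inventory and an inventoried item no task needs.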

18.10.4.2.2.3 Human Engineering Discrepancy Identification and Documentation (Criteria 11.4.2.2(3)-(4))

Criteria 11.4.2.2(3)-(4) state that an HED should be identified and documented as specified by the criteria.

In DCA Part 2 Tier 2, Section 18.10.2.2.2, the applicant stated that an HED is written when (1) an HSI is needed for completion of a task and is not identified or not available, (2) an HSI is identified as available but is not needed for any task, and (3) an HSI does not meet the established requirements for the task. In addition, the V&V IP, Section 5.0, states that HEDs are identified and documented in the HFEITS. The staff issued RAI 9394, Question 18-16 (ADAMS Accession No. ML18068A733), requesting that the applicant clarify the specific information documented in the database for each HED. In particular, the staff asked whether the database captures the HSI, the tasks affected, and the basis for the deficiency (i.e., the aspect of the HSI identified as not meeting task requirements). The applicant's response to RAI 9394, Question 18-16 (ADAMS Accession No. ML18123A540), contained detailed information on NuScale's procedural guidance for tracking HFE issues, including how HEDs are categorized and the specific information captured for each issue. The applicant will document the information identified in Criteria 11.4.2.2(3)-(4) for HEDs. RAI 9394, Question 18-16, is a Confirmatory Item.

Open Item 18-5: The staff will confirm that HEDs identified during task support verification are documented consistent with Criteria 11.4.2.2(3)-(4) when the applicant submits the V&V RSR.

18.10.4.2.3 Human Factors Engineering Design Verification (Criteria 11.4.2.3(1)-(5))

NUREG-0711, Section 11.4.2.3, HFE Design Verification, includes five criteria for this topic. The fifth criterion addresses plant modifications and is not applicable to new reactors; thus, the staff evaluated the first four criteria, as discussed below.

18.10.4.2.3.1 Verification (Criterion 11.4.2.3(1))

Criterion 11.4.2.3(1) states that the applicant should base the criteria used for HFE design verification on HFE guidelines. The V&V IP, Section 3.3.1, Verification Criteria, states that the criteria for HFE design verification are provided by the HSI Style Guide. The Style Guide contains the applicant's HFE guidelines and procedural guidance for determining appropriate steps when a deviation from the Style Guide is necessary. Therefore, the staff finds that the application conforms to Criterion 11.4.2.3(1).

18.10.4.2.3.2 Methodology (Criterion 11.4.2.3(2))

Criterion 11.4.2.3(2) states that the applicants HFE design verification methodology should include procedures for (1) comparing the characteristics of the HSIs with HFE guidelines for the SOC and the general environment in which HSIs are sited, (2) determining for each guideline whether the HSI is acceptable or discrepant, and (3) evaluating whether an HED is a potential indicator of additional issues.

In the V&V IP, Sections 3.3.2, Design Verification Evaluation Methodology, and 5.2, Human Engineering Discrepancy Analysis, the applicant cited the following as the means to compare the characteristics of HSIs with HFE guidelines:

  • checklists and guidelines for comparing the HFE design criteria (Style Guide) to HSI components (e.g., alarms, controls, indications, procedures, navigation aids)
  • a description of the means of comparing HFE design criteria to HSI components in the context of the various environmental conditions or locations of those HSIs (e.g., noise, lighting, ambient temperature and humidity)

Therefore, the staff concludes that the applicant's method includes procedures for comparing the characteristics of the HSIs with the HFE guidelines for the SOC and the general environment in which HSIs are sited.

The V&V IP, Section 3.3.2, states that the HSI design verification methodology includes guidelines for determining whether the HSI is acceptable or discrepant based on the associated HFE design criteria. Specifically, HSIs that do not completely meet the HFE design criteria result in identification of a design verification HED. Therefore, the staff concludes that the applicant's method includes a procedure for determining whether an HSI is acceptable or discrepant.

The V&V IP, Section 5.2, describes the procedure for assessing whether a generated HED is an indicator of additional issues. Specifically, the applicant will use the following methods to assess the extent of condition and causal effects across HSI design features to determine whether an HED is an indicator of additional issues:

  • [[
  • ]]

Therefore, the staff concludes that the applicant's methodology includes procedures for evaluating whether an HED is a potential indicator of additional issues. The staff finds that the application conforms to Criterion 11.4.2.3(2).
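The guideline-by-guideline comparison this kind of design verification performs can be sketched as a checklist pass over each HSI's characteristics. The guideline identifiers, predicates, and HSI attributes below are illustrative assumptions, not content from the (proprietary) NuScale Style Guide.

```python
# Hypothetical sketch of an HFE design verification checklist pass:
# each HSI's characteristics are evaluated against each guideline,
# and any HSI that does not completely meet a guideline is recorded
# as a design verification HED (HSI involved + departure).

def design_verification(hsi_characteristics, guidelines):
    """Return a list of HED records for discrepant HSI characteristics.

    hsi_characteristics: dict mapping HSI id -> dict of characteristics
    guidelines: dict mapping guideline id -> predicate over characteristics
    """
    heds = []
    for hsi, chars in hsi_characteristics.items():
        for gid, check in guidelines.items():
            if not check(chars):
                heds.append({"hsi": hsi, "guideline": gid,
                             "departure": f"{hsi} does not meet {gid}"})
    return heds

# Illustrative guidelines (made-up thresholds, not Style Guide values).
guidelines = {
    "SG-4.1-label-height": lambda c: c.get("label_height_mm", 0) >= 3.0,
    "SG-6.2-alarm-color": lambda c: c.get("alarm_color") in {"red", "amber"},
}
hsis = {"alarm-tile-12": {"label_height_mm": 2.5, "alarm_color": "red"}}
for hed in design_verification(hsis, guidelines):
    print(hed["departure"])
```

Each HED record carries the HSI involved and how its characteristics depart from a particular guideline, mirroring the documentation expectations of Criteria 11.4.2.3(3)-(4).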

18.10.4.2.3.3 Human Engineering Discrepancies (Criteria 11.4.2.3(3)-(4))

Criteria 11.4.2.3(3)-(4) state that the applicant should identify an HED when a characteristic of the HSI is discrepant from a guideline and document HEDs in terms of the HSI involved and how its characteristics depart from a particular guideline.

The V&V IP, Section 3.3.2, states that design verification HEDs are generated for HSIs that do not completely meet the HFE design criteria (as specified by the HSI Style Guide). In addition, the V&V IP, Section 5.0, states that HEDs are identified and documented in the HFEITS. The staff issued RAI 9394, Question 18-16 (ADAMS Accession No. ML18068A733), requesting that the applicant clarify the specific information captured in the database for each HED. The applicant's response to RAI 9394, Question 18-16 (ADAMS Accession No. ML18123A540), gave detailed information on NuScale's procedural guidance for tracking HFE issues, including how HEDs are categorized and the specific information captured for each issue. The staff found that the applicant will document the information identified in Criteria 11.4.2.3(3)-(4) for HEDs.

RAI 9394, Question 18-16 is a Confirmatory Item.

Open Item 18-6: The staff will confirm that HEDs identified during design verification are documented consistent with Criteria 11.4.2.3(3)-(4) when the applicant submits the V&V RSR.

18.10.4.3 Integrated Systems Validation

The objective of the ISV review is to verify that the applicant validated, using performance-based tests, that the integrated system design (i.e., hardware, software, procedures, and personnel elements) supports the safe operation of the plant.

NUREG-0711 has several criteria for this part of the review.

18.10.4.3.1 Validation Team (Criterion 11.4.3.1(1))

NUREG-0711, Section 11.4.3.1, Validation Team, states that the applicant should describe how the team performing the validation has independence from the personnel responsible for the actual design. Criterion 11.4.3.1(1) also states that the members of the validation team should have never been part of the design team. The main intent of Criterion 11.4.3.1(1) is to ensure that bias is adequately controlled during ISV data collection (e.g., via observer notes and evaluations) and during the analysis and evaluation of ISV results performed to determine whether design changes are necessary.

The staff reviewed the V&V IP, Section 4.1, Validation Team, and found that the applicant planned to use members of the HFE design team on the ISV validation team. The staff issued RAI 8758, Question 18-2 (ADAMS Accession No. ML17191A215), to request that the applicant explain how it would minimize bias and ensure the objectivity of the validation team members who are part of the HFE design team.

In the response to RAI 8758, Question 18-2 (ADAMS Accession No. ML18134A353), the applicant justified using some members of the HFE design team and provided revisions to the V&V IP explaining the controls it will apply to minimize bias. The applicant stated the following:

NuScale is taking an exception to NUREG-0711, review criterion 11.4.3.1 with respect to its direction that the team be entirely independent of the HFE design process. Understanding how the design was developed and what it is expected to accomplish brings additional diversity to the validation observation team.

Specifically, those who contributed to the design understand what was intended to be accomplished and can identify where objectives are not met. Two of the observers will be independent of the ISV test design. This alternate approach provides for more diversity within the team, a detailed knowledge of HFE design attributes and functions, a combination of practical and theoretical perspectives, and an enhanced orientation to the challenges that operators face.

The applicant provided a multifaceted strategy for controlling the potential for bias among the validation team that includes the following:

  • The acceptance criteria used to determine whether an HED should be categorized as a Priority 1 HED are objective. Subjective measures are intended to be used only to identify lower level issues.
  • Validation team members are trained and qualified to conduct the ISV in an objective manner. This training will include the specific roles of the two independent observers and their importance in mitigating team bias. Additionally, validation team members who have participated in design activities will receive training on the importance of independent observer input.

  • NuScale plans to preserve independent observer comments as a record for future audit or review.
  • The independent observers have equal participation in the observer conference session, during which notes will be categorized and assigned as HEDs with an initial prioritization.

If an independent observer does not agree with the disposition of an observation comment, the observation team members will table the comment and present it to the NuScale operations senior management for review. Because all notes will be captured and scanned, the original notes are available to compare with the final consensus for discrepancies.

  • An independent individual or group will review the test results and actions to resolve first-priority HEDs to ensure that actions have been properly characterized and dispositioned appropriately.

The staff finds that including team members who have knowledge of the design is reasonable given the applicant's first-of-a-kind, unique design. As stated by the applicant in the response to RAI 8758, Question 18-2, those who contributed to the design understand what was intended to be accomplished and can identify where objectives are not met, thus helping to determine when design changes are needed. The staff finds that the inclusion of two independent team members and the cited training and controls to reduce bias in data collection and analysis provide a reasonable means for controlling potential bias in validation team findings. The staff finds the applicant's method to be an acceptable alternative to Criterion 11.4.3.1(1). RAI 8758, Question 18-2, is a Confirmatory Item pending the addition of the cited information to the V&V IP.

18.10.4.3.2 Test Objectives (Criteria 11.4.3.2(1)-(2))

NUREG-0711, Section 11.4.3.2, Test Objectives, includes two criteria for this topic. The second criterion addresses plant modifications and is not applicable to new reactors; thus, the staff evaluated the first criterion, as discussed below.

Criterion 11.4.3.2(1) states that the applicant should develop detailed test objectives to provide evidence that the integrated system adequately supports plant personnel in safely operating the plant; the criterion includes a list of considerations. The staff compared the list in the V&V IP, Section 4.2, Test Objectives, to the list in Criterion 11.4.3.2(1) and found the lists to be consistent, with one exception: the objective to validate that personnel can effectively transition between the HSIs and procedures in accomplishing their tasks, and that interface management tasks, such as display configuration and navigation, are not a distraction or an undue burden. The staff issued RAI 9414, Question 18-23 (ADAMS Accession No. ML18077A002), to ask the applicant to explain how the final bullet in the list of considerations in Criterion 11.4.3.2(1) is addressed or why it is not applicable to the NuScale design. The response to RAI 9414, Question 18-23 (ADAMS Accession No. ML18121A299), states that the last test objective from NUREG-0711, Criterion 11.4.3.2(1), is applicable to the NuScale design. The applicant provided a revision of the V&V IP, Section 4.2, which includes the objective. RAI 9414, Question 18-23, is a Confirmatory Item pending the addition of the cited information to the V&V IP.

18.10.4.3.3 Validation Testbeds (Criteria 11.4.3.3(1)-(9))

NUREG-0711, Section 11.4.3.3, Validation Testbeds, includes nine criteria for this topic.

18.10.4.3.3.1 Interface (Criteria 11.4.3.3(1)-(3))

Criteria 11.4.3.3(1)-(3) state that the applicant's testbed should represent the complete integrated system (i.e., it should include more than just the HSIs and procedures needed to complete the ISV scenarios), with HSIs and procedures represented with high physical and functional fidelity to the reference design.

The V&V IP, Section 4.3, states that the applicant's validation testbed for the ISV is its control room simulator. The V&V IP, Section 4.3.2, Interface Physical Fidelity, describes the testbed as a replica in form, appearance, and layout of the NuScale MCR design that includes presentation of alarms, displays, controls, procedures, automation, job aids, communications, interface management tools, layout, and spatial relationships. The V&V IP, Section 4.3.1, Interface Completeness, states that the testbed also represents interfaces such as the RSS and LCSs (i.e., communications). Additionally, the V&V IP, Section 4.3.1, states, "The test bed represents a complete and integrated system with HSI and procedures not specifically required in the test scenarios (e.g., alternate procedures)."

The V&V IP, Section 4.3.3, Interface Functional Fidelity, states, "High functional fidelity in the HSI, procedures, and automation is represented so that the HSI functions are available and the HSI component modes of operation, types of feedback, and dynamic response characteristics operate in the same way as the actual plant." Based on the V&V IP, Section 4.3.6, Data Content Fidelity, the staff understands that the term "actual plant" equates to the engineering design of the NuScale plant because a NuScale plant has not yet been built.

Therefore, the staff finds that the information provided in the V&V IP is consistent with Criteria 11.4.3.3(1)-(3).

18.10.4.3.3.2 Environmental Fidelity (Criterion 11.4.3.3(4))

Criterion 11.4.3.3(4) states that the testbed's environmental fidelity should be represented with high physical fidelity to the reference design, including the expected levels of lighting, noise, temperature, and humidity. The V&V IP, Section 4.3.4, states, The test bed is representative of the actual NuScale plant with regard to environmental features such as lighting, noise, temperature, humidity, and ventilation characteristics. In cases where the test bed cannot accurately simulate the environment, the ISV captures human factors engineering issue tracking system (HFEITS) entries for evaluation and resolution. Those environmental aspects of the design that cannot be considered during the ISV are addressed specifically in the Design Implementation element of NUREG-0711. Therefore, the staff finds that the information provided in the V&V IP is consistent with Criterion 11.4.3.3(4).

Open Item 18-7: The staff will verify that the HFEITS identifies any cases for which the testbed cannot accurately simulate the environment when the applicant provides the V&V RSR.

18.10.4.3.3.3 Data (Criteria 11.4.3.3(5)-(7))

Criterion 11.4.3.3(5) states that information and data provided to personnel should completely represent the plants systems they monitor and control. Criteria 11.4.3.3(6)-(7) state that the data content and data dynamics fidelity should be represented with high physical fidelity to the reference design. Specifically, the underlying model should provide input to the HSI such that the information accurately matches what is presented during operations, and the model should also be able to provide input to the HSI so that information flow and control responses occur accurately and within the correct response time (e.g., information should be sent to personnel with the same delays as occur in the plant).

The V&V IP, Section 4.3.5, Data Completeness Fidelity, states, "In the test bed, information and data provided to personnel represent the complete set of plant systems monitored and controlled from that facility." Therefore, the staff finds that the information provided in the V&V IP is consistent with Criterion 11.4.3.3(5).

The V&V IP, Section 4.3.6, states that the alarms, controls, indications, procedures, and automation presented are based on an underlying plant model that accurately reflects the engineering design of the NuScale plant. In addition, it states that the model accurately provides input to the HSI, such that the information matches what is presented during operations. Therefore, the staff finds that the information provided in the V&V IP is consistent with Criterion 11.4.3.3(6).

The V&V IP, Section 4.3.7, Data Dynamics Fidelity, states that the plant model provides input to the HSI such that information flow and control responses occur accurately and within the correct response time. Information is provided to personnel with the same anticipated delays as would occur in the plant. Therefore, the staff finds that the information provided in the V&V IP is consistent with Criterion 11.4.3.3(7).

18.10.4.3.3.4 Important Human Action (Criterion 11.4.3.3(8))

Criterion 11.4.3.3(8) states that for IHAs at complex HSIs remote from the MCR, the applicant should consider the use of a simulator or mockup to verify that the requirements for human performance can be met. The V&V IP, Section 4.3.8, Remote Human-System Interfaces Containing Important Human Actions, states that none of the identified IHAs occur outside of the MCR. The applicant identified IHAs in the TIHA RSR, Section 4.1, Identification of Risk Important Human Actions. The staff confirmed these actions are performed in the MCR.

Therefore, the staff finds that Criterion 11.4.3.3(8) is not applicable.

18.10.4.3.3.5 Verification (Criterion 11.4.3.3(9))

Criterion 11.4.3.3(9) states that the applicant should verify the conformance of the testbed to the required testbed characteristics before validation tests are conducted. It also states that one approach an applicant can use to meet Criteria 11.4.3.3(1)-(7) is to use a testbed that complies with ANSI/ANS-3.5-2009, Nuclear Power Plant Simulators for Use in Operator Training (ANS 3.5).

The V&V IP, Section 4.3.9, Test Bed Conformance, states, "The testbed is verified to conform with the required characteristics before validation tests are conducted." Also, the V&V IP, Section 4.3, states that the fidelity of the validation test bed's models and HSI are verified to represent the current, as-designed NuScale plant prior to use for the validation. The staff issued RAI 9396, Question 18-44 (ADAMS Accession No. ML18204A192), to request that the applicant describe the methods used to verify the conformance of the testbed to Criteria 11.4.3.3(1)-(7), which give the required characteristics for the testbed.

In the response to RAI 9396, Question 18-44 (ADAMS Accession No. ML18170A157), the applicant added Section 4.3.10, ISV Simulator Performance Testing, to the V&V IP to explain that the simulator is verified to conform to the testbed criteria in NUREG-0711 by conducting the real time and repeatability testing, limits of simulation testing, normal evolution testing, malfunction testing, and steady state testing described in ANS 3.5. ANS 3.5, Section 4, Testing Requirements, refers to real time and repeatability tests, limits of simulation tests, normal evolution tests, malfunction tests, and steady state tests as performance tests, which are conducted to evaluate the fidelity of the simulator to the reference plant for a range of normal and abnormal plant events, and it contains the general methodology and acceptance criteria for these performance tests. ANS 3.5, Section 4, explains that the intent of these performance tests is to ensure that no noticeable differences exist between the simulator and the reference plant. ANS 3.5, Section 5.1.1, Utilization of Baseline Data, lists potential sources of reference plant data that may be used for comparison to the simulator response during performance testing in order to evaluate simulator fidelity. When data from the actual plant is not available for comparison, as is the case for the applicant, data generated from engineering analysis and subject matter expertise may be used.
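The baseline comparison at the heart of such performance testing can be pictured with a small sketch: simulator readings for steady-state parameters are compared against baseline values from engineering analysis, and any deviation beyond an acceptance tolerance is flagged as a noticeable difference. The parameter names, values, and the 2 percent tolerance here are assumptions for illustration, not ANS 3.5 requirements or NuScale data.

```python
# Illustrative steady-state performance test check (assumed parameters
# and tolerance): flag simulator values that deviate from the
# engineering-analysis baseline by more than a relative tolerance.

def steady_state_test(simulator, baseline, rel_tol=0.02):
    """Return {parameter: (actual, expected)} for values outside tolerance."""
    deviations = {}
    for param, expected in baseline.items():
        actual = simulator[param]
        if abs(actual - expected) > rel_tol * abs(expected):
            deviations[param] = (actual, expected)
    return deviations

baseline = {"rcs_pressure_psia": 1850.0, "sg_level_pct": 62.0}   # from analysis
simulator = {"rcs_pressure_psia": 1848.0, "sg_level_pct": 65.5}  # simulator run
print(steady_state_test(simulator, baseline))
# -> {'sg_level_pct': (65.5, 62.0)}
```

Flagged deviations would then be assessed (and corrected or justified) before the testbed is used for validation, analogous to the deviation assessment described in ANS 3.5.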

The applicant also added Section 4.3.11, Scenario-Based Testing, to the V&V IP to explain that scenario-based testing (SBT) is also conducted prior to the ISV for all scenarios. SBT is conducted by determining a set of key plant parameters to be evaluated and ensuring those parameters behave as expected for the developed ISV scenarios. The scenarios are then conducted in real time, using the procedures available in the simulator that were developed from the task analyses, to perform the scenario events. The applicant also included a proprietary list of items that are evaluated during SBT. The staff reviewed the list and found that the applicant's SBT method is consistent with the method of SBT described in Nuclear Energy Institute 09-09, Nuclear Power Plant-Referenced Simulator Scenario Based Testing Methodology, Revision 1, dated December 8, 2009 (NEI 09-09), which the staff endorsed in RG 1.149, Revision 4. ANS 3.5, Section 4.4.3.2, Simulator scenario-based testing, explains that the purpose of SBT is to ensure that, for a given scenario, the simulator is capable of producing the expected reference unit response without significant performance discrepancies or deviation from an approved scenario sequence, and that the appropriate cues and operator actions are included in the scenario.
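One way to picture the sequence check that SBT performs is as a comparison of the simulator's event log for a scenario run against the ordered event sequence in the approved scenario guide. The events, timings, and function below are hypothetical illustrations, not the applicant's proprietary SBT items.

```python
# Illustrative sketch (assumed events, not NuScale scenario data):
# verify the simulated event sequence preserves the order of events
# expected by the approved scenario guide.

def scenario_based_test(event_log, expected_sequence):
    """Return True if every expected event occurs, in order, in the run.

    event_log: ordered list of (time_s, event) pairs from the simulator
    expected_sequence: ordered list of events from the scenario guide
    """
    # Membership tests against a single iterator consume it, so this
    # checks that expected_sequence is a subsequence of the log.
    it = iter(event for _, event in event_log)
    return all(expected in it for expected in expected_sequence)

run = [(0, "loss_of_feedwater"), (12, "low_sg_level_alarm"),
       (30, "reactor_trip"), (45, "dhrs_actuation")]
guide = ["loss_of_feedwater", "reactor_trip", "dhrs_actuation"]
print(scenario_based_test(run, guide))  # -> True (order preserved)
```

A real SBT also checks that key parameter values behave as expected at each event; this sketch covers only the sequence aspect.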

The staff concludes the following:

  • The applicant intends to conduct simulator performance testing prior to ISV, which includes a broader range of normal and abnormal operations than those included in the ISV scenarios. HSIs and procedures beyond those used in the ISV scenarios must be available in the simulator in order to satisfactorily complete the performance testing. Therefore, the applicant's method is acceptable to verify completeness of the testbed (i.e., that it includes HSIs and procedures not specifically needed in the ISV scenarios).
  • The applicant plans to conduct task support verification and HFE design verification prior to ISV testing. Task support verification is conducted to verify that the HSIs and procedures in the testbed satisfy the HSI design requirements identified by the applicant's task analysis. Design verification is conducted to verify that the HSIs, including procedures, conform to the HFE design guidelines in the applicant's Style Guide. Design verification also includes verification of the environmental conditions in the control room (e.g., noise, lighting, temperature). Completing task support verification and design verification ensures that the simulator's HSIs and procedures have physical fidelity to the HSI design that resulted from the applicant's HFE design process.

Performing design verification also verifies the environmental fidelity of the testbed.

  • The applicant's methods for conducting simulator performance testing and SBT are comparable to methods the staff has previously found acceptable for verifying that a simulator has a high degree of fidelity to its reference plant. During performance testing, the simulated plant parameters, which receive input from the underlying plant model, are verified to respond as expected in the reference plant for those conditions. During SBT, the specific HSIs and procedures included in the ISV scenarios are used and verified to provide the expected feedback, and the simulator is verified to provide the expected plant response to operator input. Therefore, the applicant's method to verify the HSI and procedure functionality fidelity, data completeness fidelity, data content fidelity, and data dynamics fidelity of the testbed is acceptable.

Therefore, the staff concludes that the applicant identified sufficient methods to verify that the testbed conforms to Criteria 11.4.3.3(1)-(7), and Criterion 11.4.3.3(9) is met. RAI 9396, Question 18-44, is a Confirmatory Item pending completion of the cited revisions to the V&V IP.

During the July-August 2018 ISV audit (ADAMS Accession No. ML18298A189), the staff reviewed the applicant's procedures for conducting simulator performance testing and scenario-based testing, reviewed a sample of test results, and observed simulator performance during ISV scenarios. The staff's observations are discussed below.

  • Simulator Testing Procedures: The staff reviewed the applicant's procedures for conducting the simulator performance testing and SBT described in the V&V IP. The staff observed that the simulator performance testing procedure referenced the draft version of ANSI/ANS-3.5 that was published in 2017. The NRC has not yet endorsed the 2017 version of the standard. The only significant difference between the performance testing portion of the recent draft and ANS 3.5 is that the draft does not prescribe abnormal events (i.e., malfunctions) that should be included in malfunction testing. Rather, it contains generic guidance for the selection of malfunctions to test that is intended to be applicable to any type of nuclear power plant design. ANS 3.5 prescribed malfunctions to ensure licensees could comply with the operator training requirements in 10 CFR Part 55. The applicant's simulator is not being used at this time to train and examine licensed operators, and the requirements in 10 CFR Part 55 are not applicable to the applicant at this time.

The staff observed that the applicant's procedure for performance testing generally conformed to ANS 3.5, and the applicant identified where exceptions were taken or modifications were made. The staff observed that the applicant made modifications to the guidance in ANS 3.5 in its procedure to address the fact that ANS 3.5 is written specifically for use in operator training and examination programs. The staff also observed that the applicant applied the guidance for malfunction testing in the draft version of the standard. [[ ]]. The staff determined that the applicant's selected malfunction tests represented a range of abnormal plant events that included events not specifically included in the ISV scenarios.

Further, the staff found that the applicant modified the criteria used to determine whether noticeable differences between the simulator and the plant needed to be corrected.

ANS 3.5, Section 4.2.1.4, Assessment of deviations, states that a training needs assessment should be conducted to determine whether noticeable differences impact the actions to be taken by operators; differences that do not impact operator actions or detract from training do not need to be corrected. Because the applicant is using the simulator for ISV testing rather than training, the staff noted that the applicant modified the criteria such that [[ ]]. The staff also reviewed the list of uncorrected simulator discrepancies and determined that the applicant's justifications for not correcting those discrepancies prior to ISV testing were reasonable. For example, the applicant identified that some of the controls on a particular HSI that will be available in the actual main control room are not modeled in the simulator. The staff compared the controls in the simulator to the reference plant design and concluded that the absence of these particular controls from the HSI in the simulator was not likely to affect the decisions made by the ISV participants during testing.

The staff observed the applicant's procedure for scenario-based testing generally conformed to ANS 3.5 and NEI 09-09 as discussed in the V&V IP. The staff found the applicant made a limited number of modifications to the method described in NEI 09-09 to address the fact that NEI 09-09 is written specifically for use in operator training and examination programs at nuclear power plants. For example, NEI 09-09 states that scenario-based testing should be conducted by a crew of instructors certified as senior reactor operators. Because the applicant does not have, nor is it required to have, a licensed operator training program, it is not required to have instructors certified as senior reactor operators. The applicant identified personnel with [[ ]] as the performers of the scenario-based testing. The staff concluded these personnel were sufficiently knowledgeable of the design and the simulator to conduct the SBT.

  • Test Results: The staff observed that the applicant documented the completion of performance testing and scenario-based testing prior to commencing ISV testing in accordance with its procedures. The test results showed the tests were completed satisfactorily, and in some cases, the applicant documented discrepancies that were identified during testing. In these cases, the applicant provided justification for why the discrepancies did not need to be corrected prior to ISV testing. The staff observed the justifications were made consistent with the applicant's procedures.
  • Simulator Performance during ISV Testing: The staff observed simulator performance during ISV testing for a sample of the scenarios. The applicant's scenario guides described the expected plant response for each scenario event. The staff observed that the simulator demonstrated the expected plant response documented in the scenario guides. Also, the staff compared the expected plant response documented in the scenario guides to the descriptions of plant systems and the accident analyses in the DCA Part 2 for a sample of events and found that the expected plant response in the scenario guides was consistent with the plant system response or accident analysis documented in the DCA Part 2. Additionally, the staff discussed the simulator model with the applicant during the audit. The staff observed that the applicant used industry standard models to simulate plant system response.

Therefore, the staff concludes the applicant conducted simulator performance testing and SBT prior to ISV testing as described in the V&V IP.

Open Item 18-8: The staff will confirm the applicant completed the other activities (i.e., task support verification and design verification) that verify those testbed-required characteristics not addressed by simulator performance testing and SBT when the applicant provides the V&V RSR.

18.10.4.4 Plant Personnel (Criteria 11.4.3.4(1)-(4))

NUREG-0711, Section 11.4.3.4, Plant Personnel, includes four criteria for this topic.

18.10.4.4.1 Participant Sample Composition (Criteria 11.4.3.4(1), (2), and (4))

Criteria 11.4.3.4(1), (2), and (4) address the composition of the sample of those who will participate in the applicant's validation tests:

  • Participants should be representative of the plant personnel who will interact with the HSI.
  • The sample of participants should reflect the characteristics of the population from which it is drawn. Those characteristics expected to contribute to variations in system performance should be specifically identified; the sampling process should reasonably assure that the validation encompasses variation along that dimension. Determining representativeness should include considering the participants' license type and qualifications, skill and experience, age, and general demographics.
  • Bias in the sample of participants should be prevented by avoiding the use of participants who (1) are members of the design organization, (2) participated in prior evaluations, and (3) were selected for some specific characteristic, such as crews identified as good performers or more experienced.

The intent of Criteria 11.4.3.4(1), (2), and (4) is twofold: to ensure that those participating in the ISV testing are representative of those who will eventually operate the real plant and to ensure that bias is adequately controlled, such that the sample does not consist only of good performers or of personnel with some unfair advantage, such as superior knowledge of the design, either of which could bias the results.

The V&V IP, Section 4.4, describes the participants, stating that individual operating crews participating in the ISV may be previously licensed commercial reactor operators or senior reactor operators, operators with Navy nuclear experience, or design engineering staff members familiar with the NuScale Power plant design. Section 4.4 states that crew members are selected and distributed across crews with consideration for age, gender, education level, and experience. Because the applicant cited members of the design engineering staff as potential ISV participants, the staff issued RAI 9371, Question 18-24 (ADAMS Accession No. ML18077A001), to clarify how these participants are representative of the anticipated plant personnel who will interact with the HSI, how bias is prevented, and whether participants have participated in prior evaluations (e.g., SPV).

Regarding representativeness, the applicant provided the following information in the response to RAI 9371, Question 18-24 (ADAMS Accession No. ML18101B398):

The selected individuals are categorized into three groups based on their previous experience.

  • Previously licensed commercial nuclear power plant operators (either SRO or RO) (nine individuals)
  • Previously non-licensed commercial nuclear power plant operators or Navy nuclear plant operators (eight individuals)
  • Engineering degree with no previous operating experience (five individuals)

This is representative of the pool of plant personnel expected to operate in a NuScale control room and is consistent with the types of operators currently found in the existing nuclear industry (reference ACAD 10-001, Guidelines for Initial Training and Qualification of Licensed Operators).

Because the ISV test participants have qualifications consistent with those in ACAD 10-001, which contains education and experience guidelines for operator license applicants that are equivalent to those in Regulatory Guide 1.8, Qualification and Training of Personnel for Nuclear Power Plants,4 the staff finds the ISV participants were representative of the personnel expected to operate the plant (i.e., licensed operators).

The applicant's response to RAI 9371 also said that two of the ISV participants have had previous involvement with NuScale. One is a NuScale employee who has not been involved with the HSI or control room design or any prior evaluations; therefore, the staff determined this individual would not have a significant amount of additional experience with or knowledge of the NuScale control room design that would bias the test results. The other individual participated in the staffing plan validation (SPV). Because the SPV occurred earlier in the design process, the SPV testing was limited to a few scenarios, the training program for SPV participants was relatively short compared to the training provided to ISV participants and focused on the information participants would need to perform the SPV scenarios, the participant was not involved with any activities conducted after the SPV to finalize the HFE design, and all ISV test participants completed a training program prior to the ISV that addressed a broad range of NuScale control room operations, the staff determined that this participant's prior experience with the SPV was not likely to significantly bias the ISV results. Furthermore, having only one ISV participant who participated in the SPV is not likely to significantly bias test results because the ISV test results are derived from multiple sets of data collected from all ISV participants. Thus, the staff finds that the sample of participants is representative of actual plant personnel and that bias is adequately controlled within the sample. The staff finds that the application conforms to Criteria 11.4.3.4(1), (2), and (4).

4 Refer to NUREG-1021, Operator Licensing Examination Standards for Power Reactors, Section ES-202.B, Background.

18.10.4.4.2 Shift Staffing Levels (Criterion 11.4.3.4(3))

Criterion 11.4.3.4(3) states that the applicant should consider the minimum shift staffing levels, nominal levels, and maximum levels (including shift supervisors, ROs, shift technical advisors, and similar positions) when selecting personnel for participating in validation tests.

The V&V IP, Section 4.4, states the following:

Operating crew size for the validation tests includes a range of expected sizes to ensure that the HSI supports operations and event management. This range includes the minimum operating crew, nominal levels, and higher levels as defined during the staffing and qualifications program element NuScale Human Factors Engineering Staffing and Qualifications Results Summary Report (Reference 8.2.3) for a range of plant operating modes.

The applicant specified that the ISV includes at least one scenario with more than the defined minimum crew staffing. Minimum and nominal staffing for the NuScale plant are synonymous.

As part of an audit in June 2018 (ADAMS Accession No. ML18208A370), the staff reviewed a sample of the finalized scenario guides and observed that crew size is considered within the set of ISV scenarios. Thus, the staff finds that the application conforms to Criterion 11.4.3.4(3).

18.10.4.4.2.1 Performance Measurement (Criteria 11.4.3.5.1(1)-(6))

18.10.4.4.2.1.1 Types of Performance Measures (Criteria 11.4.3.5.1(1)-(6))

NUREG-0711, Section 11.4.3.5.1, Types of Performance Measures, includes six criteria for this topic. Criteria 11.4.3.5.1(1)-(6) state that the applicant should identify plant performance measures, primary task measures, secondary task measures, measures of situation awareness, workload measures, and anthropometric and physiological measures for each ISV scenario.

Criterion 11.4.3.5.1(2) also states that for primary task measures, the applicant should identify the primary tasks so that primary task measures can be developed, primary task measures should reflect the aspects of the task that are important to performance, and the performance of primary tasks should be evaluated to identify errors of commission and omission.

In the V&V IP, Section 4.5.1.2, Personnel Task Performance Measures, the applicant provided its general approach to primary task measurement, which includes (1) identifying tasks that personnel are required to perform, (2) choosing measures to evaluate task performance that reflect those aspects of the task that are important to system performance, and (3) comparing actual and expected actions to identify errors of omission and commission. The applicant listed possible aspects of the task, such as time, accuracy, and frequency, to be measured. The applicant also explained that the complexity of the task will influence the performance measures that are selected such that more complex tasks may be assessed using more detailed performance measures. During the July-August 2018 ISV audit, the staff reviewed a sample of ISV scenario guides and observed that primary task measures were identified for each scenario.

The staff also observed that the primary task measures reflected the aspects of the task that were most important to performance. Therefore, the staff concludes the applicant identified adequate primary task measures consistent with Criterion 11.4.3.5.1(2).

Open Item 18-9: The staff will confirm the applicant compared actual actions to expected actions for primary tasks in order to identify errors of commission or omission when the applicant provides the V&V RSR.

The V&V IP, Section 4.5, Performance Measures, states that performance measures include measures of plant performance. During the July-August 2018 ISV audit, the staff reviewed the applicant's ISV Test Plan and observed that the applicant identified general plant performance measures for all ISV scenarios. For example, for an ISV scenario to be considered successful, [[ ]]. Therefore, the staff concludes the applicant identified plant performance measures consistent with Criterion 11.4.3.5.1(1).

The V&V IP, Section 4.5.1.2, also addresses secondary task measures and states, "For each scenario, tasks that personnel are required to perform are identified and assessed. Primary and secondary personnel tasks are evaluated. Secondary task performance measures reflect the workload associated with HSI manipulations associated with maintaining the overall plant. Test personnel evaluate secondary tasks in conjunction with primary tasks to observe effects on overall performance and workload both at individual and operating crew level."

During the July-August 2018 ISV audit, the staff reviewed a sample of ISV scenarios and observed that specific secondary task measures were identified for each scenario, and the measures were appropriate for the event being addressed. Therefore, the staff concludes the applicant identified secondary task measures consistent with Criterion 11.4.3.5.1(3).

The V&V IP, Section 4.5.1.5, Anthropometric and Physiological Factor Performance Measures, states, "The primary purpose of anthropometric and physiological performance measures during ISV is to assess those aspects of the design that cannot be evaluated during design verification. Anthropometric and physiological performance measures evaluate how well the HSI supports plant personnel in monitoring and control of the plant. Many of these design aspects are assessed as part of verifying the HFE design. Therefore, the focus is on those areas of the design that only can be addressed by testing the integrated system, e.g., the ability of personnel to effectively use the various controls, displays, workstations, or consoles while performing their tasks."

The staff understands that much of the acceptability of a design is assessed during design verification to ensure that the design conforms to the design-specific HFE guidelines. Thus, the staff agrees that measuring just those aspects that assess how well the design supports dynamic operation during ISV is appropriate. The V&V IP, Section 4.5.1.5 also identifies the types of anthropometric and physiological performance measures for the ISV scenarios.

Therefore, the staff concludes the applicant identified anthropometric and physiological measures consistent with Criterion 11.4.3.5.1(6).

The staff issued RAI 9395, Question 18-47 (ADAMS Accession No. ML18144A470), requesting the applicant identify the measures of situation awareness and workload for the ISV. In its response to RAI 9395, Question 18-47 (ADAMS Accession No. ML18201A351), the applicant revised the V&V IP to include the following information regarding workload and situation awareness measures:

  • Workload: The applicant explained that both the NASA TLX Workload Questionnaire and [[ ]] are used to measure workload. NASA TLX is a validated and accepted workload measurement in the nuclear domain. Both NASA TLX and [[ ]] are subjective in nature; however, primary and secondary task performance can provide objective indications of workload (e.g., failure of the ISV participants to perform a task may be indicative of high workload). The use of multiple workload metrics provides a more comprehensive assessment of workload. Multiple measures also allow one to look for convergent evidence regarding workload levels and help identify any issues that need to be addressed. Therefore, the staff concludes the applicant identified workload measures consistent with Criterion 11.4.3.5.1(5).

  • Situation Awareness: The applicant described that situation awareness is measured in multiple ways including a questionnaire, which is an explicit measure as described in NUREG/CR-7190, and subject matter expert observations, which is an implicit measure as described in NUREG/CR-7190. The applicant provided sample questions for the questionnaire. The applicant also stated that the scenarios introduce primary tasks, external tasks unrelated to the primary task, and embedded tasks to measure situational awareness. Subject matter expert (SME) observations are used to assess the human-system interface (HSI) effectiveness in supporting the operator's situational awareness during task performance. The use of multiple measures of situation awareness provides a higher degree of confidence that situation awareness has been assessed in a comprehensive way and allows one to look for convergent evidence that adequate situation awareness is present or may indicate where situation awareness issues exist.

Therefore, the staff concludes the applicant identified situation awareness measures consistent with Criterion 11.4.3.5.1(4).

The staff reviewed the workload and situation awareness questionnaires for a sample of scenarios at the ISV audit (ADAMS Accession No. ML18298A189). The staff determined that the situation awareness questions were appropriate to assess the ISV test participants' understanding of the plant's condition. The staff also observed that the content of the NASA TLX questionnaires conformed to the standard NASA TLX methodology.
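For context, the standard NASA TLX methodology referenced above combines six subscale ratings (each 0-100) into an overall workload score using weights derived from 15 pairwise comparisons. The sketch below is a generic illustration of that published scoring scheme, not the applicant's implementation; all ratings and weights shown are hypothetical.

```python
# Generic sketch of weighted NASA TLX scoring (illustrative only; the
# example ratings and weights are hypothetical, not from the ISV).

SUBSCALES = ["Mental Demand", "Physical Demand", "Temporal Demand",
             "Performance", "Effort", "Frustration"]

def nasa_tlx_score(ratings, weights):
    """Overall workload: weighted mean of the six 0-100 subscale ratings.

    The weights come from 15 pairwise comparisons, so they sum to 15.
    """
    assert set(ratings) == set(SUBSCALES) == set(weights)
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical post-scenario responses for one participant
ratings = {"Mental Demand": 70, "Physical Demand": 20, "Temporal Demand": 60,
           "Performance": 40, "Effort": 65, "Frustration": 30}
weights = {"Mental Demand": 5, "Physical Demand": 0, "Temporal Demand": 4,
           "Performance": 2, "Effort": 3, "Frustration": 1}
print(round(nasa_tlx_score(ratings, weights), 2))
```

A single composite number like this supports comparison against a maximum workload threshold of the kind the staff observed the applicant established.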

RAI 9395, Question 18-47, is a Confirmatory Item pending completion of the cited revisions to the V&V IP.

18.10.4.4.2.1.2 Performance Measure Information and Validation Criteria (Criteria 11.4.3.5.2(1)-(5))

NUREG-0711, Section 11.4.3.5.2, Performance Measure Information and Validation Criteria, includes five criteria for this topic. Criteria 11.4.3.5.2(1)-(5) state that, for each performance measure, the applicant should describe how and when it is obtained, describe its characteristics, identify the criteria used to judge its acceptability and the basis for the criteria, and identify whether it is a pass/fail or diagnostic measure.

The V&V IP, Section 4.5.1.1, Plant Performance Measures, describes when plant performance measures are obtained as follows:


Plant performance resulting from operator action or inaction includes plant process data (e.g., temperature, pressure) and component status (e.g., on/off; open/closed) as a function of time at as many locations in the plant simulation as is possible. These data are obtained from the entire plant: nuclear, fluid, structural, and electrical components. The test bed has the ability to record all plant process data and component status (including state changes) for the full length of any ISV Scenario.

The V&V IP, Section 4.5.2.1, Collection Methods, explains how plant performance measures are obtained and states, "Objective data (e.g., video recording, administrator observations) collected during test scenarios are analyzed to assess impacts of operator actions on plant processes and equipment states. The analysis compares the performance derived from parameters and times collected by the test bed to the evaluation criteria for operator actions and for overall plant process behavior developed for each scenario."

Therefore, the staff concludes the applicant explained how and when the plant performance measures are obtained, which is consistent with Criteria 11.4.3.5.2(1)-(2).

The V&V IP, Section 4.5.1.2, states, "Test personnel evaluate secondary tasks in conjunction with primary tasks to observe effects on overall performance and workload both at individual and operating crew level." The V&V IP, Section 4.1, also states, "Objective performance measures and success criteria are developed as part of the methodology and listed within the scenario guides used for the conduct of ISV tests." During the July-August 2018 ISV audit, the staff observed the ISV scenario guides identified when each primary task should be completed such that the validation team could observe whether actual performance was consistent with expected performance. Therefore, the staff concludes the applicant explained how and when the personnel task performance measures are obtained, which is consistent with Criteria 11.4.3.5.2(1)-(2).

The V&V IP, Section 4.5.1.5 identifies when and how the anthropometric and physiological measures are obtained. The V&V IP, Section 4.5.1.5 states,

[[ ]].

The V&V IP, Section 4.5.2.1, Collection Methods, states, "Operator feedback on the HSI is collected via scenario debriefs and questionnaires. Both types of operator feedback include scale rating questions and open feedback (long answer) questions." Thus, the ISV participants have the ability to document any anthropometric or physiological concerns. Therefore, the staff concludes the applicant explained how and when the anthropometric and physiological measures are obtained, which is consistent with Criteria 11.4.3.5.2(1)-(2).

The staff issued RAI 9395, Question 18-47 (ADAMS Accession No. ML18144A470), to request the applicant clarify when and how measures of situation awareness and workload are obtained and also describe the characteristics for each performance measure, identify the criteria used to judge its acceptability and the basis for the criteria, and identify which measures are pass/fail and diagnostic measures. The applicant's response to RAI 9395, Question 18-47 (ADAMS Accession No. ML18201A351), included a proposed revision to the V&V IP to include the following information:

  • When and how situation awareness and workload measures are obtained. The applicant explained that the questionnaire used to measure situation awareness is administered [[ ]]. The applicant also stated that the NASA TLX is administered [[ ]].

During the July-August 2018 ISV audit, the staff observed the [[ ]]. The staff observed that intrusiveness was minimal, and there were no observable impacts on the ISV participants' actions during the scenarios as a result of administering the situation awareness questionnaire or the NASA TLX. Therefore, the staff concludes the applicant identified how and when situation awareness and workload measures are obtained, which is consistent with Criteria 11.4.3.5.2(1)-(2).

  • The characteristics for each performance measure, the associated criterion used to judge the acceptability of performance, and whether the measure was pass/fail or diagnostic. The staff determined that the applicant appropriately characterized the performance measures in accordance with the guidance in NUREG-0711, Table 11-1, Characteristics of Performance Measures. The staff also found the applicant appropriately identified those performance measures that are pass/fail measures and those that are diagnostic measures such that the pass/fail measures are adequate to determine whether the HFE design should be accepted or not (i.e., whether the design is validated or not), and the diagnostic measures are adequate to facilitate the analysis of human performance errors and HEDs.

Additionally, the staff reviewed the applicant's criteria selected to judge the acceptability of performance for each measure and the basis for the acceptance criteria provided in Appendix A of the revised V&V IP. During the ISV audit, the staff observed the primary task performance criteria were identified in the scenario guides, and the criteria for these performance measures were appropriately based on administrative requirements, technical specifications, equipment operating limits, and PRA assumptions about task performance, as applicable to the particular primary task. The staff found the acceptance criteria for primary tasks included criteria that would ensure safe plant performance, and thus the plant performance criteria were tied to primary task performance criteria. The staff also found the acceptance criteria for secondary task performance are adequate to evaluate the demands of HSI use such that the effectiveness of the HSI in supporting personnel in task performance can be determined.

The criteria for anthropometric and physiological measures are based on [[ ]], which is used to evaluate [[ ]], and to [[ ]]. Given the subjective nature of these measures, the use of [[ ]] is sufficient to evaluate the significance of any anthropometric or physiological issues and their potential impact on human performance.

The criteria for SA are based on a [[ ]], and the criteria for workload are based on [[ ]].

During the ISV audit, the staff observed the applicant established a minimum numerical threshold for acceptable SA. The staff finds the applicant's minimum numerical value is a reasonable threshold for triggering a more in-depth evaluation to understand whether there is an issue with the HFE design. The staff also observed the applicant established a maximum threshold for workload, which is consistent with that described in the BNL Tech Report, Section 4.3.

Therefore, the staff concludes the applicant described the characteristics for each performance measure, identified the criteria used to judge its acceptability and the basis for the criteria, and identified which measures are pass/fail and diagnostic measures, which is consistent with Criteria 11.4.3.5.2(3)-(5).

RAI 9395, Question 18-47, is a Confirmatory Item pending the incorporation of the revisions into the V&V IP.

18.10.4.4.2.2 Test Design

18.10.4.4.2.2.1 Scenario Sequencing (Criteria 11.4.3.6.1(1)-(2))

NUREG-0711, Section 11.4.3.6.1, includes two criteria for this topic. The criteria state that the applicant should balance (1) scenarios across crews to provide each crew with a similar, representative range of scenarios and (2) the order of presentation of scenarios to crews to provide reasonable assurance that the scenarios are not always presented in the same sequence (e.g., the easy scenario is not always used first). The V&V IP, Section 4.6.1, Scenario Sequencing, discusses the applicants sequencing for validation testing and states the following:

The scenario performance sequence is developed using the following guidance:

  • Equalize the opportunity for testing among all participants.
  • Vary the types of scenarios within the sequence, such that all are not easy at first and then progress to hard.

Therefore, the staff concludes the applicant's method is consistent with Criteria 11.4.3.6.1(1)-(2). As part of an audit in June 2018 (ADAMS Accession No. ML18208A370), the staff reviewed the assignment of crews to ISV scenarios and found it conformed to Criteria 11.4.3.6.1(1)-(2).
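For illustration, one common way to satisfy both sequencing criteria is a rotated (Latin-square style) assignment, in which every crew receives the same set of scenarios but in a different order. The sketch below is a generic example, not the applicant's actual crew assignments; the scenario names and crew count are hypothetical.

```python
# Rotated (Latin-square style) assignment: every crew sees the same
# scenario set (balanced coverage) while the presentation order varies
# across crews (no crew always starts with the easy scenario).

def assign_scenarios(scenarios, n_crews):
    """Return one run order per crew; crew i starts i positions into the list."""
    return [scenarios[i:] + scenarios[:i] for i in range(n_crews)]

# Hypothetical scenario mix spanning a range of difficulty
scenarios = ["easy", "moderate", "hard", "severe"]
for crew, order in enumerate(assign_scenarios(scenarios, 4), start=1):
    print(f"Crew {crew}: {order}")
```

Rotation is only one option; randomized orders with constraints serve the same purpose, provided each crew still receives a representative range of scenarios.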

18.10.4.4.2.2.2 Test Procedures (Criteria 11.4.3.6.2(1)-(2))

NUREG-0711, Section 11.4.3.6.2, includes two criteria for this topic. Criteria 11.4.3.6.2(1)-(2) state that the applicant should use detailed, unambiguous procedures to govern the conduct of the tests, and the test procedures should minimize the opportunity for bias in the test personnel's expectations and in the participants' responses. Criterion 11.4.3.6.2(1) also lists the information that should be included in the test procedures in order to develop detailed and unambiguous test procedures.

The V&V IP, Section 4.6.2, Test Procedures, states that prior to ISV, test procedures are prepared to manage tests, assure consistency, control test bias, support repeatable results, and focus the test on the specific scenario objectives.

In the V&V IP, Section 4.6.2, the applicant also stated:

The test observers/administrators use the test procedures to set up each scenario, manage the scenario, and analyze the test results. ISV test procedures are designed to minimize the introduction of bias by both observers/administrators and operating crews. A standardized scenario template is part of the test procedure.

The staff reviewed the applicant's detailed test procedures during an audit from July 25, 2017, through February 14, 2018 (ADAMS Accession No. ML18135A049) and found the test procedures address the information listed in Criterion 11.4.3.6.2(1). Additionally, the staff found the test procedures contained detailed and standardized instructions that minimize the opportunity for bias in the validation team members' expectations and in the ISV participants' responses and that govern the interaction between the validation team and the ISV participants. For example, the procedures included pre-determined, scripted cues from the test personnel and responses to participants' questions in order to maintain consistency across scenario trials. Additionally, the procedures contained instructions for the validation team to minimize intrusiveness and impact on the ISV participants during testing. Also, the procedures included provisions for limiting bias in the ISV participants' responses that could result if the participants had prior knowledge of the scenario content and expected outcomes.

During the ISV audit, the staff observed that the applicant followed its ISV test procedures with one exception. The staff observed the applicant continued an ISV scenario after stopping the scenario to address a simulator issue for an amount of time that exceeded the time limit specified in the applicant's procedure. The applicant explained that because the issue occurred after a predetermined data collection point, all necessary ISV data had been collected, and it was feasible to restart the scenario at the point where the scenario had been paused for data collection. The staff did not observe any significant impact on the test as a result of the applicant resuming the scenario following a delay that exceeded the time limit identified in its ISV test procedure.

Therefore, the staff concludes the applicant developed and used detailed, unambiguous procedures to govern the conduct of the tests, which is consistent with Criteria 11.4.3.6.2(1)-(2).

18.10.4.4.2.2.3 Training Test Personnel (Criterion 11.4.3.6.3(1))

NUREG-0711, Section 11.4.3.6.3, includes one criterion for this topic. This criterion states that the applicant should train test personnel and lists topics that the training should cover. The V&V IP, Section 4.6.3, Training Test Personnel, states that the observers and administrators are trained and qualified on NuScale plant systems, the HSI, and the ISV test procedures. The training program consists of both classroom training and time in the testbed simulator.

The training programs have the following stated goals:

  • assuring familiarity with test procedures and scenarios
  • reducing bias and errors that may be introduced by the observers and administrators as a result of test-based learning, failure to follow the test procedure, or incorrect interaction with the operating crew
  • showing how to use the test procedure
  • documenting each test, including the following:

- where the test did not follow the scenario

- problems that occur during testing, even if they resulted from an oversight or error of those conducting the test

  • conveying the necessity of limiting observer and administrator interaction with test personnel to that in the scenario description
  • showing how to conduct postscenario debriefings
  • assuring familiarity with HFE data collection tools and techniques
  • assuring familiarity with observation techniques, goals, and responsibilities specific to each observer's role

The staff finds that the items listed in the V&V IP are consistent with those in Criterion 11.4.3.6.3(1). Thus, the staff finds that the application conforms to this review criterion.

18.10.4.4.2.2.4 Training Participants (Criteria 11.4.3.6.4(1)-(2))

NUREG-0711, Section 11.4.3.6.4, Training Participants, includes two criteria for this topic.

Criterion 11.4.3.6.4(1) states that participants should be trained such that there is reasonable assurance that the participants' knowledge of the plant's design, operations, and use of the HSIs and procedures represents that of experienced plant personnel. Participants should not be trained specifically to carry out the selected validation scenarios. The V&V IP, Section 4.6.4, Training Participants, describes the applicant's participant training program. The staff reviewed the V&V IP, Section 4.6.4, and compared it to Criterion 11.4.3.6.4(1).

The V&V IP, Section 4.6.4, states the following:

Test participants undergo training similar to that which plant operators receive including conduct of operations, plant systems, HSI, plant events, and operating procedures. Test participants are not trained specifically on the scenarios in which they will participate.

In addition, the staff was able to audit the ISV test plan, which provided more details about the participant training. The staff found the training approach acceptable and that the application conforms to Criterion 11.4.3.6.4(1).

Criterion 11.4.3.6.4(2) states that, to assure that the participants' performance is representative of plant personnel, the applicant's training of participants should result in near-asymptotic performance and should be tested for such before conducting the validation. The staff reviewed the V&V IP, Section 4.6.4, which discusses the applicant's participant training, and compared it to Criterion 11.4.3.6.4(2).

The V&V IP, Section 4.6.4, states the following:

To assure near-asymptotic performance and a consistent level of proficiency between individuals making up the operating crews, only participants who have successfully completed the training program and have reached an acceptable level of proficiency are considered to be qualified for operating crew assignment.

The staff issued RAI 9397, Question 18-19, (ADAMS Accession No. ML18069A000), requesting clarification from the applicant about how it would assess the participants' proficiency level before validation testing.

The response to RAI 9397, Question 18-19 (ADAMS Accession No. ML18101B177), describes that training consists of 9 weeks of classroom training to provide an in-depth level of knowledge of the NuScale design. Throughout this training, participants take periodic written examinations to ensure a consistent baseline knowledge level within the ISV participant group. Remediation is provided to address knowledge deficiencies. The applicant also described that participants receive 10 weeks of simulator training, focusing on the conduct of operations and plant operations, in which the ISV participants are assessed through periodic monitored dynamic simulator scenarios, including a final audit examination that is administered similarly to an ISV examination scenario. The ISV test team and NuScale management review the examination scores and determine whether participants have successfully completed the training program and are considered qualified for operating crew assignment. Therefore, the staff finds that the application acceptably conforms to Criterion 11.4.3.6.4(2).

18.10.4.4.2.2.5 Pilot Testing (Criteria 11.4.3.6.5(1)-(2))

NUREG-0711, Section 11.4.3.6.5, Pilot Testing, includes two criteria for this topic. These criteria state that the applicant (1) should conduct a pilot study before the validation tests begin to offer an opportunity for the applicant to assess the adequacy of the test design, performance measures, and data collection methods and (2) should not use participants in the pilot testing who will then participate in the validation tests. The staff reviewed the V&V IP, Section 4.6.5, Pilot Testing, which describes the applicants pilot testing.

The V&V IP, Section 4.6.5, states that the applicant will conduct a pilot test before the ISV in order to do the following:

  • Assess the adequacy of test design, performance measures, and data collection methods.
  • Give the observers and administrators experience in running the test.
  • Ensure that the ISV runs smoothly and correctly.

The V&V IP also states that a test operating crew, which does not participate in the ISV, will conduct the pilot testing.

The staff finds that the application conforms to these criteria.

18.10.4.4.2.3 Data Analysis and Human Engineering Discrepancy Identification (Criteria 11.4.3.7(1)-(7))

NUREG-0711, Section 11.4.3.7, Data Analysis and HED Identification, includes seven criteria for this topic.

18.10.4.4.2.3.1 Analysis Methods (Criteria 11.4.3.7(1)-(2))

Criteria 11.4.3.7(1)-(2) state that the applicant should (1) use a combination of quantitative and qualitative methods to analyze data, such that the analysis should reveal the relationship

between the observed performance and the established performance criteria, and (2) discuss the method by which data are analyzed across trials and include the criteria used to determine successful performance for a given scenario.

In the V&V IP, Section 4.7, the applicant stated the following:

Test data are analyzed using both quantitative and qualitative methods. The analysis identifies the relationship between the observed and measured performance and the established acceptance criteria described in Section 4.5.2.

Therefore, the staff concludes the applicant's data analysis methods conform to Criterion 11.4.3.7(1).

Open Item 18-10: The staff will confirm the applicant used both quantitative and qualitative data analysis methods upon receipt of the V&V RSR.

The V&V IP, Section 4.7, also states the following:

Data are analyzed for each scenario across multiple trials. The method of analysis, consistency of measure assessing performance, and criteria used to determine successful performance for a given scenario is determined by the HFE Design Team.

Although the applicant committed to analyzing data across trials, it did not provide any information on the methodology or on the criteria used to determine successful performance for a given scenario. The staff issued RAI 9399, Question 18-35 (ADAMS Accession No. ML18082B396), to ask the applicant to describe the method(s) that will be used to analyze data across trials and the criteria that will be used to determine successful performance. The applicant's supplemental response to RAI 9399, Question 18-35 (ADAMS Accession No. ML18249A421), provides both revisions to the V&V IP and specific proprietary examples of trending techniques across trials.

The applicant stated, "Data is collected from multiple sources including crew debriefs, observer debriefs, NASA TLX questionnaires, Situational Awareness questionnaires, and management observations. The data is collected and added to a database where an HFE Subject Matter Expert (SME) and an Operations SME bin and code the performance data and then independently identify significant issues and trends within the data. This analysis compares and contrasts data sources, data across crews, data across trials, and data across scenarios. The HFE and Operations SMEs then collaborate on trending results and Human Engineering Discrepancy (HED) identification."

Additionally, in the response to RAI 9399, Question 18-35 (ADAMS Accession No. ML18137A584), the applicant identified the specific criteria used to determine successful performance for a given scenario. Because the applicant intends to evaluate the data collected from all scenario trials, the staff finds the applicant's data analysis methodology acceptable to analyze data from multiple trials and to assess the success of each scenario, which is consistent with Criterion 11.4.3.7(2). RAI 9399, Question 18-35, is being tracked as a Confirmatory Item pending the incorporation of the changes into the next revision of the DCA Part 2.
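For illustration only, the binning-and-trending idea described in the quoted response can be sketched in a few lines. All issue codes, crews, and trial data below are hypothetical placeholders, not ISV data or NuScale terminology:

```python
# Hypothetical sketch: observations are binned by issue code, and any code
# seen in more than one trial or by more than one crew is flagged as a
# candidate trend for HED review. Data values are illustrative only.
from collections import defaultdict

# (crew, trial, issue_code) tuples -- all hypothetical
observations = [
    ("A", 1, "nav-depth"), ("A", 2, "alarm-salience"),
    ("B", 1, "nav-depth"), ("B", 2, "nav-depth"),
    ("C", 1, "label-font"),
]

trials_by_issue = defaultdict(set)
crews_by_issue = defaultdict(set)
for crew, trial, issue in observations:
    trials_by_issue[issue].add(trial)
    crews_by_issue[issue].add(crew)

# Flag issues observed across multiple trials or multiple crews.
trends = sorted(i for i in trials_by_issue
                if len(trials_by_issue[i]) > 1 or len(crews_by_issue[i]) > 1)
assert trends == ["nav-depth"]
```

The sketch only shows the cross-trial comparison mechanic; the actual binning, coding, and SME collaboration described in the response involve expert judgment that no simple script captures.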

Open Item 18-11: The staff will verify the applicant used the data analysis methods discussed in the response to RAI 9399, Question 18-35 upon receipt of the V&V RSR.

Measurement (Criterion 11.4.3.7(3))

Criterion 11.4.3.7(3) states that the applicant should evaluate the degree of convergence between related measures.

DCA Part 2 Tier 2, Section 18.10.2.3.7, Data Analysis and Human Engineering Discrepancy Identification, states:

Assessments attained by different means, which are intended to measure same or similar performance measures, are compared. When differing conclusions are reached, more detailed cause analysis is performed, including the review of simulator logs, video and audio tapes, if necessary. Measuring convergence may be necessary for a single team and single scenario or for multiple teams and across several scenarios depending on the performance measure.

Therefore, the staff concludes the applicant's data analysis methods conform to Criterion 11.4.3.7(3).

Open Item 18-12: Upon receipt of the V&V RSR, the staff will confirm that the applicant has evaluated the degree of convergence among related measures.

18.10.4.4.2.3.2 Interpretation (Criterion 11.4.3.7(4))

Criterion 11.4.3.7(4) states that the applicant should allow a margin of error when interpreting test results.

DCA Part 2 Tier 2, Section 18.10.2.3.7, states, "Expert judgment is employed to infer a margin of error from the observed performance or data analysis. This allows for the possibility that actual performance may be slightly more variable than ISV test results." The staff issued RAI 9399, Question 18-35 (ADAMS Accession No. ML18082B396), asking the applicant to (1) identify the qualifications of the personnel who will provide the expert judgment and (2) discuss the process by which the expert judgment is derived (e.g., what information is considered) and how it is used in interpreting test results. In the response to RAI 9399, Question 18-35 (ADAMS Accession No. ML18137A584), the applicant stated, "The 'HFE design team' will be expected to use expert judgment, specifically the Operators and Human Factor Engineers who administer the ISV exams, provide observations, and analyze the resulting data." The HFE PMP, Table 3-1, lists the qualifications of the HFE design team. The applicant also stated that the HFE and operations personnel will compare the actual performance with the expected performance as documented in the respective ISV scenario guide, which is validated during the pilot test. The applicant also described that tasks identified in the scenario guide evaluation criteria that directly support mitigating core damage or large radiological releases are completed with a time completion ratio less than or equal to 0.75, thus leaving a margin of 25 percent to account for variability in performance.
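The 0.75 time completion ratio described above is simple arithmetic; a short sketch makes the margin concrete. The function names and task times below are hypothetical illustrations, not values from the application:

```python
# Illustrative sketch of the time-completion-ratio criterion described
# above: observed completion time divided by available time must not
# exceed 0.75, leaving a 25 percent margin for performance variability.
# All numbers here are hypothetical.

TIME_RATIO_LIMIT = 0.75

def completion_ratio(observed_s: float, available_s: float) -> float:
    """Ratio of observed completion time to the time available for the task."""
    if available_s <= 0:
        raise ValueError("available time must be positive")
    return observed_s / available_s

def within_margin(observed_s: float, available_s: float) -> bool:
    """True if the task satisfies the 0.75 completion-ratio criterion."""
    return completion_ratio(observed_s, available_s) <= TIME_RATIO_LIMIT

# A task completed in 30 min of 60 min available (ratio 0.5) passes;
# one completed in 50 min of 60 min (ratio ~0.83) exceeds the limit.
assert within_margin(30 * 60, 60 * 60)
assert not within_margin(50 * 60, 60 * 60)
```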

The staff finds the methodology is consistent with Criterion 11.4.3.7(4).

Open Item 18-13: Upon receipt of the V&V RSR, the staff will confirm that the results are interpreted allowing for a margin of error.

18.10.4.4.2.3.3 Verification (Criterion 11.4.3.7(5))

Criterion 11.4.3.7(5) states that the applicant should verify the correctness of the data analyses using individuals or groups other than those who performed the original analysis.

In DCA Part 2 Tier 2, Section 18.10.2.3.7, the applicant stated the following:

Integrated system validation data analysis is reviewed to verify the correctness of the analyses of the data. Data and data-analysis tools (e.g., equations, measures, spreadsheets, expert opinions, resulting HEDs) are documented and available for review and subsequent audit and application during HFE program elements design integration or human performance monitoring.

The staff issued RAI 9399, Question 18-35 (ADAMS Accession No. ML18082B396), to request that the applicant clarify the identity of the individual(s) or group(s) who will carry out this verification and how they are independent from those who conducted the original analysis. In the response to RAI 9399, Question 18-35 (ADAMS Accession No. ML18137A584), the applicant stated, "The completed analysis will be compiled into an ISV test report. The test report will be reviewed by at least one peer from the observation group that was not directly involved in the original data analysis. The reviewer(s) will not be segregated or otherwise separate from the observation group other than they will not be involved in the initial analysis of the data. The test report will be reviewed and approved by a manager that was not directly involved with the original analysis of the data." Therefore, the staff concludes the applicant's method includes verification of the correctness of the data analysis, which is consistent with Criterion 11.4.3.7(5).

Open Item 18-14: Upon receipt of the V&V RSR, the staff will confirm that the applicant verified the correctness of the analyses of the data.

18.10.4.4.2.3.4 Human Engineering Deficiencies (Criteria 11.4.3.7(6)-(7))

Criteria 11.4.3.7(6)-(7) state that the applicant should (1) identify HEDs when the observed performance does not meet the performance criteria and (2) resolve HEDs identified by pass/fail measures before the design is accepted.

The V&V IP, Section 5.1, HED Design Solution Implementation, states that the HFE design team performs an analysis to categorize HEDs into one of three categories (Priority 1, 2, or 3).

Priority 1 HEDs have a potential direct or indirect impact on plant safety and are resolved before ISV testing is considered complete, which occurs before the design is concluded to be validated and therefore acceptable. HEDs initiated as a result of a measure not being met (i.e., pass/fail performance measures) are Priority 1 HEDs.

The staff finds the applicants methodology for Criteria 11.4.3.7(6)-(7) acceptable.

Open Item 18-15: The staff will verify the applicant identified HEDs as described in the V&V IP when the applicant submits the V&V RSR.

18.10.4.4.2.4 Validation Conclusions (Criteria 11.4.8(1)-(2))

NUREG-0711, Section 11.4.8, Validation Conclusions, includes two criteria for this topic.

Criteria 11.4.8(1)-(2) state that the applicant should (1) document the statistical and logical bases for determining that performance of the integrated system is and will be acceptable and (2) document the limitations in the validation tests, their possible effects on the conclusions of the validation, and their impact on implementing the design.

The V&V IP, Section 4.8, Validation Conclusions, states that the V&V RSR will include the following:

  • the statistical and logical bases for determining that performance of the integrated system is acceptable
  • the limitations of the validation tests, identifying their possible effects on validation conclusions and ensuring that the impact on the design integration HFE program element is considered, including the following:

- aspects of the tests not well controlled

- potential differences between the test situation and actual operations, such as the absence of productivity-safety conflicts

- differences between test platform design and the as-built NuScale plant

Although the staff understands that the applicant plans to include the full bases and limitations in the V&V RSR, the staff's review of the V&V IP identified a concern with the methodology that could impact the applicant's ability to satisfy Criterion 11.4.8(1) (i.e., demonstrating that performance of the integrated system is and will be acceptable).

Specifically, the V&V IP, Section 4.6.1, states that a minimum of two operating crews will perform each scenario. NUREG/CR-6393 contains the following guidance about the sample size used for the ISV test:

The objective of validation is to provide evidence that the integrated system [(i.e., software, hardware, and personnel elements)] adequately supports plant personnel in the safe operation of the plant; i.e., that the integrated design remains within acceptable performance envelopes. To accomplish this objective, the methodology must permit a logical and defensible inference to be made from validation tests to predicted integrated system performance under actual operating conditions.

As a general rule, the larger the sample size (number of participating crews), the more confidence can be placed in generalizing the observed test performance to actual performance. Low sample sizes make it difficult to examine the effects of human variability. However, it should be recognized that there is a significant tradeoff between sample size and the difficulty, time, and cost of the validation program. Since human and integrated system variability is important to the generalization process, methods should be employed to ensure its adequate estimation.

The actual sample size is difficult to specify precisely because it depends on several factors... The less sensitive the integrated system performance is to human performance, the less that variation needs to be assessed and the lower the needed sample size.

The staff was concerned that a minimum of two trials for each ISV scenario does not provide (1) enough opportunities for users of the integrated system to identify problems with the HFE design or (2) reasonable assurance that results from the ISV test will be indicative of the ability of the integrated system to support safe plant operation. As stated in NUREG/CR-6393, "The less sensitive the integrated system performance is to human performance, the less that variation needs to be assessed and the lower the needed sample size." The staff did not find that the applicant had justified that the integrated system performance is sufficiently insensitive to human performance (specifically, human performance errors) to support a minimum of two trials per scenario.
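The sample-size concern can be made concrete with an elementary probability sketch. The per-trial detection probability used below is purely hypothetical and is not a value from NUREG/CR-6393 or the application; the point is only that detection confidence grows slowly with few trials:

```python
# Illustrative only: probability that at least one of n independent crew
# trials surfaces a given HFE design problem, assuming each trial detects
# it with probability p. The value of p is hypothetical.

def detection_probability(p: float, n_trials: int) -> float:
    """P(problem detected at least once) = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n_trials

# With a hypothetical p = 0.5 per trial, two trials detect a problem
# 75% of the time; five trials raise that to about 97%.
assert abs(detection_probability(0.5, 2) - 0.75) < 1e-9
assert detection_probability(0.5, 5) > 0.96
```

This independence assumption is itself a simplification; correlated crew behavior would change the numbers, which is why the staff's review focuses on the design's sensitivity to human error rather than on a statistical threshold alone.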

Therefore, the staff issued RAI 8758, Question 18-1 (ADAMS Accession No. ML17191A215), requesting the applicant to describe the bases for determining that performance of the integrated system using a minimum of two operating crews per scenario will be acceptable.

In the responses to RAI 8758, Question 18-1 (ADAMS Accession Nos. ML17322A051 and ML18134A353), the applicant identified design features that limit the likelihood and consequences of safety-significant errors of omission (i.e., one in which an operator action is required to be taken and the operator fails to take that action) and commission (i.e., one in which the operator takes an erroneous action when no action is needed or a different action is needed) as justification for using a minimum of two operating crews per scenario. The staff's assessment of the applicant's justification is provided below.

18.10.4.4.2.4.1 Errors of Omission

With respect to errors of omission, the applicant discussed the results of the deterministic analyses presented in DCA Part 2 Tier 2, Chapter 15 and Chapter 7 and the results of the PRAs discussed in DCA Part 2 Tier 2, Chapter 19. In the response to RAI 8758, Question 18-1 (ADAMS Accession No. ML17322A051), the applicant stated, "NuScale has a uniquely safe design with only a few simple, passive safety systems, few support systems, and no reliance on AC [alternating current] or DC [direct current] power for mitigating design basis events (DBEs)."

As a result of NuScale's simple and passive design, no operator actions are required for DBE mitigation. DCA Part 2 Tier 2, Section 15.0.0.6.4, Required Operator Actions, states the following:

There are no operator actions credited in the evaluation of NuScale DBEs. After a DBE, automated actions place the NPM in a safe-state and it remains in the safe-state condition for at least 72 hours without operator action, even with assumed failures.

DCA Part 2 Tier 2, Section 15.0.0.5, Limiting Single Failures, also states the following:


Operator actions allowed by procedure make the consequences less severe.

Failure to take one of these actions cannot make the consequences worse than the bounding Chapter 15 analysis.

Additionally, the analyses described in DCA Part 2 Tier 2, Chapter 7 do not credit any operator actions to mitigate DBEs following a common-cause failure of the digital I&C protection system.

Also, DCA Part 2 Tier 2, Section 7.1.1.2.2, Post-Accident Monitoring, states that no operator actions are credited for ATWS and SBO scenarios, which are BDBEs. In large light-water plant designs, by comparison, ATWS event mitigation requires operator actions to be taken within minutes to trip the reactor and place it in a safe state, and SBO event mitigation requires operators to take action within hours of the event to restore power to the site and ensure the availability of those plant systems that are needed for long-term core cooling. In the NuScale design, operators do not need to trip the reactor following an ATWS event to meet ATWS acceptance criteria, and no operator actions are required until 72 hours after an SBO.

Given that the analysis of DBEs does not credit any operator actions, the staff concludes that there are no errors of omission for operator actions that are relied upon to mitigate the consequences of DBEs. Although plant procedures direct operators to take some actions during these DBEs, if operators fail to take any of these actions, the consequences cannot be any more severe than those described in DCA Part 2 Tier 2, Chapter 15. Because no operator actions are required to be performed to meet the acceptance criteria for the events analyzed in DCA Part 2 Tier 2, Chapter 15, the staff concludes the design is not sensitive to any errors of omission during DBEs, ATWS, and SBO.

The applicant also discussed the impact of errors of omission for BDBEs analyzed using probabilistic methods as discussed in DCA Part 2 Tier 2, Chapter 19. In the response to RAI 8758, Question 18-1 (ADAMS Accession No. ML17322A051), the applicant stated the following:

The PRA evaluated operating sequences that could lead to core damage. All sequences that lead to core damage have very low frequencies (less than 1E-10 per module critical year), and involved BDBEs. For the sequences that led to core damage, one of the following three conditions must occur: (Reference TR-1117-57216 NuScale Generic Technical Guidelines):

1. A malfunction of the ECCS to actuate as designed. For example, when the ECCS vent valves open but both of the ECCS recirculation valves do not, inhibiting water from reentering the core.
2. An isolable loss of coolant accident outside of the containment vessel with a failure to add makeup coolant. In this case, sufficient RCS inventory may be lost leading to core uncovery.
3. A situation where both trains of decay heat removal have failed in a manner to not remove RCS heat, and both of the RCS ASME Code safety relief valves do not open.

The PRA identified seven BDBE human actions. Six of the actions that could be taken to mitigate the events occur in the Main Control Room and two of these actions are considered IHAs [important human actions]. Since there are only six PRA identified actions that are performed in the control room, NuScale plans

to sample the performance on all of those tasks to ensure there is confidence in the results of the testing. In lieu of performing each designed scenario three times, the scenarios have been designed such that the PRA actions are each sampled in at least two separate scenarios and the crews are sequenced such that all three crews perform each of the six PRA actions at least once.
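The sampling scheme quoted above (each PRA action exercised in at least two scenarios, with all three crews performing each action at least once) is a coverage property that can be checked mechanically. The scenario and crew assignments below are hypothetical illustrations, not the actual ISV scenario design:

```python
# Hypothetical coverage check for the sampling scheme described above.
# All scenario/action/crew assignments are illustrative only.

pra_actions = {f"HA{i}" for i in range(1, 7)}  # six in-control-room actions

# scenario -> PRA actions exercised in it (hypothetical)
scenarios = {
    "S1": {"HA1", "HA2"}, "S2": {"HA2", "HA3"}, "S3": {"HA3", "HA4"},
    "S4": {"HA4", "HA5"}, "S5": {"HA5", "HA6"}, "S6": {"HA6", "HA1"},
}
# crew -> scenarios that crew runs (hypothetical sequencing)
crews = {
    "Crew1": {"S1", "S2", "S3", "S4", "S5", "S6"},
    "Crew2": {"S1", "S2", "S3", "S4", "S5", "S6"},
    "Crew3": {"S1", "S2", "S3", "S4", "S5", "S6"},
}

# Every PRA action is sampled in at least two separate scenarios.
for a in pra_actions:
    assert sum(a in acts for acts in scenarios.values()) >= 2
# Every crew performs every PRA action at least once.
for runs in crews.values():
    covered = set().union(*(scenarios[s] for s in runs))
    assert covered == pra_actions
```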

Despite uncertainties in the applicant's PRA as a result of incomplete design detail and lack of operating experience, the staff acknowledges that the applicant's PRA identified fewer HAs relative to other certified designs and operating reactors. When compared to other certified designs and operating reactors, the PRA also identified fewer IHAs. As such, the staff concludes that the low number of HAs identified in the PRA shows a limited reliance on operator actions to mitigate the consequences of BDBEs. As discussed in DCA Part 2 Tier 2, Section 18.6.3, Results, and DCA Part 2 Tier 2, Section 18.5.3, Results, the IHAs were included in the SPV, and the operating crews satisfactorily performed them. The staff observed some SPV testing as discussed in the audit report dated November 30, 2016 (ADAMS Accession No. ML16259A110) and noted that the tasks operators perform to accomplish the IHAs are not complex actions but rather are directed by plant procedures in a few steps. The HFE design also includes alarms and indications to notify the operators of these events requiring HAs to be performed.

Additionally, each of the crews will also perform these IHAs and the other HAs during the ISV. Satisfactory results must be obtained or compensatory actions must be taken if the crews cannot perform these actions within the time required. Therefore, the staff concludes that the proposed testing of these HAs and IHAs in the ISV by [[ ]], in addition to the successful demonstration of the IHAs during the SPV, is an acceptable approach to demonstrate the feasibility and reliability of the IHAs and HAs.

18.10.4.4.2.4.2 Errors of Commission

With respect to errors of commission, because up to 12 units can be operated from a single control room, and up to 12 units can be operated from a single operator console, there is a relatively higher probability of operators taking an action intended for one unit on the wrong unit as compared to other certified designs. In the response to RAI 8758, Question 18-1 (ADAMS Accession No. ML18134A353), the applicant discussed how the MCR, MPS, and HSI designs help to limit the probability and consequences of safety-significant errors of commission. DCA Part 2 Tier 2, Section 7.0.4.1, Module Protection System, describes the purpose of the MPS:

The primary purpose of the MPS is to monitor process variables and provide automatic initiating signals in response to out-of-normal conditions, providing protection against unsafe NPM operation during steady state and transient power operation. Each NPM has a single dedicated MPS. The two major functions that the MPS performs are:

  • monitors plant variables and trips the reactor when specified setpoints, which are based on the plant safety analysis analytical limits described in Chapter 15, are reached or exceeded during anticipated operational occurrences.
  • monitors plant variables and actuates engineered safety features actuation system (ESFAS) equipment when specified setpoints, which are based on the plant safety analysis analytical limits described in Chapter 15, are reached or exceeded during anticipated operational occurrences.

Actuation of ESFAS equipment prevents or mitigates damage to the reactor core and reactor coolant system components and ensures containment integrity.

In the response to RAI 8758, Question 18-1 (ADAMS Accession No. ML18134A353), the applicant stated the following:

Control room operators cannot manipulate safety-related SSCs [structures, systems, and components] except through the use of the module protection system (MPS) hard-wired manual actuation switches located at the standup panel for each unit. Operation of any of these switches is an infrequent operation directed by procedure and normally requires a peer-check prior to operation. Operation of these switches is also expected to receive supervisory oversight and because of their physical location, operation of these safety-related switches is conspicuous to the operating crew.

The MPS cannot be overridden by an operator either before or after initiation, with the exception of containment isolation override to support either adding inventory to the reactor vessel using the chemical and volume control system (CVCS) or to containment using the containment flooding and drain system (CFDS). Once an MPS setpoint is reached, the associated safety related SSCs will transition to their single safety position. The containment isolation override function is only required during highly improbable beyond-design basis events which are addressed in Chapter 19 and is beyond the scope of this response, which is for Chapter 15 events only. The containment isolation override function requires multiple deliberate steps which are directed by procedures. The Conduct of Operations and generally accepted industry standards on human performance and use of error reduction tools ensure that a peer check and proper supervisory oversight would be provided to complete this Important Human Action. To accidentally perform this action in error or to complete this action on the wrong unit is not deemed credible.

DCA Part 2 Tier 2, Section 7.1.1.2.1, Protection Systems, provides more detail as to when and how safety-related structures, systems, and components are operated and states, When allowed by plant procedures to reconfigure systems after a reactor trip or an ESF actuation, the components can be repositioned using the nonsafety-related MCS when the enable nonsafety control switch is activated and no automatic or manual safety actuation signal is present. The HSI Design RSR states that the safety-related Enable Nonsafety Control Switch is a [[ ]]. DCA Part 2 Tier 2, Section 7.1.5.1.6, Guideline 6, Postulated Common Cause Failure of Blocks, explains that control of safety-related components using non-Class 1E controls (i.e., the MCS) can only be enabled by the operator using the enable nonsafety control switch; otherwise, the non-Class 1E signals to the actuation priority logic are ignored. DCA Part 2 Tier 2, Section 7.2.12.2 states the following:

If enabled by the operator using the safety-related enable nonsafety control switch, the capability for manual component level control of ESF equipment is possible using nonsafety discrete hard-wired inputs from the MCS to the HWM.

These signals are then input to the actuation priority logic circuit on the EIM. Any automatic or manual safety related signal will override the nonsafety signal and is prioritized within the actuation priority logic.

As stated in the response to RAI 8758, Question 18-1 (ADAMS Accession No. ML18134A353), situations in which the operators need to operate the safety-related structures, systems, and components are expected to occur infrequently and also are directed by procedures. These components cannot be operated from the sitdown or standup workstations without first taking the correct Enable Nonsafety Control Switch for that module to the correct position, and they also cannot be operated if an ESF actuation signal is present. Therefore, they cannot be operated by the operator if the MPS determines the valve needs to be in its safety state.

Although it is possible that the operator might select the incorrect Enable Nonsafety Control Switch (e.g., he or she might operate the Enable Nonsafety Control Switch for Unit 9 instead of Unit 6), the operator would have to also make a subsequent error during operation of the actual components to actually reposition any equipment. If that did occur, and if the consequences of that action were significant enough to actuate an ESF signal for that unit, or if an ESF actuation setpoint was reached on that unit for other reasons, then the MPS would actuate the necessary ESF signal, and the components would automatically go to the position required for the ESF actuation regardless of what position the operator had selected using the MCS.

Additionally, although it is possible that the operator could operate the wrong ESF valve or valves following operation of the correct Enable Nonsafety Control Switch for a particular unit (e.g., he or she might accidentally select an ECCS valve to operate rather than a Decay Heat Removal System valve), the same protection applies: if the consequences of that action were significant enough to actuate an ESF signal for that unit, or if an ESF actuation setpoint was reached on that unit for other reasons, then the MPS would actuate the ESF signal, and the components would transition to the position required for the ESF actuation regardless of the position the operator had selected using the MCS.

Because the operator must first enable the Enable Nonsafety Control Switch for a particular unit to operate the components that are relied upon to perform safety functions for that unit, and because safety signals from the MPS to those components are prioritized such that actions taken by the operator are overridden when safety setpoints are reached, the staff concludes that the design is relatively insensitive to possible errors of commission related to the operation of safety-related components.

Therefore, the staff concludes that the design features of the NuScale plant and HSI design help to reduce the sensitivity of the integrated system to human performance errors and thus provide justification for the applicant's proposed number of trials.

Open Item 18-16: The staff will review the V&V RSR to confirm that the applicant documented (1) the statistical and logical bases for determining that performance of the integrated system is and will be acceptable and (2) the limitations in the validation tests, their possible effects on the conclusions of the validation, and their impact on implementing the design.

Human Engineering Discrepancy Resolution Review (Criteria 11.4.4(1)-(5))

NUREG-0711, Section 11.4.4, includes five criteria for this topic.

18.10.4.5.1 Human Engineering Discrepancy Analysis (Criterion 11.4.4(1))

The V&V IP, Section 5.2, Human Engineering Discrepancy Analysis, describes the applicant's analysis of HEDs. The staff compared this information to Criterion 11.4.4(1), which lists items that should be included in HED analyses. The staff found that the information in the V&V IP, Section 5.2, includes all of the items listed in Criterion 11.4.4(1). Specifically, in the V&V IP, Section 5.2, the applicant stated the following:

HFE V&V HEDs are categorized based on their principal impact on:

  • personnel tasks and functions
  • plant systems
  • human-system interface feature
  • individual HSI component
  • operating procedure

Extent of condition and causal effect across the various HSI design features and functions are assessed as part of the HED process. Extent of condition determination considers:

  • cumulative or combined effects of multiple HEDs
  • human engineering discrepancies that may represent a broader issue

The staff finds the described analysis approach acceptable.

Open Item 18-17: The staff will verify the applicant's implementation of the above practices upon receipt of the V&V RSR.

18.10.4.5.2 Selection of Human Engineering Discrepancies to Correct (Criterion 11.4.4(2))

Criterion 11.4.4(2) states that the applicant should conduct an evaluation to identify which HEDs to correct, correct those with direct safety consequences or potential safety impact (unless the applicant justifies leaving the condition as is), and correct HEDs that may adversely impact personnel performance in a way that has potential consequences to plant performance; operability of structures, systems, or components; and personnel performance or efficiency.

As discussed in Section 18.1.4.4 of this SER, the V&V IP, Section 5.1, states that the HFE design team performs an analysis to categorize HEDs as Priority 1, 2, or 3. Priority 1 HEDs are those with direct safety consequences or potential safety impact, and Priority 2 HEDs are those with potential consequences to plant performance and operability. Section 18.1.4.4 of this SER explains that the applicant intends to correct these HEDs and identifies when they will be corrected.

The staff finds that the applicant's method is acceptable because the applicant will conduct an evaluation to identify those HEDs that need to be corrected based on whether they could impact plant safety or plant performance, which is consistent with Criterion 11.4.4(2).

Open Item 18-18: The staff will verify the implementation of the above practices upon receipt of the V&V RSR.

18.10.4.5.3 Development of Design Solutions (Criterion 11.4.4(3))

Criterion 11.4.4(3) states that the applicant should identify design solutions to correct HEDs and, as part of the design solution, should evaluate the interrelationships of individual HEDs.

The V&V IP, Section 5.1, explains that after an HED has been prioritized, it is routed to the HFE design team, the simulator review board, or both, as appropriate, for resolution. The V&V IP, Figure 5-1, Human engineering discrepancy resolution process, illustrates the process the applicant will use to identify, review, implement, and verify design solutions. The V&V IP, Section 5.2, explains that the applicant will assess the cumulative or combined effects of multiple HEDs (i.e., the interrelationships of individual HEDs) as part of the HED resolution process.

The staff finds the applicant's design solution approach acceptable because the applicant will identify design solutions to correct HEDs and also evaluate the interrelationships of individual HEDs, which is consistent with Criterion 11.4.4(3).

Open Item 18-19: The staff will verify the implementation of the above practices through a review of the V&V RSR.

18.10.4.5.4 Design Solution Evaluation (Criterion 11.4.4(4))

Criterion 11.4.4(4) states that the applicant should evaluate design solutions to demonstrate the resolution of HEDs and to ensure that new HEDs are not introduced. Criterion 11.4.4(4) also says that generally, the evaluation should use the V&V method that originally detected the HED.

The staff issued RAI 9394, Question 18-17 (ADAMS Accession No. ML18068A733), requesting that the applicant explain how (i.e., the method by which) it will evaluate design solutions to ensure that HEDs are resolved and no new HEDs are introduced. In the responses to RAI 9394, Question 18-17 (ADAMS Accession Nos. ML18123A540 and ML18239A250), the applicant stated that it will revise the V&V IP to include the following information:

Generally, design solutions will be verified to be acceptable using the same V&V method that originally detected the issue. For example, if an HED-1 is identified during performance of an ISV scenario, a similar scenario would be run to verify the solution was acceptable. Because the impact of design solutions varies widely, this general practice may be adjusted using engineering judgment to ensure a thorough and appropriate test is conducted.

The following elements are considered when making this judgment:

  • number of procedures affected
  • number of HSIs affected
  • complexity of the condition under which the design solution is used
  • uniqueness of the design solution

Because the applicant plans to verify that design solutions are acceptable by using the same method that originally detected the issue, the applicant's method is consistent with Criterion 11.4.4(4). The staff also understands that in some cases, it may not be necessary to verify a design solution by using the same method that originally detected the issue, and the applicant has identified appropriate factors to consider when making this decision. RAI 9394, Question 18-17, is a Confirmatory Item.

Open Item 18-20: The staff will confirm that the applicant evaluated design solutions using the method described in the V&V IP when the applicant submits the V&V RSR.

18.10.4.5.5 HED Evaluation Documentation (Criterion 11.4.4(5))

The staff reviewed the V&V IP, Section 5.1, which discusses the applicant's documentation of the evaluation of HEDs. The staff compared this information to Criterion 11.4.4(5), which states that the applicant should document each HED, including the following:

  • the basis for not correcting an HED
  • related personnel tasks and functions
  • related plant systems
  • cumulative effects of HEDs
  • HEDs as indications of broader issues

In the V&V IP, Section 5.0, the applicant explained that HFE issues and HEDs are documented and tracked in the HFEITS database. If an HED is not resolved, the basis for the decision to accept the HED without change in the integrated design is documented. That basis may be accepted HFE practices, current published HFE literature, trade-off studies, tests, or engineering evaluations.

In the V&V IP, Section 5.2, the applicant described how HEDs are categorized based on their impact on the following:

  • personnel tasks and functions
  • plant systems
  • HSI feature
  • individual HSI component
  • operating procedure

In addition, the applicant described an extent of condition analysis for HEDs that considers cumulative or combined effects of multiple HEDs and HEDs that may represent a broader issue.

The staff understands that the analysis of each HED considers the bulleted information in this criterion; however, with the exception of the basis for not correcting an HED, it was not clear to the staff whether this information is documented. The staff issued RAI 9394, Question 18-18 (ADAMS Accession No. ML18068A733), requesting that the applicant clarify the specific information documented for each HED.

In the response to RAI 9394, Question 18-18 (ADAMS Accession No. ML18123A540), the applicant stated the following:

Human engineering discrepancies (HEDs) are documented in the human factors engineering issues tracking system (HFEITS) database. A working level HFEITS procedure is used by the NuScale HFE staff to ensure the applicable information related to the five bullets is documented in the HFEITS database. The Human Factors Verification and Validation Implementation Plan (RP-0914-8543) has been revised to reflect this documentation.

Because the applicant plans to document all of the information listed in Criterion 11.4.4(5), the staff finds the applicant's method acceptable. RAI 9394, Question 18-18, is a Confirmatory Item pending the completion of the changes to the V&V IP.

Open Item 18-21: The staff will confirm that HED tracking conforms to Criterion 11.4.4(5) by review of the V&V RSR.

Combined License Information Items

No COL information items are associated with NuScale DCA Part 2 Tier 2, Section 18.10.

Conclusion

The staff cannot reach a conclusion until after the staff has received and reviewed the V&V RSR.

18.11 Design Implementation

Introduction

The objective of the staff's review is to ensure that the applicant's as-built design conforms to the verified and validated design that resulted from the HFE design process.

Summary of Application

DCA Part 2 Tier 1: Refer to Section 18.1.2 of this SER.

DCA Part 2 Tier 2: The applicant provided a description of this HFE element in DCA Part 2 Tier 2, Section 18.11, Design Implementation.

ITAAC: The ITAAC associated with this element are in Tier 1, Section 3.15, Table 3.15-1. RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), was issued to address deficiencies with the ITAAC. RAI 9415 is currently unresolved. Changes to the ITAAC in Table 3.15-1 are expected to be necessary to resolve RAI 9415.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: Refer to Section 18.1.2 of this SER.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • 10 CFR 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 52.47(b)(1), which requires that a DC application include the proposed ITAAC that are necessary and sufficient to provide reasonable assurance that, if the inspections, tests, and analyses are performed and the acceptance criteria met, a facility that incorporates the DC has been constructed and will operate in accordance with the DC, the provisions of the Atomic Energy Act of 1954, as amended (AEA), and the NRC's rules and regulations.
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)
  • NUREG-0711, Revision 3, Chapter 12, Section 12.4, Review Criteria

Technical Evaluation

NUREG-0711, Section 12.3, Applicant Products and Submittals, states the following:

If the applicant submits an IP, it should describe the methodology for conducting design implementation. The NRC will review it using the criteria in Section 12.4 below.

The staff used the criteria in NUREG-0711, Section 12.4, to evaluate the applicant's DI IP in order to verify the following:

  • The as-built design conforms to the verified and validated design resulting from the HFE design process.
  • The implementation of plant changes considers the effect on personnel performance and provides the necessary support to give reasonable assurance of safe operations.

Unlike most other NUREG-0711 elements that have RSRs as part of the DCA, NuScale has submitted an IP for the design implementation element. This is because it is not possible to submit the RSR for this element until the plant is constructed and the as-built MCR can be evaluated to verify that it conforms to the validated design. As such, the staff expects that an ITAAC will be necessary to complete this element; however, the ITAAC currently described in Chapter 18 of the Tier 1 submittal is insufficient to cover the full scope of design implementation activities. RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), addresses this concern as well as other issues with the design implementation IP. The NRC received a response to RAI 9415, Question 18-46 (ADAMS Accession No. ML18172A318), but the response did not fully address the issues identified with the DI IP.

On August 7 and 21, 2018, two public meetings were held to discuss NuScale's strategy for DI (ADAMS Accession No. ML18235A137). A third public meeting was held on October 17, 2018 (ADAMS Accession No. ML18304A258). NuScale has proposed a method for DI in a draft supplemental RAI response that uses an ITAAC and other existing regulatory controls to ensure that a COL applicant adequately performs DI activities. The strategy does not include the submittal of an RSR as described in NUREG-0711, Section 12.3, Applicant Submittals. The staff is currently assessing the proposal. Resolution of this issue is critical to ensuring that the activities described in the DI IP are performed appropriately by a COL applicant and that there will be a sufficient means for the NRC to perform oversight of this work. This is Open Item 18-22, and it relates to multiple criteria in this section. The staff is engaging the applicant to agree upon an appropriate ITAAC and an appropriate strategy for meeting the intended purpose of the RSR.

NUREG-0711, Section 12.4.1, Final HFE Design Verification for New Plants and Control Room Modifications, includes four applicable review criteria for this topic. The review criteria in Section 12.4.2, Additional Considerations for Reviewing the HFE Aspects of Control Room Modifications, of NUREG-0711 apply only to plant modifications and are therefore not applicable to this DC review.

Evaluation of Aspects of the Design Not Addressed in Verification and Validation (Criterion 12.4(1))

18.11.4.2.1 Summary of Application

The DI IP, Section 1.2, Scope, indicates that activities that were not evaluated in the V&V testing but that are part of the HFE program are within the scope of the design implementation.

This section also commits to using an appropriate V&V method.

The DI IP, Section 2.0, Design Implementation Assessments, identifies several specific methods that will be used during the design implementation process to ensure that the software, hardware, and facility configurations match the appropriate design drawings and specifications.

The DI IP, Section 2.0, also indicates that the HED process will be used if a nonconformance is identified. An HFE evaluation will be conducted to determine whether the deviation impacts the results of the ISV. If the nonconformance has no impact on the ISV results, then a specific validation method is used to confirm adequate results (such as tabletop walkthrough, mockup, part-task simulator, or plant walkdown). If the nonconformance does have an impact on the ISV results, then the applicable portion of the ISV will be repeated.

18.11.4.2.2 Staff Assessment

The DI IP, Section 1.2, indicates that the scope of design implementation activities includes those activities that were not previously included in V&V testing, consistent with the criterion.

The DI IP, Section 2.0, provides guidance regarding the methods to be used to validate any activities that were not validated during the ISV. Section 3.0 of the DI IP indicates that all HEDs that can affect safety will be addressed during the DI process. The DI IP allows flexibility in selecting an appropriate method for validating those HEDs for which an analysis has confirmed that the ISV results will not change (such as by using walkdown and subject matter expert review). If an HED may change the results of the ISV, a more controlled approach is proposed, which includes rerunning the applicable portion of the ISV to confirm that the results remain valid.

18.11.4.2.3 Conclusion

The staff finds that the applicant proposed a methodology that covers the appropriate scope, including aspects of the design that were not addressed in the V&V. The methods above are common methods for validating actions and have an increased focus on those deviations that may affect the ISV results. Therefore, the staff finds this treatment to be acceptable to meet this criterion.

Comparison of Final Products to Planned Design and Identification of Discrepancies (Criterion 12.4(2))

18.11.4.3.1 Summary of Application

The DI IP, Section 1.2, indicates that the facility configuration of the as-built plant will be evaluated to ensure that it matches the facility that was simulated during the ISV. This includes the final HSIs, procedures, and training used in the MCR and certain LCSs that are evaluated as part of the design implementation process.

The DI IP, Section 2.0, indicates that the HED process will be used if nonconformances are identified. The V&V RSR gives a full description of the HED process.

18.11.4.3.2 Staff Assessment

The DI IP, Section 1.2, describes a process that compares the as-built plant to the design as verified during the V&V activities. This basis for comparison is inappropriate because the as-built plant should ultimately reflect not only the design as verified during V&V activities but also any changes to the design that were analyzed during design implementation activities to ensure that they have not introduced new human performance concerns. The current documentation does not cover this entire scope; therefore, the staff issued RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), to address this issue. The response to RAI 9415 was problematic; therefore, NuScale submitted a draft supplement, which is described in Section 18.11.4.1 above.

A series of public meetings has been held to discuss the strategy. Significant progress has been made toward resolving the issues described in RAI 9415; however, the RAI is still unresolved at this time (see Open Item 18-22). The staff is continuing to engage with NuScale on this issue.

18.11.4.3.3 Conclusion

The staff finds that the applicant has developed a methodology that includes the review of the final HSIs, procedures, and training within the scope of the DI IP. HEDs will be used to document, track, and resolve or justify any nonconformances. Although the applicant did not provide an in-depth description of the HED process in the DI IP, the HED process is described in the V&V IP and is evaluated under Element 11, Human Factors Verification and Validation, of NUREG-0711. However, RAI 9415 must be resolved before the staff can draw any conclusions about the acceptability of this criterion (see Open Item 18-22).

Verification that All Human Factors Engineering Issues Have Been Addressed (Criterion 12.4(3))

18.11.4.4.1 Summary of Application

The DI IP, Section 3.0, Human Factors Engineering Issues Tracking System Resolution, indicates that HEDs identified before the ISV will be addressed before the ISV begins. Any HEDs identified during the ISV (that require resolution) and those identified after the ISV will be addressed during the design implementation.

18.11.4.4.2 Staff Assessment

The DI IP, Section 3.0, states, "Some HEDs may not be resolved during HFE program activities and may be on-going due to anticipated technology or other advancements; however, all HEDs are closed prior to DI completion."

The staff issued RAI 9415, Question 18-46 (ADAMS Accession No. ML18204A190), to request clarification about how the applicant will track issues to resolution if the issues are closed before DI completion. The response to RAI 9415 (ADAMS Accession No. ML18172A318) indicates that all Priority 1 HEDs (HEDs that may affect safety) will be resolved prior to design certification. This means that only Priority 2 HEDs (HEDs that may affect the efficiency of the plant) or Priority 3 HEDs (HEDs that are neither Priority 1 nor Priority 2) will remain after design certification.

These can be communicated to an eventual licensee for resolution but are not required to be resolved because they are not likely to impact safety. (See NUREG-0711, Section 11.4.4, Human Engineering Discrepancy Resolution Review Criteria, for more information about determining which HEDs must be resolved. The staff's assessment of this treatment is described in Section 18.10, Verification and Validation, above.)

18.11.4.4.3 Conclusion

The staff cannot make a determination about the acceptability of this criterion until the response to RAI 9415, Question 18-46, is resolved (see Open Item 18-22). The staff currently has a draft supplement to RAI 9415, but a final version has not been docketed; therefore, the staff cannot come to a final determination on the acceptability of this treatment.

Description of How the Human Factors Engineering Program Addressed Important Human Actions (Criterion 12.4(4))

18.11.4.5.1 Summary of Application

The DI IP, Section 4.0, Addressing Important Human Actions, indicates that IHAs are addressed during the V&V testing (and through other NUREG-0711 HFE program elements).

The TIHA RSR gives additional details on how IHAs are treated throughout the HFE program.

18.11.4.5.2 Staff Assessment

The review of Criterion 12.4(3) indicates that those IHA issues that were identified before the ISV will be addressed before the ISV testing. If additional IHA issues are identified during or after the ISV process, they will be addressed during the design implementation.

The staff acknowledges that the TIHA RSR describes how IHAs should be treated throughout the human factors program. (NUREG-0711, Element 7, Treatment of Important Human Actions, addresses the treatment of all IHAs identified by NuScale. The NRC review of that area will determine the adequacy of that plan.) Although the staff does not expect that the applicant will identify new IHAs during or after the ISV process, the DI IP indicates that these IHAs can be addressed during the DI process as described, which would test the newly identified IHAs using an appropriate V&V method (see Criterion 12.4(1)).

The response to RAI 9415 (ADAMS Accession No. ML18172A318) provided new information about this criterion (see Section 18.11.4.1 of this report). A supplemental response is needed to resolve RAI 9415 (see Open Item 18-22).

18.11.4.5.3 Conclusion

The response to RAI 9415 proposed changes to the process described above. Therefore, the staff cannot make a determination until the changes can be resolved.

Combined License Information Items

No COL information items are associated with NuScale DCA Part 2 Tier 2, Section 18.11, Design Implementation.

Conclusion

The staff cannot make a determination until the issues addressed in RAI 9415 are resolved.

Questions still remain regarding the scope of the ITAAC and the ability of the ITAAC to provide adequate control of design implementation activities. The approach proposed by NuScale in the series of public meetings is unique and proposes crediting alternate regulatory processes in lieu of a COL applicant submitting an RSR. This approach has not been submitted on the docket; therefore, it is not considered in this SER and should remain an open item (Open Item 18-22) until it can be resolved. The staff is working with NuScale to obtain a docketed RAI response supplement, which can then be considered in the next update to this document.

18.12 Human Performance Monitoring

Introduction

The objective of the staff's review is to assure that the applicant has prepared a human performance monitoring strategy for ensuring that no significant safety degradation occurs because of any changes made in the plant and to verify that the conclusions drawn from the human performance evaluation remain valid over the life of the plant.

Summary of Application

DCA Part 2 Tier 1: There is no Tier 1 information associated with this element.

DCA Part 2 Tier 2: The applicant identified a COL information item that will address this element in DCA Part 2 Tier 2, Section 18.12, Human Performance Monitoring.

ITAAC: There are no ITAAC associated with this element.

TS: There are no TS associated with this element.

Topical Reports: There are no topical reports associated with this element.

Technical Reports: There are no TRs associated with this element.

Regulatory Basis

The following NRC regulations contain the relevant requirements for this review:

  • Title 10 of the Code of Federal Regulations (10 CFR), Section 52.47(a)(8) as it pertains to the information necessary to demonstrate compliance with any technically relevant portions of the Three Mile Island requirements set forth in 10 CFR 50.34(f), except paragraphs (f)(1)(xii), (f)(2)(ix), and (f)(3)(v)
  • 10 CFR 50.34(f)(2)(iii) - Provide, for Commission review, a control room design that reflects state-of-the-art human factor principles prior to committing to the fabrication or revision of fabricated control room panels and layouts

SRP Chapter 18, Section III, Acceptance Criteria, lists the acceptance criteria adequate to meet the above requirements, as well as review interfaces with other SRP sections. Acceptance criteria for HFE design methodology are provided in NUREG-0711 (listed below). (NUREG-0711 references NUREG-0700, Human-System Interface Design Review Guidelines, which provides detailed acceptance criteria for HFE design attributes.)
  • NUREG-0711, Revision 3, Chapter 13, Human Performance Monitoring, Section 13.4, Review Criteria

Technical Evaluation

DCA Part 2 Tier 2, Revision 1, Section 18.12, contains one COL information item pertaining to human performance monitoring. The staff evaluates the acceptability of the COL information item in this SER section. The staff concluded that no additional COL information items were needed.

Combined License Information Items

Table 18.12-1 lists COL information item numbers and descriptions related to human performance monitoring from DCA Part 2 Tier 2, Section 18.12.

Table 18.12-1. NuScale COL Information Items for DCA Part 2 Tier 2, Section 18.12

Item No. | Description | DCA Part 2 Tier 2 Section
18.12-1 | A COL applicant that references the NuScale Power Plant design certification will provide a description of the human performance monitoring program in accordance with applicable NUREG-0711 or equivalent criteria. | 18.12

Conclusion

A COL information item has been identified for this HFE element because a human performance monitoring IP was not provided for it. The staff finds this acceptable because the monitoring of human performance, which includes maintaining personnel skills and ensuring no safety degradation from modifications to the design, starts after the plant becomes operational and is therefore a COL activity.