ML18164A253

Audit Summary for NuScale TR-0516-49417, Evaluation Methodology for Stability Analysis of the NuScale Power Module
Issue date: 06/13/2018
From: Bavol B (NRC/NRO/DNRL/LB1)
Contact: Bavol B / 415-6715
Shared Package: ML18164A252



U.S. NUCLEAR REGULATORY COMMISSION AUDIT SUMMARY REPORT FOR THE REGULATORY AUDIT OF NUSCALE POWER, LLC TOPICAL REPORT TR-0516-49417-P, EVALUATION METHODOLOGY FOR STABILITY ANALYSIS OF THE NUSCALE POWER MODULE

1.0 BACKGROUND

By letters dated July 31, 2016 and December 3, 2016, NuScale Power, LLC (NuScale) submitted for U.S. Nuclear Regulatory Commission (NRC) staff review topical report (TR) TR-0516-49417, Revision 0, Evaluation Methodology for Stability Analysis of the NuScale Power Module (Reference 1), and the subsequent letter, Response to NRC Request for Supplemental Information to TR-0516-49417-P, Revision 0 (Reference 2), in support of the NuScale design certification application, which the NRC accepted for review on March 7, 2017 (Reference 3). The NRC staff initiated a regulatory audit related to the review of the TR on June 8, 2017 (Reference 4). The audit was conducted according to NRC Office Instruction NRO-REG-108, Regulatory Audits (Reference 5). The audit was principally performed via the NuScale electronic reading room (ERR). An on-site meeting at NuScale's Rockville office on August 31, 2017, initiated the quality assurance (QA) portion of the audit, which focused on the QA plan for NuScale's system code, PIM.

The purpose of the regulatory audit of TR-0516-49417 was to: (1) gain a better understanding of NuScale's proposed methodology for stability analysis; (2) verify information; (3) identify information that will require docketing to support the basis of the licensing or regulatory decision; and (4) review related documentation and non-docketed information to evaluate conformance with regulatory guidance. Additional background is available in the audit plan associated with this audit summary (Reference 4).

2.0 REGULATORY AUDIT BASIS

Title 10 of the Code of Federal Regulations (CFR), Section 50.34, Contents of applications; technical information, provides that applicants and licensees shall provide a safety analysis report. The safety analysis reports described under 10 CFR 50.34 provide results from analyses performed using transient and accident analysis methods to demonstrate compliance with specific regulatory criteria. In the case of PIM, the analysis results are intended to demonstrate compliance with General Design Criterion 12, Suppression of Reactor Power Oscillations, of 10 CFR Part 50, Domestic Licensing of Production and Utilization Facilities, Appendix A, General Design Criteria (GDC) for Nuclear Power Plants. Further, these safety analyses must comport with the quality assurance requirements of 10 CFR Part 50, Appendix B, Quality Assurance Criteria for Nuclear Power Plants and Fuel Reprocessing Plants.

The regulation at 10 CFR 52.47(a)(2) states that a final safety analysis report (FSAR) must include a description of the structures, systems, and components (SSCs) of the facility, with emphasis on the bases and technical justifications that have been used to establish the facility's performance requirements and the evaluations required to show that safety functions will be accomplished. The descriptions shall be sufficient to permit understanding of the system designs and their relationship to the safety evaluation. 10 CFR 52.47(a)(19) states that the FSAR must also include a quality assurance program (QAP) applied to the design of the facility's SSCs. An audit is needed to evaluate the stability analysis methodology, which is incorporated by reference into Chapter 15, Transient and Accident Analyses, of the NuScale FSAR and used to develop the safety conclusions documented therein. The NRC staff must have sufficient information to ensure that acceptable risk and adequate assurance of safety can be documented in the NRC staff's safety evaluation reports for TR-0516-49417 and Chapter 15 of the FSAR.

This regulatory audit is based on the following regulatory requirements from 10 CFR Part 50, Appendices A and B.

Appendix A:

GDC 10, Reactor design, requires that specified acceptable fuel design limits (SAFDLs) are not to be exceeded during normal operation, including the effects of anticipated operational occurrences (AOOs).

GDC 12, Suppression of reactor power oscillations, requires that the reactor core and associated coolant, control, and protection systems shall be designed to assure that the power oscillations which can result in conditions exceeding SAFDLs are not possible or can be reliably detected and suppressed.

GDC 13, Instrumentation and control, requires the availability of instrumentation to monitor variables and systems over their anticipated ranges to assure adequate safety, and of appropriate controls to maintain these variables and systems within prescribed operating ranges.

GDC 20, Protection system functions, requires that the protection system automatically initiate appropriate systems to assure that SAFDLs are not exceeded as a result of AOOs, and that it sense accident conditions and initiate the operation of systems and components important to safety.

GDC 25, Protection system requirements for reactivity control malfunctions, requires that the reactor protection system be designed to assure that SAFDLs are not exceeded in the event of a single malfunction of the reactivity control systems.

GDC 26, Reactivity control system redundancy and capability, requires two independent reactivity control systems of different design principles to be provided. One system shall use control rods, while the other system shall be capable of reliably controlling reactivity rate changes to assure acceptable fuel design limits are not exceeded. One of the systems shall be capable of holding the reactor core subcritical under cold conditions.

GDC 29, Protection against anticipated operational occurrences, requires that reactivity control systems be designed to assure an extremely high probability of accomplishing their safety functions in the event of anticipated operational occurrences.

Appendix B:

Section II, Quality Assurance Program, states that a quality assurance program providing control over activities affecting the quality of SSCs important to safety shall be established.

Section III, Design Control, states that design control measures shall be applied to reactor physics, thermal-hydraulic, and accident analyses.

Relevant regulatory guidance includes:

Regulatory Guide 1.203, Transient and Accident Analysis Methods, December 2005.

NuScale Design-Specific Review Standard (DSRS) 15.0, Introduction - Transient and Accident Analyses, Revision 0, June 2016.

NUREG-0800, Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition (SRP) 15.0.2, Review of Transient and Accident Analysis Methods, Revision 0, March 2007.

DSRS 15.9.A, Thermal Hydraulic Stability Review Responsibilities, Revision 0, June 2016.

3.0 AUDIT LOCATION AND DATES

The audit was conducted from NRC headquarters via NuScale's electronic reading room, and at NuScale's Rockville office.

Dates:

Phase 1: June 8, 2017 through September 6, 2017
Phase 2: September 7, 2017 through December 13, 2017
PIM QA Audit (onsite): August 31, 2017

Locations:

NRC Headquarters
Two White Flint North
11545 Rockville Pike
Rockville, MD 20852-2738

NuScale
11333 Woodglen Drive, Suite 205
Rockville, MD 20852

4.0 AUDIT TEAM MEMBERS

The NRC audit team included staff from the Office of New Reactors (NRO) and the Office of Nuclear Regulatory Research (RES).

Ray Skarda (NRO, Audit Lead)

Rebecca Karas (NRO/SRSB Branch Chief)

Jeff Schmidt (NRO)

Carl Thurston (NRO)

Jim Gilmer (NRO)

Tim Drzewiecki (NRO)

Peter Yarsky (RES)

Peter Lien (RES)

Ron Harrington (RES)

Andrew Bielen (RES)

Bruce Bavol (NRO, Project Manager)

5.0 APPLICANT AND INDUSTRY STAFF PARTICIPANTS

NuScale:

Darrell Gardner
Ben Bristol
Bob Houser
Brian Hayden
Carrie Fosaaen
Cristhian Galvez
Eric Young
Kenny Anderson
Meghan McCloskey
Yousef Farawila
Kent Welter

6.0 AUDIT DOCUMENTS

The following NuScale documents were audited by the NRC staff during the conduct of the audit:

QP-0303-10267, Design Control Process, Revision 7, June 30, 2017.

EP-0303-322, Use of Software in Design and Analysis, Revision 5, February 14, 2017.

CP-0303-969, Software Classification, Revision 4, February 10, 2017.

CP-0303-16307, Software Design Verification, Revision 4, April 21, 2017.

SwRS-0304-16933, Software Requirements Specification for Thermal-Hydraulic Stability, Revision 0, March 10, 2017.

NP-EC-0000-2339, Evaluation Methodology for Stability Analysis of the NuScale Power Module, Revision 0, July 22, 2016.

SwTP-0304-16934, PIM v1.1 Software Test Plan, Revision 0, August 17, 2017.

SwDD-0304-16936, PIM Software Design Document, Revision 1, August 17, 2017.

SwTR-0304-16938, PIM v1.1 Software Test Report, Revision 1, August 22, 2017.

SwUM-0304-16937, PIM Software Users Manual, Revision 1, August 17, 2017.

ES-0303-5300, Safety Analysis Documentation Standard, Revision 1, June 1, 2017.

EP-0303-52592, Engineering Change Control, Revision 3, August 28, 2017.

CP-0803-7437, Software Configuration Management, Revision 2, February 2, 2017.

EC-0000-5327, PIM Fuel Conduction Unit Test, Revision 0, August 3, 2017.

EC-0000-5285, PIM Code-to-Code Benchmark to NRELAP5, Revision 0, June 28, 2017.

EC-A010-1507, System Transient Model Input Parameters Calculation, Revision 3, May 30, 2017.

EC-T050-3234, NRELAP5 Model for the SIET Fluid Heated Facility, Revision 0, June 30, 2015.


CP-0303-16897, Software Acceptance Testing, October 3, 2017.

CP-0003-3590, Software Anomaly, Defect and Error Reporting, Revision 4, August 14, 2017.

QP-1603-12896, Corrective Action Program, Revision 4, March 23, 2017.

CR-0317-53568, DCD 15.9 (Stability) Input Based on Preliminary Reactivity Kinetics Assumptions.

CR-0816-50553, Conditions with Stability Topical Report (EC-0000-2339 and TR-0516-49417-P).

LP-1503-9815, 10 CFR Part 21 Reporting, Revision 3, April 12, 2017.

SIET-TF2_base.i.pdf, NRELAP5 SIET TF2 Model, provided to ERR on October 9, 2017.

7.0 DESCRIPTION OF AUDIT ACTIVITIES AND SUMMARY OF OBSERVATIONS

The NRC staff requested information concerning various aspects of the NuScale stability analysis methodology. For example, the NRC staff requested additional details regarding the Società Informazioni Esperienze Termoidrauliche (SIET) tests and their thermal-hydraulic systems models. Additional SIET information was also requested as part of the NRC staff's audits for NuScale's loss-of-coolant-accident (LOCA) analysis methodology TR and non-LOCA methodology TR; therefore, the emphasis of the NRC staff's stability audit was placed on other safety-related issues. This audit summary focuses principally on the PIM QA plan, as discussed below in Section 7.1. A brief description of audit discussions concerning the simplified ambient heat loss model, which was used to develop PIM's ambient heat transfer coefficient curve fit, is provided in Section 7.2.

7.1 PIM Software Quality Assurance Program

The NRC staff conducted an audit of the PIM quality assurance plan to ensure that it meets the minimum requirements of design control, document control, software configuration control and testing, and corrective actions. This review was conducted in accordance with SRP 15.0.2.

The NRC staff audit also considered the independent peer reviews that were performed during the key development steps for PIM, including training records to ensure that the reviewers had the relevant expertise.

Section I of SRP 15.0.2 specifies that the quality assurance plan covers the procedures for design control, document control, software configuration control and testing, error identification, and corrective actions used in the development and maintenance of the evaluation model. The program should also ensure adequate training of personnel involved with code development and maintenance, as well as those who perform the analyses.

The SRP states in Section III that the reviewers should confirm that the evaluation model is maintained under a quality assurance program that meets the requirements of 10 CFR Part 50, Appendix B. As a minimum, the program must address design control, document control, software configuration control and testing, and corrective actions. The reviewers should confirm that the quality assurance program documentation includes procedures that address all of these areas. The reviewers may conduct an audit of the implementation of the code developer's quality assurance program. The reviewers should confirm that independent peer reviews were performed at key steps in the evaluation model development process. The peer review team should include programmers, developers, end users, and independent members with recognized expertise in relevant engineering and science disciplines, code numerics, and computer programming. Expert peer review team members, who were not directly involved in the evaluation model development and assessment, can enhance the robustness of the evaluation models. Further, they can be of value in identifying deficiencies that are common to large system analysis codes.

7.1.1 Design Control

In the NuScale quality assurance program, design control is dictated by the governing procedure QP-0303-10267. QP-0303-10267 is an overarching quality assurance procedure that, as applicable, points to other procedures for specific guidelines depending on the situation.

For engineering analyses relying on software, such as PIM, Sections 5.1.9 and 5.2.4 of QP-0303-10267 direct any preparer and reviewer, respectively, to EP-0303-322.

According to the procedures, the requirements vary depending on how software is used and what type of software is used. Software is classified according to various designations per procedure CP-0303-969. According to the CP-0303-969 designations, PIM is Type 2 software, which means that it is used to perform design, analysis, or calculations for nuclear power plant systems, structures, or components.

In addition to classifying the software by type, CP-0303-969 provides for different classifications of software integrity level (SIL). The SIL for PIM was two until recently, when the SIL classification was changed to three. For the calculations supporting the Stability TR, the version of PIM used was classified as SIL 2 per CP-0303-969 and was, therefore, subject to specific QA requirements per EP-0303-322. SIL 2 codes refer to codes used for safety analyses that are not pre-verified for use and, therefore, require additional verification per Section 5.4.3 of EP-0303-322 to ensure that the code outputs are valid. EP-0303-322 provides for three methods to verify SIL 2 code results: (1) design review, (2) alternate calculation, or (3) qualification testing.

In a general sense, design review refers to an independent review and verification of the non-pre-verified code through a line-by-line review of the code. An alternate calculation method is straightforward in that it refers to comparison of the code results to those generated using a separate method or hand calculation. Qualification testing, generally, refers to comparison of the code results to reference data either from experimental data or analytical solutions.

A SIL 3 code is pre-verified for use and may be used, per EP-0303-322, without independently verifying the code output results through any of the three methods described above for non-pre-verified codes. According to Section 5.2 of EP-0303-322, the pre-verified software must be checked to ensure that the correct version and configuration status are used, that the inputs are correct, and that any limitations are identified per the user's manual or software error reports.

Procedure CP-0303-16307 describes the procedures associated with the verification of software to confirm conformance with design requirements. Per this procedure, a software requirements specification (SwRS) is required. The PIM SwRS is provided in SwRS-0304-16933. Based on this SwRS, a requirements traceability matrix is developed that associates the software design requirements with their specific implementation within the software. This matrix forms the basis for the verification testing. Verification testing is performed using the various techniques of design review, alternate calculation, or qualification testing. In accordance with the procedure, the verification activities must be performed by competent engineers who were not involved in the original design of the software (the subject of independent reviews is discussed in greater detail in Section 7.1.5 of this report). Software used for safety analysis is placed under configuration control (software configuration control is discussed in Section 7.1.3 of this report). The results of the software testing are used to establish any limitations of the software, and these limitations are documented by the independent test engineer in the software user's manual (documentation is discussed in Section 7.1.2 of this report).

With respect to design control, the quality procedures are adequate in that they: (1) provide documentation requirements that spell out the design requirements and formalize these requirements for software in the form of a requirements traceability matrix, (2) require independent verification of the implementation relative to the design requirements, and (3) provide a procedural feedback loop between the outcome of the verification and the allowed uses of the software.

7.1.2 Document Control

Document control requirements for PIM are specified in CP-0303-16307, Section 6.2, which provides that for SIL 2 and SIL 3 software the following documents must be maintained as quality records:

Design verification test plans;
Test reports;
Design checklists;
Requirements traceability matrices;
Software test plan;
Software design documents;
Software test reports; and
Software user's manual.

Because PIM was exercised as a SIL 2 code in the development of the Stability TR, the associated design verification test plans, test reports, and design checklists for the calculations performed as part of the Stability TR submittal are maintained as part of the calculation notebook for these PIM calculations (NP-EC-0000-2339). However, now that PIM has become a SIL 3 code, the other documents are maintained as separate documents and were audited by the NRC staff.

The requirements traceability matrix is mandated by the software procedures and is formalized on form CP-0303-16307-F01. This matrix associates the software requirements with specific verification items. For PIM, the requirements traceability matrix is maintained under Appendix A of the software test report now that PIM has become a SIL 3 code. The requirements traceability matrix includes such requirements as input processing of geometry, writing to the output file, and performing transient simulations, among many others, all of which describe the separate functions that are necessary for PIM to perform as part of the stability analysis methodology.
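The traceability concept described above, in which each software requirement is mapped to the verification items that demonstrate it, can be sketched as a simple data structure. This is an illustrative sketch only; the test-case names are hypothetical and are not drawn from form CP-0303-16307-F01.

```python
# Illustrative sketch (hypothetical structure, not form CP-0303-16307-F01):
# a requirements traceability matrix mapping software requirements to the
# verification items that demonstrate them.

traceability_matrix = {
    "input processing of geometry":  ["TC-01 basic input parsing"],
    "write to output file":          ["TC-02 output file generation"],
    "perform transient simulations": ["TC-03 null transient",
                                      "TC-04 kinetics benchmark"],
}

# A requirement is covered when at least one verification item traces to it.
uncovered = [req for req, items in traceability_matrix.items() if not items]
assert not uncovered, f"untraced requirements: {uncovered}"
```

The matrix then serves as the checklist for verification testing: each listed item must appear in the test report before the requirement is considered met.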

The remaining software documentation is comprised of the software test plan (SwTP), software design document (SwDD), software test report (SwTR), and the software users manual (SwUM). Since being reclassified as a SIL 3 code, these documents are maintained for PIM and were audited by the NRC staff.

SwTP-0304-16934 provides a description of the software test plan for PIM. It includes a description of the calculation cases necessary to test specific functions of the code.

SwDD-0304-16936 is the design document and provides a detailed description of the code structure, subroutines, and design of the input interface and provides a description of the actual source code.

SwTR-0304-16938 is the PIM software test report, which provides a summary of the software test plan results. The report documents that all test cases were run with the results being as expected.

SwUM-0304-16937 is the PIM Users Manual. A similar document was submitted to the NRC as part of the Stability TR submittal. The SwUM submitted to the NRC is largely similar to the internal document and is identical to Appendix A of NP-EC-0000-2339.

However, the internal document audited by the NRC staff includes additional sections, namely a section entitled Concept of Operations, which includes a subsection that describes the assumptions and limitations of PIM. As part of the procedures for engineering analysis, this section of the SwUM must be reviewed to identify whether the code is applicable to a given engineering analysis. Further, as part of the procedures for software verification, this section documents any limitations on the software identified as part of the software testing.

Changes to the software documentation listed above, according to ES-0303-5300, are controlled by the engineering change control process, EP-0303-52592. Any changes to the software or associated documentation would be subject to pre-screening and/or review by a change control board. Some changes may be made without review by the change control board if the recommended change: (1) is a result of (and within the scope of) another previously approved change, (2) is editorial, or (3) has no evident impact outside the change originator's group or area and has no evident impacts to licensing basis documents. Proposed changes that are not pre-screened based on the above criteria are subject to the review of an interdisciplinary group of engineers referred to as the change control board. The change control board is responsible for reaching a consensus disposition to any recommended change, which can involve approving

the change, denying the change, or suggesting further analysis of the change. Once a change is approved, a configuration manager is responsible for ensuring that the impacts of the change on other areas are cataloged and tracked.

Additionally, now that PIM is maintained as a SIL 3 code, it is subject to the procedural requirements of CP-0803-7437. According to CP-0803-7437, software change requests may arise from a variety of sources: some reflect reported software errors that must be corrected (corrective actions and error reporting are discussed in Section 7.1.4 of this report), some come from end-user requests for enhancements, and some arise from changes in the design specifications. Per CP-0803-7437, any changes must be evaluated by a configuration control board (CCB). If the CCB approves the changes and the proposed implementation plan, the responsible engineer implements the changes. The new software baseline is then subject to software verification per CP-0303-16307. This verification involves acceptance testing (acceptance testing is controlled by procedure CP-0303-16897 and is discussed in Section 7.1.3.2 of this report).

During the on-site meeting, the NRC staff asked NuScale representatives about the implications of the change process as it relates to the requirements of 10 CFR 50.59, Changes, tests and experiments. The NuScale personnel responded that 10 CFR 50.59 evaluations are not required because there is no current licensee operating a NuScale reactor. The NRC staff notes, however, that any licensee or COL applicant referencing the PIM methodology would be required to perform 10 CFR 50.59 evaluations of any changes.

The requirements of document control are met by ensuring that the software documentation is maintained as quality records. Furthermore, changes to these documents are controlled by a procedure requiring independent review of proposed changes and tracking of change impacts on other areas, including those areas important to nuclear safety. However, at the current time, such changes are not evaluated against the regulatory criteria of 10 CFR 50.59. Therefore, it is not clear to the NRC staff how the requirements of 10 CFR 50.59 can be met by any licensee or COL applicant referencing the Stability TR, since the NuScale quality procedures that govern changes to PIM do not require that records describing the code change request, implementation, testing, and acceptance be provided to the licensee or COL applicant.

7.1.3 Software Configuration Control and Testing

When the Stability TR was submitted, PIM was classified as a SIL 2 code; however, during the progress of the NRC staff's technical review, PIM was reclassified as a SIL 3 code. Therefore, the NRC staff audit covered aspects of software configuration control and testing of PIM during the historical phase when PIM was a SIL 2 code, which is described in Section 7.1.3.1 of this report, as well as the modern maintenance of PIM as a SIL 3 code, which is described in Section 7.1.3.2 of this report.

7.1.3.1 Historical Software Configuration Control and Testing as a SIL 2 Code

As a SIL 2 code, which was the status of PIM during the development of the Stability TR, the software validation requirements of CP-0303-16307 were followed and documented within the NuScale stability analysis calculation notebook (NP-EC-0000-2339).

Section 4.2.1.4 of Appendix C of NP-EC-0000-2339 provides the verification and validation software requirements for PIM. Specifically, this document provides the verification and validation requirements in SwRS 20 (verification) and SwRS 21 (validation). Table 6-1 summarizes the requirements traceability matrix, which documents how each of the specified software requirements is met. Table 6-1 describes how SwRS 20 and 21 are met per the functional testing and validation cases described in Section 6.2 of NP-EC-0000-2339. The NRC staff audited the four areas covered by Section 6.2 of NP-EC-0000-2339:

Section 6.2.1 covers the pressure drop models. Two-phase pressure drop models are not tested. The single-phase pressure losses are computed using a simplified core model over a wide range of Reynolds numbers and compared to reference calculations. Results are provided in Figures 6-1 and 6-2 of NP-EC-0000-2339, but no explicit acceptance criteria were provided in the document with respect to these comparisons. The results appear to be in good agreement based on visual inspection.

Section 6.2.2 covers critical heat flux. The NRC staff did not review this portion of the documentation because calculation of the critical heat flux is outside of the scope of approval sought for PIM.

Section 6.2.3 covers fluid properties. Certain fluid properties were determined to be critical with respect to the PIM calculations and these properties are: (1) saturation temperature, (2) liquid density, (3) vapor density, (4) liquid enthalpy, (5) heat of vaporization, (6) liquid enthalpy from liquid temperature, and (7) liquid density from pressure and enthalpy. The PIM-predicted values for these seven critical properties were compared to reference properties and the acceptance criterion is that the properties agree within two percent. The NRC staff reviewed the results of the comparison between PIM and the results from a reference steam table over a pressure range from 1 to 150 bars. The results show that the seven critical properties all agree within the acceptance criterion of two percent.
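The property comparison described above amounts to a relative-tolerance check of code-predicted values against reference steam-table values. A minimal sketch of such a check follows; it is illustrative only, and the sample values are hypothetical, not taken from the audited calculation notebook.

```python
# Illustrative sketch (not NuScale code): checking code-predicted fluid
# properties against reference steam-table values within a 2 percent
# relative tolerance, as described for the seven critical properties.

def within_tolerance(predicted, reference, tol=0.02):
    """Return True if each predicted value is within tol (relative) of its reference."""
    return all(
        abs(p - r) <= tol * abs(r)
        for p, r in zip(predicted, reference)
    )

# Hypothetical saturation-temperature samples (K) over part of the 1-150 bar range
reference = [372.76, 453.03, 485.53, 584.15]   # reference steam table
predicted = [372.80, 453.10, 485.40, 584.00]   # code-predicted values

assert within_tolerance(predicted, reference)
```

A check of this shape would be repeated for each of the seven critical properties over the full pressure range.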

Section 6.2.4 covers neutron kinetics. The neutron kinetics model was verified by comparison of PIM calculation results to analytical results. However, no explicit criteria are provided in the document with respect to the comparison. The results of the comparison are provided in Figures 6-12 and 6-13 of NP-EC-0000-2339. The NRC staff agrees with the determination documented in the same calculation notebook that the results are essentially identical based on visual inspection.

Other software requirements are verified through functional testing as described by Section 6.1 of NP-EC-0000-2339. The NRC staff reviewed a subset of the functions:

Section 6.1.1.4 covers mass conservation. This test is conducted by running PIM at constant mass and mass flow conditions and verifying there is no change in the mass flow rate (<0.01 percent) after a null transient lasting several hundred seconds.
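A null-transient conservation check of this kind can be sketched as follows; the sketch is illustrative only, and the flow values are hypothetical, not PIM output.

```python
# Illustrative sketch (hypothetical, not the PIM test harness): a null-transient
# conservation check that verifies the mass flow rate drifts by less than
# 0.01 percent from its initial value over the simulated interval.

def max_relative_drift(history):
    """Maximum relative deviation of a time history from its initial value."""
    initial = history[0]
    return max(abs(x - initial) / abs(initial) for x in history)

# Hypothetical mass flow trace (kg/s) from a several-hundred-second null transient
flow_history = [587.000, 587.002, 586.998, 587.001, 587.000]

assert max_relative_drift(flow_history) < 1.0e-4  # < 0.01 percent
```

The energy-conservation test described next applies the same pattern to the liquid temperature trace.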

Section 6.1.1.5 covers energy conservation. The test is conducted by running PIM through a null transient and confirming that the liquid temperature remains close to the initial, specified temperature within a tolerance of 0.01 percent. [


]

Section 6.1.1.6 covers the steam generator. Because there are multiple implementations of the steam generator heat transfer models in PIM, the verification plan addresses each available steam generator model. There are five total models, each with a different isgtyp index ranging from 0 to 4. Index isgtyp=0 refers to explicit modeling of the steam generator with detailed heat transfer. The other options refer to different simplifications of the model, with isgtyp=4 referring to the user option to input the steam generator heat transfer profile. No acceptance criteria for the tests were specified. The verification proceeded by taking the results of the calculations and confirming that the predicted heat transfer profile listed in the heat output column is consistent with expectations. Further, a null transient calculation was run to ensure that the associated distributions hold. During this testing, an anomaly was identified with respect to isgtyp=1 whereby it was possible for the calculation to proceed when the total heat removal from the steam generator was less than the reactor power in steady state, which eventually results in the code crashing due to an enthalpy error. The NRC staff notes that it is not necessary to review all steam generator implementations, depending on which implementations are used for licensing analyses per the Stability TR. To this end, the NRC staff has already issued RAI 8846, Question 01-21, with respect to the steam generator modeling. It is expected that this RAI response will address which steam generator implementation options are allowed in the scope of the overall stability analysis methodology. The Stability TR appears to describe the isgtyp=0 and isgtyp=4 options, but it is not clear which one is the basis for the licensing calculations; therefore, it may be the case that the isgtyp=1 results are not germane to the NRC's review.

As a result of these verification test results, the anomalies identified with respect to energy conservation and the isgtyp=1 enthalpy error have been listed in Section 6.4 of NP-EC-0000-2339, which lists the limits of usage for PIM. However, a review of the PIM software user's manual (SwUM-0304-16937) makes no mention of these limitations. It was expected that these limitations would be listed to be consistent with CP-0303-16307 (which directs test engineers to ensure that the user's manual reflects the limitations on code use as established by software testing) and EP-0303-322 (which describes how limitations on use from the user's manual are considered in safety analysis). The NRC staff has issued RAI 9333, which addresses inconsistencies between the software verification testing performed for PIM during its use as SIL 2 software and the user's manual (both SwUM-0304-16937 as well as the user's manual submitted to the NRC staff, titled Appendix A: PIM Input and Execution Instructions). The NRC staff notes that the limitations specified in the submitted user's manual (which refers to Section 5.2 of the Stability TR) match the description in SwUM-0304-16937.

7.1.3.2 Modern Software Configuration Control and Testing as a SIL 3 Code

As a SIL 3 code, PIM is pre-verified for use. This pre-verified version is PIM version 1.1 and has been uploaded to the calculation server for codes that are pre-verified for use. PIM is now maintained and configuration-controlled per CP-0803-7437. Under this procedure, changes made to PIM must be evaluated by a configuration control board and, if approved, implemented changes must be tested according to the SwTP acceptance testing requirements. Section 5.10 of the PIM SwTP states that the acceptance testing is comprised of the integration test cases as well as any additional tests that may be specified as part of any new testing requirements.

The PIM SwTP specifies the required testing, which is entirely covered by the integration testing suite. This integration testing suite covers all functional requirements and represents an incremental increase over the scope of testing performed as part of the code verification activities for PIM use as a SIL 2 code. The test suite is specified in Table 5-2 of the SwTP and is divided into ten areas, which are described in Section 5.2 of the test plan.

The first area includes: basic functioning of input, output, and operation; input error detection; and the steam generator. This area is based on Appendix C of the PIM calculation notebook NP-EC-0000-2339 and includes 15 specific cases. The cases are run and confirmed to agree within numerical tolerances as part of the test plan.

The second area relates to the pressure drop models and is categorized under two subsets, 02a and 02b. The test suite is based on NP-EC-0000-2339 Appendix C Section 6.2.1, but the scope of calculation is reduced. In the original work, 70 individual cases were considered, with variation in Reynolds number to cover the entire application range of PIM. In the SwTP, the scope of testing is reduced from 70 individual cases to 5 individual cases. The results of the code output are compared to alternate calculation results to ensure consistency, as was done in NP-EC-0000-2339.

The third area relates to the critical heat flux. The NRC staff did not review this portion of the documentation because critical heat flux predictions are not within the scope of the review.

The fourth area relates to fluid properties. The same seven critical properties are identified in the SwTP as were described in Section 6.2.3 of NP-EC-0000-2339 and the test procedure and criteria are the same.

The fifth area relates to neutron kinetics. The PIM calculation is compared to a reference analytical solution consistent with Section 6.2.4 of NP-EC-0000-2339. Unlike the original calculation note, however, the SwTP provides that the comparison must indicate agreement within the numerical tolerance as an acceptance criterion.
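The character of such a verification test can be illustrated with the simplest point-kinetics case that admits a closed-form answer: with delayed neutrons neglected, dn/dt = (rho/Lambda)*n has the analytical solution n(t) = n0*exp(rho*t/Lambda). The sketch below, with arbitrary illustrative parameters (none from PIM or NP-EC-0000-2339), integrates the equation numerically and applies a tolerance-based acceptance criterion in the spirit of the SwTP.

```python
import math

# Illustrative kinetics verification: integrate dn/dt = (rho/Lambda)*n with
# a simple explicit Euler scheme and compare to the analytical solution
# n(t) = n0*exp(rho*t/Lambda). A production code would use a stiff solver
# and include delayed neutron groups; parameter values here are arbitrary.

def integrate_point_kinetics(n0, rho, lam, t_end, dt):
    n, t = n0, 0.0
    while t < t_end - 1e-12:
        n += dt * (rho / lam) * n   # explicit Euler step
        t += dt
    return n

n0, rho, lam = 1.0, 1e-4, 1e-4      # illustrative values only
numeric = integrate_point_kinetics(n0, rho, lam, t_end=1.0, dt=1e-4)
analytic = n0 * math.exp(rho * 1.0 / lam)

# tolerance-based acceptance criterion, in the spirit of the SwTP
assert abs(numeric - analytic) / analytic < 1e-3
```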

The sixth area relates to transient cases. Appendices E and F of NP-EC-0000-2339 form the basis for the acceptance tests. Four cases specified in Table 5-2 of the SwTP were selected and the tests must confirm agreement within numerical tolerances.

The seventh area relates to fuel rod heat transfer. The SwTP refers to a unit test of the fuel rod heat transfer calculation in EC-0000-5327. Per the SwTP, comparison to the reference case must agree within numerical tolerance. The NRC staff reviewed EC-0000-5327 which provides a comparison of PIM transient heat flux results for a given set of fuel characteristics and specified driver oscillation to heat flux calculated according to a finite difference method based on the analytical heat conduction equations. The results confirm excellent agreement between PIM and the alternative calculation.
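A comparison of this kind can be illustrated with a textbook case: an explicit finite-difference solution of the 1-D heat equation checked against the analytical decay of a sinusoidal temperature profile, T(x,t) = sin(pi*x/L)*exp(-alpha*pi^2*t/L^2). The geometry and properties below are illustrative only and are not the PIM fuel rod model or the EC-0000-5327 case.

```python
import math

# Explicit finite-difference solution of dT/dt = alpha * d2T/dx2 on a slab
# with zero-temperature (Dirichlet) boundaries, verified against the exact
# decay of the fundamental sine mode. All parameters are illustrative.

def fd_heat_1d(alpha, L, nx, dt, t_end):
    dx = L / (nx - 1)
    T = [math.sin(math.pi * i * dx / L) for i in range(nx)]
    t = 0.0
    while t < t_end - 1e-12:
        Tn = T[:]
        for i in range(1, nx - 1):
            Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2*T[i] + T[i-1])
        T = Tn                      # endpoints stay at 0 (Dirichlet)
        t += dt
    return T

alpha, L, nx = 1.0e-5, 0.01, 21
dt = 0.001                          # satisfies dt <= dx^2 / (2*alpha)
T = fd_heat_1d(alpha, L, nx, dt, t_end=1.0)
mid_analytic = math.exp(-alpha * math.pi**2 * 1.0 / L**2)  # at x = L/2
assert abs(T[nx // 2] - mid_analytic) < 0.01
```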

The eighth area relates to benchmarking against NRELAP5. The primary purpose of this test is to verify the steam generator model through comparison to an alternate calculation using a higher fidelity thermal-hydraulic model. The benchmark, however, also compares pressure drops, fluid properties, and coolant densities in the core, lower riser, upper riser, and downcomer. The SwTP provides that the agreement must be within numerical tolerances relative to the initial benchmark unit test documented in EC-0000-5285. The NRC staff reviewed EC-0000-5285 and found some differences between PIM and NRELAP5, the most notable of which are the differences in the secondary side temperature profile and axial power profile. The major trends are well reflected, however, and the results are in reasonable agreement.

The ninth area relates to NIST-1 validation. Five cases were identified from stability testing at NIST-1. These cases were analyzed using PIM to determine the decay ratio and oscillation period at various power levels. The initial validation demonstrated reasonable prediction of oscillation period with a conservative bias in the decay ratio prediction relative to the data. For the purpose of the SwTP, the reference calculations are compared to PIM results to determine that the results agree within numerical tolerances.
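For context, the decay ratio and oscillation period are typically extracted from a computed or measured oscillation by locating successive peaks: the decay ratio is the amplitude ratio of consecutive peaks, and the period is their time separation. The following is a minimal sketch on a synthetic damped signal; the signal and all values are invented and do not represent NIST-1 data or PIM output.

```python
import math

# Extract the decay ratio and oscillation period from a damped oscillation
# by locating successive local maxima. The signal is synthetic: a cosine
# with a known decay ratio of 0.6 per period and a 2.0 s period.

dt, period, dr_true = 0.01, 2.0, 0.6
omega = 2.0 * math.pi / period
sigma = math.log(dr_true) / period       # exp(sigma*period) == dr_true
signal = [math.exp(sigma * i * dt) * math.cos(omega * i * dt)
          for i in range(2000)]

# locate local maxima
peaks = [i for i in range(1, len(signal) - 1)
         if signal[i - 1] < signal[i] > signal[i + 1]]

decay_ratio = signal[peaks[1]] / signal[peaks[0]]
osc_period = (peaks[1] - peaks[0]) * dt
```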

The tenth area relates to decay ratio. In particular, this area includes four calculations that utilize a [ ] feature in PIM. In these calculations, the [ ] is used and the results are compared to an analytical solution. Table 5-3 in the SwTP provides decay ratios for each case, which also serve as the acceptance criteria for these test cases.

The scope of the acceptance testing includes a small increase relative to the testing performed when PIM was a SIL 2 code. In particular, the scope increased to include the fuel rod conduction alternate calculation, the thermal-hydraulic NRELAP5 alternate calculation, the NIST-1 validation calculations, and an area related to decay ratio predictions with an [ ]. The NRC staff notes that the nature of the testing is to compare PIM results between versions of that code, using the PIM models referenced in the above calculations.

The testing for NIST-1 validation (the ninth area), as an example, does not test PIM by comparing the latest version to the data collected from NIST-1. The NRC staff confirmed that the testing scope covers all aspects identified in the traceability matrix.

The NRC staff reviewed SwTR-0304-16938. The SwTR concludes that all of the tests met the acceptance criteria and provided positive results.

Section 5.9 of the PIM SwTP provides the requirement for regression testing, which must be conducted to ensure that code changes do not introduce unexpected changes to the code outputs. According to CP-0303-16897, regression testing is the selective retesting performed to detect errors introduced during modification of the computer program, or to verify that the modified computer program still meets its specified requirements. Subsequent to the NRC staff's on-site audit, CP-0303-16897 was revised, and the revision updates the software test plan template (from CP-0303-16897-F03-R0 to CP-0303-16897-F03-R1). Previously, Section 5.9 of CP-0303-16897-F03-R0 described the regression testing but referred only to qualification against the previous code version. Section 5.9 of CP-0303-16897-F03-R1 now states that regression testing, as specified in the SwTP, refers to the testing done to ensure that applied changes to the application have not adversely affected previously tested functionality. Section 5.9 of SwTP-0304-16934 specifies that regression testing will be based on the initial qualification testing and that, as each new PIM release is created, a regression test should be stored for that version.

PIM regression testing would therefore occur by comparing results from a new version of the code for the regression test suite against the results stored in the regression test for the previous code version. This approach to regression testing would make PIM susceptible to code drift.

Code drift refers to a process whereby multiple, successive changes to an evaluation model or code result in a significant change in the results. Code drift can occur when each change produces only a small version-to-version difference in the results, but the differences continue to accumulate in a consistent direction. 10 CFR 50.46(a)(3)(i) codifies this concept for evaluation models (EMs) used for loss-of-coolant-accident (LOCA) analyses. For LOCA EMs, the impact on the figure of merit (peak cladding temperature in this case) must be integrated over each change (including error corrections) made to the EM. In this sense, because the integrated effect of the changes is considered, the reporting requirements are triggered when the accumulated effect becomes significant. Without tracking the integrated effect of changes, it would be possible for changes to accumulate such that the cumulative difference from an approved version becomes significant yet goes undetected.

The current regression testing specified under the PIM SwTP is susceptible to allowing code drift for two reasons: (1) the SwTP specifies that the code results are compared to the results produced with the most recent approved version, as opposed to a reference version or reference results, and (2) the SwTP regression test suite does not necessarily include validation data, which would not be subject to change between code versions. Vendors typically address code drift by providing a static benchmark in the regression test suite, such as a comparison of code output against experimental data, against reference results provided in the licensing documentation, or against results from a static, frozen code version. Since such an element is missing from the SwTP, and likewise from the governing procedure, CP-0303-16897, the NRC staff cannot conclude that the QA procedures are adequate from the standpoint of software testing. Therefore, the NRC staff issued RAI 9333, which addresses its regression testing and code drift concerns.
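A static benchmark of the kind described above can be sketched as follows: the suite compares current results against a frozen reference (here a dictionary; in practice a stored file or validation dataset) created once from an approved version, so accumulated drift is measured against a fixed anchor rather than against whichever release happened to come last. All names, values, and tolerances are illustrative.

```python
# Hypothetical regression check against a frozen reference. The reference
# values are created once (e.g., from an approved code version or from
# validation data) and never regenerated from the latest code version,
# which is what prevents small changes from silently accumulating.

REFERENCE = {"case_ne_01": 0.42, "case_ne_02": 0.55}   # frozen decay ratios

def run_case(name: str) -> float:
    """Stand-in for executing the analysis code on one test case."""
    return {"case_ne_01": 0.4205, "case_ne_02": 0.5498}[name]

def regression_check(reference: dict, rel_tol: float = 0.01) -> list:
    """Return the cases whose drift from the frozen reference exceeds
    rel_tol; an empty list means the suite passes."""
    failures = []
    for case, ref in reference.items():
        if abs(run_case(case) - ref) > rel_tol * abs(ref):
            failures.append(case)
    return failures

assert regression_check(REFERENCE) == []
```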

Error Reporting and Corrective Actions

CP-0003-3590 provides the procedures for reporting software anomalies, defects, and errors. Per this procedure, errors in software are tracked through the NuScale Error Tracking System (NETS). The procedure provides specific guidelines for determining the severity level associated with different types of software errors. Depending on the severity level of the error, a condition report (CR) may be generated within the NuScale corrective action program (CAP). The CAP is described by procedure QP-1603-12896.

The CAP was generically reviewed by the NRC staff as part of the NRC staff review of Topical Report NP-TR-1010-859-NP, Revision 0, Quality Assurance Program Description for Design Certification of the NuScale Power Reactor, dated October 29, 2010, and its subsequent revisions.

The findings with respect to the CAP in the NRC staff's safety evaluation for the initial submittal remain applicable to the most recently approved version (Revision 3). The NRC staff found that the NuScale quality assurance procedures follow the guidance of SRP Section 17.5, Paragraph II.P, for establishing the necessary measures and governing procedures to promptly identify, control, document, classify, and correct conditions adverse to quality. Based on these previous review findings, the NRC staff audit scope did not include a review of the CAP; rather, in accordance with SRP 15.0.2, the NRC staff audited specific examples of reports in NETS and the CAP to better understand the processes associated with error identification and corrective actions and to ensure that the procedures were being followed in the case of PIM. These specific examples are related to PIM and the stability calculations documented in the Stability TR.

The first example is Condition Report (CR) 0317-53568, DCD 15.9 (Stability) Input Based on Preliminary Reactivity Kinetics Assumptions, which documents an issue whereby the stability analysis was not in procedural compliance; the issue was identified as a configuration control issue. At issue was that the PIM calculations supporting the DCD submittal were performed using reactivity kinetics data from a preliminary calculation note, as opposed to a final calculation note. A review of the calculations demonstrated that the numerical values did not change between the preliminary and final notes and that, therefore, the calculation results were not impacted.

The second example is CR-0816-50553, which documents typographical errors in the calculation note and the Stability TR whereby analysis results for a case at 40 megawatts were incorrectly referred to as analysis results at rated conditions. The CR was closed based on correction of the errors in the documentation, and the CR indicates that evaluations under 10 CFR Part 21, Reporting of Defects and Noncompliance, were completed.

QP-1603-12896 provides for screening errors to determine whether an error is reportable under 10 CFR Part 21; in such cases, QP-1603-12896 points to LP-1503-9815, 10 CFR Part 21 Reporting, Revision 3, April 12, 2017. LP-1503-9815 provides for a condition report screening committee (CRSC). The CRSC is responsible for reviewing deficiencies identified in the CAP to determine if the deficiencies are reportable to the NRC per the requirements of 10 CFR Part 21.

If the CRSC determines that a CAP item is potentially reportable, it marks the item accordingly, and the evaluation of the item transfers from the CRSC to the responsible manager, who assigns an evaluator who is a subject matter expert. The evaluator follows a screening process, included in Appendix A of the subject procedure, that aligns with the requirements and definitions in 10 CFR Part 21. The procedure further provides for timelines and interim reporting requirements that are consistent with the provisions of 10 CFR Part 21.

The NRC staff reviewed CR-0816-50553, which demonstrates how the outcome of the CRSC review and condition reporting determination per LP-1503-9815 are documented in CRs. The NRC staff found that the example was illustrative of the process and clearly indicated the interfaces between the associated procedures.

The third example, NETS entry 1141, was reviewed by the NRC staff during the on-site visit. NETS entry 1141 requests that a feature be added to PIM to perform a loop closure check. A similar type of check is common in systems analysis tools; in fact, a similar feature is included in TRACE. Implementation of this type of input checking should not impact the PIM stability methodology.

The fourth example is NETS entry 1145, which documents an error in the PIM calculation of density head due to incorrect interpretation of the [ ] variable. The issue was determined to be numerically negligible, but one requiring a modification for self-consistency. The NRC staff reviewed this NETS entry to better understand the process for elevating issues to the CAP. In this case, the diagnosed severity level was sufficiently low that the issue was not elevated to the CAP.

The procedures meet the requirements for corrective actions by providing a process for determining if error reports under the NETS are, in fact, items appropriate for inclusion as CRs in the CAP. In addition, CAP CRs are screened for potential Part 21 reporting according to the appropriate criteria and reporting timelines. A review of PIM-related specific examples has demonstrated that NuScale faithfully adheres to the CAP-related quality procedures previously approved by the NRC staff.

Independent Peer Reviews and Training

The NRC staff conducted an audit of the peer reviews and the training records of the peer reviewers to confirm that independent peer reviews were performed at key steps in the evaluation model development process per the guidance in SRP 15.0.2. The NRC staff confirmed that the peer review team included programmers, developers, end users, and independent members with recognized expertise in relevant engineering and science disciplines, code numerics, and computer programming.

PIM was classified as a SIL 2 code, which required an independent peer review per QP-0303-10627 and EP-0303-322. For the stability calculations provided to the NRC in the licensing topical report, the governing engineering calculation notebook (NP-EC-0000-2339) provides the documentation of the code development, analyses, and review.

Per the stability analysis calculation notebook, NP-EC-0000-2339, the independent peer review was conducted by a contractor. By contracting the peer review effort to an external entity, NuScale ensured the independence of the peer review. The NRC staff audited the statement of work for the contract, which specified that the peer reviewer shall have deep knowledge of thermal-hydraulic stability phenomena, theory, and analytical modeling. The NRC staff agrees that these competencies are appropriate for the conduct of such a review. The NRC staff reviewed the peer review documentation, which provides the identity of the peer reviewer.

The peer reviewer has recognized expertise in the relevant technical disciplines and is acceptably qualified to perform the task. In addition, the NRC staff confirmed through training records and the contract statement of work, that the contractor was trained in the relevant NuScale quality assurance procedures governing the processes and requirements for independent peer review.

Moving forward, PIM has been reclassified as a SIL 3 code per CP-0003-969 and will be maintained per CP-0303-16307. By auditing the training requirements and training records of the NuScale staff, the NRC staff confirmed that the quality assurance program ensures adequate training of personnel involved with code development and maintenance, as well as of those who perform the analyses.

The NuScale staff were trained in the relevant QA procedures:

QP-0303-10267, Design Control Process,
CP-0303-16307, Software Design Verification,
QP-1603-12896, Corrective Action Program,
CP-0003-969, Software Classification,
EP-0303-303, Preparation and Approval of Engineering Calculations,
EP-0303-322, Use of Software in Design and Analysis, and
LP-1503-9815, 10 CFR Part 21 Reporting.

In addition, analysts were trained in NuScale power module stability analysis and the PIM code in March 2017 according to the records audited by the NRC staff.

On the basis of the training records, the NRC staff concludes that the independent peer reviews required by the procedures moving forward will be conducted by NuScale staff qualified in the quality assurance procedures, stability phenomenology, and the PIM code.

PIM QA Conclusions

The NRC staff audited the software quality assurance procedures for the PIM stability analysis code to ensure compliance with the minimum requirements specified in SRP 15.0.2. In the conduct of this audit, the NRC staff reviewed many procedures as well as specific examples of quality documents related to PIM analyses and the PIM code. The NRC staff's review covered five major areas consistent with SRP 15.0.2: design control, document control, software configuration control and testing, error reporting and corrective action, and independent peer reviews and training.

Design Control

In terms of design control, the NRC staff found that the governing procedures are acceptable insofar as they: (1) provide for documentation requirements that spell out the design requirements and formalize these requirements for software in the form of a requirements traceability matrix, (2) require independent verification of the implementation relative to the design requirements, and (3) provide for a procedural feedback loop between the outcome of the verification and the allowed uses of the software.

Document Control

The requirements of document control are met by ensuring that the software documentation is maintained as quality records. Furthermore, changes to these documents are controlled by a procedure requiring independent review of proposed changes and tracking of change impacts on other areas, including those areas important to nuclear safety. However, at the current time, the NuScale procedures do not address the requirements of 10 CFR 50.59 in terms of evaluating changes. Therefore, it is not clear to the NRC staff how the requirements of 10 CFR 50.59 can be met by any licensee or COL applicant referencing the Stability TR, since the NuScale quality procedures that govern changes to PIM do not require that records describing the code change request, implementation, testing, and acceptance be provided to the licensee or COL applicant.

Software Configuration Control and Testing

The NRC staff reviewed procedures associated with software testing and identified two issues needing additional clarification. The first issue relates to the documentation of code limitations that are identified as part of software testing. These limitations, according to NuScale procedures, should be translated to the code user's manual to ensure that the codes are executed in an acceptable manner. The NRC staff identified two instances where code limitations appear to have been identified through software testing, but the associated limitations are not captured in the user's manual. The second issue relates to regression testing and code drift. The NRC staff issued RAI 9333, which requests corrections to the former issue and clarification on the latter issue.

Error Reporting and Corrective Action

The procedures meet the requirements for corrective actions by providing a process for determining if error reports under the NETS are, in fact, items appropriate for inclusion as CRs in the CAP. In addition, CAP CRs are screened for potential 10 CFR Part 21 reporting according to the appropriate criteria and reporting timelines. A review of PIM-related specific examples has demonstrated that NuScale faithfully adheres to the CAP-related quality procedures previously approved by the NRC staff.

Independent Peer Reviews and Training

The initial peer review of PIM was performed by an external contractor. The NRC staff reviewed the details and confirmed that the external contractor was, indeed, independent, possessed the requisite technical competency to perform the review, and was adequately trained in the applicable quality assurance procedures at NuScale. Furthermore, on the basis of the training records audited by the NRC staff, the NRC staff concludes that the independent peer reviews required by the procedures moving forward will be conducted by NuScale staff qualified in quality assurance procedures, stability phenomenology, and the PIM code.

7.2 Ambient Heat Loss Model

Section 5.5.4, Ambient Heat Loss Model, of TR-0516-49417-P provides a brief summary of PIM's ambient heat loss model. Equation 5-48 of the TR gives a cubic curve fit in which an ambient heat transfer coefficient, hamb, is computed as a function of coolant temperature. The TR does not indicate how the data (hamb, Tcoolant) used to obtain the cubic fit were originally computed. The TR does note that the cubic expression includes the effects of conduction through the RPV and CNV, thermal radiation between the outer RPV and inner CNV surfaces, and convection at the inner RPV and outer CNV surfaces.
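For illustration only, a cubic fit of the form suggested by Eq. 5-48 can be evaluated as a polynomial in coolant temperature. The coefficients and temperatures below are invented, since the TR does not provide the explicit expression or the underlying (hamb, Tcoolant) data; that gap is part of what the staff's follow-up questions address.

```python
# Hypothetical evaluation of a cubic ambient heat transfer coefficient,
# h_amb(T) = c0 + c1*T + c2*T^2 + c3*T^3, evaluated in Horner form.
# Coefficients, temperatures, and units are illustrative, not Eq. 5-48.

def h_amb(t_coolant, coeffs):
    """Evaluate the cubic fit h_amb = c0 + c1*T + c2*T^2 + c3*T^3."""
    c0, c1, c2, c3 = coeffs
    return c0 + t_coolant * (c1 + t_coolant * (c2 + t_coolant * c3))

coeffs = (5.0, 1.0e-2, -2.0e-5, 4.0e-8)   # invented coefficients

# ambient heat loss per unit area for an illustrative 250 C coolant
# temperature and 30 C ambient temperature
q_amb_per_area = h_amb(250.0, coeffs) * (250.0 - 30.0)
```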

On December 13, 2017, NuScale and the NRC staff participated in an audit call, and NuScale provided clarification on several questions concerning the ambient heat loss model. The remaining issues needing clarification are described herein. During the audit call, NuScale provided a qualitative description of the simplified heat loss model, indicating that it was a steady-state, one-dimensional, multi-layer conduction approximation to the energy conservation equation. Also during the audit call, NuScale could not recall the geometry (cylindrical, slab, or other) of the simplified ambient heat loss model, but indicated that additional details of the model could be found in two reports, EC-0000-2339 and EC-A010-00001507_03, which were added to the NuScale ERR. These references provided parameter values, such as heat transfer coefficients and emissivities on the RPV and containment wall surfaces. However, the NRC staff still needs clarification of the simplified ambient heat loss geometry, the explicit mathematical expression for hamb developed for the ambient heat loss model, and the radius and length values used for the RPV and containment surfaces. The estimated ambient heat loss affects the primary side energy balance through the heat addition term described in Section 5.5.1.2, Energy Balance, of the TR. Since PIM is not currently an NRC-approved code, and NuScale is seeking approval to use PIM to perform safety analyses as part of its stability methodology, further information regarding its ambient heat loss model is necessary.

As a follow-up to the audit call, the NRC staff issued RAI 9388 requesting additional information concerning NuScale's simplified ambient heat loss model, including: the model geometry (slab, cylindrical, or other), the explicit mathematical expression of the simplified ambient heat loss model, and the parameter values used in the simplified model to obtain the nominal ambient heat loss at full power.

8.0 EXIT BRIEFING

The NRC staff conducted an audit closing meeting via teleconference on December 13, 2017.

During the meeting NRC staff reiterated the purpose of the audit and discussed their activities.

The NRC staff also indicated to the applicant that the open items identified during the conduct of the audit would be transmitted as questions in requests for additional information. References to the detailed questions are provided in Section 9 of this audit summary.

9.0 REQUESTS FOR ADDITIONAL INFORMATION RESULTING FROM AUDIT

The NRC staff issued two RAIs based on observations made during the audit. These RAIs are available in the Agencywide Documents Access and Management System (ADAMS). ADAMS Accession Nos. are provided in Table 1. Other RAIs have also been issued independent of the audit.

Table 1. RAIs Resulting from Audit

RAI Number   Reference
9333         ML18033B743
9388         ML18054A280

10.0 OPEN ITEMS AND PROPOSED CLOSURE PATHS

Not applicable.

11.0 DEVIATIONS FROM THE AUDIT PLAN

The audit was originally scheduled to exit on December 6, 2017, but was extended to December 13, 2017, to accommodate the evaluation and discussion of additional documentation requested by the NRC staff.

12.0 REFERENCES

1. Letter from NuScale Power, LLC, NuScale Power, LLC Submittal of Evaluation Methodology for Stability Analysis of the NuScale Power Module, TR-0516-49417-P (NRC Project No. 0769), July 31, 2016, ADAMS Accession No. ML16250A851.
2. Letter from NuScale Power, LLC, NuScale Power, LLC Submittal of NuScale Response to NRC Request for Supplemental Information to TR-0516-49417-P (NRC Project No. 0769), December 3, 2016, ADAMS Accession No. ML16340A756.

3. Letter to NuScale Power, LLC, Acceptance Letter for the Review of NuScale Power, LLC Topical Reports TR-0516-49417-P, Evaluation Methodology for Stability Analysis of the NuScale Power Module, Revision 0, and Response for Supplemental Information to the TR, Revision 0, March 7, 2017, ADAMS Accession No. ML17116A06.
4. Letter to NuScale Power, LLC, Audit Plan for the Regulatory Audit of NuScale Topical Report TR-0516-49417-P, Evaluation Methodology for Stability Analysis of the NuScale Power Module, June 7, 2018, ADAMS Accession No. ML1715B024.


5. NRO-REG-108, Regulatory Audits, April 2, 2009, ADAMS Accession No. ML081910260.