ML21334A158
=Text=
{{#Wiki_filter:U.S. NUCLEAR REGULATORY COMMISSION
REGULATORY GUIDE 1.245, REVISION 0
Issue Date: January 2022
Technical Lead: Patrick Raynaud
Written suggestions regarding this guide or development of new guides may be submitted through the NRC's public Web site in the NRC Library at https://nrcweb.nrc.gov/reading-rm/doc-collections/reg-guides/, under Document Collections, in Regulatory Guides, at https://nrcweb.nrc.gov/reading-rm/doc-collections/reg-guides/contactus.html.
Electronic copies of this RG, previous versions of RGs, and other recently issued guides are also available through the NRC's public Web site in the NRC Library at https://nrcweb.nrc.gov/reading-rm/doc-collections/reg-guides/, under Document Collections, in Regulatory Guides. This RG is also available through the NRC's Agencywide Documents Access and Management System (ADAMS) at http://www.nrc.gov/reading-rm/adams.html, under ADAMS Accession Number (No.) ML21334A158. The regulatory analysis may be found in ADAMS under Accession No. ML21034A261. The associated draft guide DG-1382 may be found in ADAMS under Accession No. ML21034A328, and the staff responses to the public comments on DG-1382 may be found under ADAMS Accession No. ML21306A292.
PREPARING PROBABILISTIC FRACTURE MECHANICS SUBMITTALS
A. INTRODUCTION
Purpose
This regulatory guide (RG) describes a framework to develop the contents of a licensing submittal that the staff of the U.S. Nuclear Regulatory Commission (NRC) considers acceptable when performing probabilistic fracture mechanics (PFM) analyses in support of regulatory applications.
Applicability
This RG applies to nonreactor and reactor licensees that elect to use PFM as part of the technical basis for a licensing action. PFM could be used in a wide variety of applications to meet a wide range of regulations; consequently, it is not possible to provide a comprehensive list of such applications.
However, the staff anticipates that this RG could apply to nonreactor and reactor licensees subject to Title 10 of the Code of Federal Regulations (10 CFR) Part 50, Domestic Licensing of Production and Utilization Facilities (Ref. 1); 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants (Ref. 2); 10 CFR Part 71, Packaging and Transportation of Radioactive Material (Ref. 3); and 10 CFR Part 72, Licensing Requirements for the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste (Ref. 4).
Applicable Regulations
This RG discusses acceptable ways to present PFM analyses in regulatory submittals to the NRC and can be used to demonstrate compliance with a wide range of regulations. The following is a list of regulations where PFM may be a useful tool for developing the technical basis for submittals. Potential uses of PFM in regulatory applications are not limited to the following list of regulations.
10 CFR Part 50, Domestic Licensing of Production and Utilization Facilities, applies to applicants for, and holders of, licenses for production and utilization facilities.
o 10 CFR 50.55a: Codes and standards.
o 10 CFR 50.60: Acceptance criteria for fracture prevention measures for light-water nuclear power reactors for normal operation.
o 10 CFR 50.61: Fracture toughness requirements for protection against pressurized thermal shock events.
o 10 CFR 50.61a: Alternate fracture toughness requirements for protection against pressurized thermal shock events. | |||
o 10 CFR 50.66: Requirements for thermal annealing of the reactor pressure vessel. | |||
o 10 CFR 50.69: Risk-informed categorization and treatment of structures, systems, and components for nuclear power reactors. | |||
o Appendix G to Part 50: Fracture Toughness Requirements. | |||
o Appendix H to Part 50: Reactor Vessel Material Surveillance Program Requirements. | |||
10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants, applies to applicants for, and holders of, early site permits, standard design certifications, combined licenses, standard design approvals, and manufacturing licenses for nuclear power facilities. | |||
o Appendix A to Part 52: Design Certification Rule for the U.S. Advanced Boiling Water Reactor. | |||
o Appendix B to Part 52: Design Certification Rule for the System 80+ Design. | |||
o Appendix C to Part 52: Design Certification Rule for the AP600 Design. | |||
o Appendix D to Part 52: Design Certification Rule for the AP1000 Design. | |||
o Appendix E to Part 52: Design Certification Rule for the ESBWR Design. | |||
o Appendix F to Part 52: Design Certification Rule for the APR1400 Design. | |||
10 CFR Part 71, Packaging and Transportation of Radioactive Material, provides requirements for packaging, preparation for shipment, and transportation of licensed material. | |||
o 10 CFR 71.43: General standards for all packages. | |||
o 10 CFR 71.45: Lifting and tie-down standards for all packages. | |||
o 10 CFR 71.51: Additional requirements for Type B packages. | |||
o 10 CFR 71.55: General requirements for fissile material packages. | |||
o 10 CFR 71.64: Special requirements for plutonium air shipments. | |||
o 10 CFR 71.71: Normal conditions of transport. | |||
o 10 CFR 71.73: Hypothetical accident conditions. | |||
o 10 CFR 71.74: Accident conditions for air transport of plutonium. | |||
o 10 CFR 71.75: Qualification of special form radioactive material. | |||
10 CFR Part 72, Licensing Requirements for the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste, provides requirements, procedures, and criteria for the issuance of licenses to receive, transfer, and possess power reactor spent fuel, power reactor-related Greater than Class C (GTCC) waste, and other radioactive materials associated with spent fuel storage in an independent spent fuel storage installation (ISFSI) and the terms and conditions under which the Commission will issue these licenses.
o 10 CFR 72.122: Overall requirements. | |||
Related Guidance
PFM is likely to be used to risk-inform licensing applications. Consequently, guidance documents such as the following related to risk-informed activities as well as probabilistic risk assessment (PRA) may be related to this RG:
NUREG-0800, Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition (SRP), Chapter 19, Severe Accidents, Section 19.2, Review of Risk Information Used To Support Permanent Plant-Specific Changes to the Licensing Basis: General Guidance (Ref. 5), provides general guidance on applications that address changes to the licensing basis.
NUREG/CR-7278, Technical Basis for the Use of Probabilistic Fracture Mechanics in Regulatory Applications (Ref. 6).
RG 1.174, An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis (Ref. 7).
RG 1.175, An Approach for Plant-Specific, Risk-Informed Decisionmaking: Inservice Testing (Ref. 8). | |||
RG 1.178, An Approach for Plant-Specific Risk-Informed Decisionmaking for Inservice Inspection of Piping (Ref. 9). | |||
RG 1.200, Acceptability of Probabilistic Risk Assessment Results for Risk-Informed Activities (Ref. 10). | |||
RG 1.201, Guidelines for Categorizing Structures, Systems, and Components in Nuclear Power Plants According to Their Safety Significance (Ref. 11), discusses an approach to support the new rule established as 10 CFR 50.69, Risk-Informed Categorization and Treatment of Structures, Systems, and Components for Nuclear Power Reactors. | |||
Purpose of Regulatory Guides
The NRC issues RGs to describe methods that are acceptable to the staff for implementing specific parts of the agency's regulations, to explain techniques that the staff uses in evaluating specific issues or postulated events, and to describe information that will assist the staff with its review of applications for permits and licenses. Regulatory guides are not NRC regulations and compliance with them is not mandatory. Methods and solutions that differ from those set forth in RGs are acceptable if supported by a basis for the issuance or continuance of a permit or license by the Commission.
Paperwork Reduction Act
This RG provides voluntary guidance for implementing the mandatory information collections in 10 CFR Parts 50, 52, 71, and 72 that are subject to the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et seq.). These information collections were approved by the Office of Management and Budget (OMB),
approval numbers 3150-0011, 3150-0151, 3150-0132, and 3150-0008. Send comments regarding this information collection to the FOIA, Library and Information Collections Branch (T6-A10M), U.S.
Nuclear Regulatory Commission, Washington, DC 20555-0001, or by e-mail to Infocollects.Resource@nrc.gov, and to the OMB reviewer at: OMB Office of Information and Regulatory Affairs (3150-0011, 3150-0151, 3150-0132, and 3150-0008), Attn: Desk Officer for the Nuclear Regulatory Commission, 725 17th Street, NW, Washington, DC 20503; e-mail:
oira_submission@omb.eop.gov.
Public Protection Notification
The NRC may not conduct or sponsor, and no person is required to respond to, a collection of information unless the document requesting or requiring the collection displays a currently valid OMB control number.
TABLE OF CONTENTS
Purpose
Applicability
Applicable Regulations
Related Guidance
Purpose of Regulatory Guides
Paperwork Reduction Act
Public Protection Notification
Reason for Issuance
Background
Consideration of International Standards
Regulatory Position C.1
1. General Considerations
1.1. Graded Approach
1.2. Analytical Steps in a Probabilistic Fracture Mechanics Analysis
Regulatory Position C.2
2. Probabilistic Fracture Mechanics Analysis and Submittal Contents
2.1. Regulatory Context
2.2. Information Made Available to the NRC Staff with a Probabilistic Fracture Mechanics Submittal
2.3. Quantities of Interest and Acceptance Criteria
2.4. Software Quality Assurance and Verification and Validation
2.5. Models
2.6. Inputs
2.7. Uncertainty Propagation
2.8. Convergence
2.9. Sensitivity Analyses
2.10. Quantity of Interest Uncertainty Characterization
2.11. Sensitivity Studies
B. DISCUSSION
Reason for Issuance
The NRC developed this RG to provide guidance for the use of PFM in regulatory applications. It is intended to ensure that the staff guidance is clear with regard to the contents of PFM regulatory applications. The use of this RG is anticipated to increase the efficiency of reviews for regulatory applications that use PFM as a supporting technical basis by providing a set of common guidelines for reviewers and licensees.
===Background===
In 2018, the NRC published the technical letter report, Important Aspects of Probabilistic Fracture Mechanics Analyses (Ref. 16), to outline the important concepts for using PFM in support of regulatory applications and held a public meeting to discuss this technical letter report. Following the October 23, 2018, public meeting on Discussion of a graded approach for probabilistic fracture mechanics codes and analyses for regulatory applications (Ref. 17), the Electric Power Research Institute (EPRI) developed a proposal on the minimum contents of a submittal that uses PFM as part of its technical basis. Some licensees have submitted licensing applications claiming to have followed the EPRI minimum requirements. However, in reviewing these submittals, the NRC has found that the minimum requirements in the EPRI proposal are not always clear and concise. Further, the EPRI document does not precisely define its guidance when the minimum requirements specified in the EPRI guidance are not sufficient, leading to ambiguity, inefficient reviews, and uncertainty in regulatory outcomes. Importantly, the NRC staff accounted for EPRI's proposal when developing this RG. In 2021, the NRC published, concurrently with this RG, NUREG/CR-7278, Technical Basis for the Use of Probabilistic Fracture Mechanics in Regulatory Applications (Ref. 6), which constitutes the detailed technical basis for this RG.
The NRC has an approved methodology for risk-informed decision making for design-basis changes (Ref. 7), and PFM may be used as a tool within that framework. The purpose of PFM is to model the behavior and degradation of systems more accurately and consequently draw more precise and accurate conclusions about situations relative to performance criteria or design assumptions.
Consideration of International Standards
The International Atomic Energy Agency (IAEA) works with member states and other partners to promote the safe, secure, and peaceful use of nuclear technologies. The IAEA develops safety requirements and safety guides for protecting people and the environment from harmful effects of ionizing radiation. These requirements and guides provide a system of safety standards categories that reflect an international perspective on what constitutes a high level of safety. In developing or updating RGs, the NRC has considered IAEA safety requirements, safety guides, and other relevant reports in order to benefit from the international perspectives, pursuant to the Commission's International Policy Statement (Ref. 18) and NRC Management Directive and Handbook 6.6 (Ref. 19).
The following IAEA safety requirements and guides were considered in the development or update of the RG:
International Atomic Energy Agency, Development and Application of Level 1 Probabilistic Safety Assessment for Nuclear Power Plants, IAEA Safety Standards Series No. SSG-3, IAEA, Vienna (2010) (Ref. 20). | |||
C. STAFF REGULATORY GUIDANCE
This section describes methods, approaches, and information that the NRC staff considers acceptable for performing PFM analyses and preparing the associated documentation in support of regulatory applications. To enhance the efficiency of the NRC's review of PFM submittals, the staff recommends that applicants using the framework presented in this guidance document identify any deviations in their application and provide explanations for each deviation.
Regulatory Position C.1
1. General Considerations
1.1. Graded Approach
For regulatory submittals to the NRC that use PFM as part of their supporting technical basis, the level of detail associated with the analysis and documentation activities should scale with the complexity and safety significance of the application, as well as the complexity of the supporting analysis (including methods and analysis tools).
This RG provides a graded set of guidelines on analysis steps, analytical software quality assurance (SQA) and verification and validation (V&V), and levels of associated documentation. When followed, these guidelines result in an acceptable practical framework for the content of PFM submittals to ensure the effectiveness and efficiency of NRC reviews of submittals containing PFM analysis and results.
1.2. Analytical Steps in a Probabilistic Fracture Mechanics Analysis
Applicants should follow the process charted in Figure C-1 when performing PFM analyses in support of regulatory applications. NUREG/CR-7278 describes the steps shown in detail, and Table C-1 shows the cross-referencing between the sections of this RG and the corresponding sections of NUREG/CR-7278.
Figure C-1: PFM analysis flowchart in support of regulatory submittals (corresponding sections of this guide are shown in parentheses when applicable)
Table C-1: Submittal Content Mapping to NUREG/CR-7278
RG Section | NUREG/CR-7278 Sections | Content
2.1 | 3.1.1 | Regulatory Context
2.2 | | Information Made Available to NRC Staff
2.2.1 | 3.1.3 | PFM Software
2.2.2 | 3.1.3 | Supporting Documents
2.3 | 2.2.1 / 3.1.2 | Quantities of Interest and Acceptance Criteria
2.4 | 2.2.2 / 3.1.3 | SQA and V&V
2.5 | 2.2.3 / 3.1.3 | Models
2.6 | 3.2.1 / 3.2.2 / 3.3.1 / 3.4.1 | Inputs
2.7 | 3.3.1 | Uncertainty Propagation
2.8 | 3.3.2 | Convergence
2.9 | 3.3.3 | Sensitivity Analyses
2.10 | 3.3.4 | Output Uncertainty Characterization
2.11 | 3.4.1 / 3.4.2 | Sensitivity Studies
Regulatory Position C.2
2. Probabilistic Fracture Mechanics Analysis and Submittal Contents
Each subsection in this Regulatory Position relates to an item expected in a submittal. The content in each subsection comes from, in large part, the suggested minimum content for PFM submittals that was developed in EPRI's white paper (Ref. 17). Tables in each subsection provide guidance for different documentation expectations. Each table contains circumstances under which specific information should be provided for a complete submittal. It is important to note that submittals should be informed by the specific details and elements of each analysis and need not include all of the listed elements, though careful consideration should be applied to arrive at that conclusion.
2.1. Regulatory Context
Regulatory submittals using PFM analyses should explain why a probabilistic approach is appropriate and how the probabilistic approach is used to demonstrate compliance with the regulatory criteria. When no specific regulatory acceptance criteria exist, the submittal should explain how the probabilistic approach informs the regulatory action and regulatory compliance demonstration. Applicants should be aware that the use of PFM in a regulatory submission is only one aspect of what is required for risk-informed decision making.
2.2. Information Made Available to the NRC Staff with a Probabilistic Fracture Mechanics Submittal
Applicants should make information supporting the submittal available for review. The NRC encourages applicants to discuss the contents of their submittal with the agency during pre-submittal meetings (the timing of such meetings is left to the applicant, but it could be desirable to schedule such meetings early in the lifecycle of the application) or other formal communications. Pre-submittal discussions should cover information that will be provided with the submittal, information that might be provided upon request, and information that may not be directly transmittable but might be reviewed under specific agreed-upon circumstances, such as an audit.
2.2.1 Probabilistic Fracture Mechanics Software
In case the NRC staff determines that it is unable to perform independent confirmatory calculations or independent benchmarking of the PFM analyses, the applicant should ensure that an alternate approach is available to NRC reviewers. This will preferably be determined during preapplication meetings but can also be identified during the review. Such approaches may include one or more of the following options:
Provide NRC reviewers with direct access to the PFM software executable program and the necessary user instructions to use the tools. | |||
Ensure the availability of analysts during NRC audits or NRC review meetings, such that the PFM submittal developers can run analysis cases as requested by NRC reviewers. | |||
Allow the NRC reviewers to submit analysis requests to the applicant and provide the NRC reviewers with the results of the analyses, possibly as part of an audit or review meeting. | |||
2.2.2 Probabilistic Fracture Mechanics Software Quality Assurance and Verification and Validation Documents
The applicant should ensure that the SQA and V&V documentation for the PFM software is available for NRC review through in-person or virtual audits, if requested by the NRC.
2.3. Quantities of Interest and Acceptance Criteria
The quantities of interest (QoIs) to be compared to the acceptance criteria should be clearly defined and include the following:
o the units of measurement and time period,
o the relationship to the outputs of the PFM software,
o the acceptance criteria, and
o if the QoI is a probability in the extreme tails of the distribution, a description of how this affected the analysis choices.
The use of previously approved acceptance criteria (if already in existence for the specific application at hand) is encouraged but should be appropriately justified and explained. Specifically, the applicant should ensure that inherent assumptions and requirements of the source activity are respected, and that any apparent differences are reconciled. If there is no precedent for an acceptance criterion, the applicant should derive probabilistic acceptance criteria based on risk-informed decision-making principles in accordance with RG 1.200 and RG 1.174, if applicable, and describe the bases for the chosen acceptance criteria. | The use of previously approved acceptance criteria (if already in existence for the specific application at hand) is encouraged but should be appropriately justified and explained. Specifically, the applicant should ensure that inherent assumptions and requirements of the source activity are respected, and that any apparent differences are reconciled. If there is no precedent for an acceptance criterion, the applicant should derive probabilistic acceptance criteria based on risk-informed decision-making principles in accordance with RG 1.200 and RG 1.174, if applicable, and describe the bases for the chosen acceptance criteria. | ||
If the PFM submittal includes more than one QoI, the applicant should document the above steps and information for each QoI. | If the PFM submittal includes more than one QoI, the applicant should document the above steps and information for each QoI. | ||
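To make the items above concrete, the following is a minimal sketch, not drawn from this RG or from any NRC-approved PFM code, of how a QoI with an explicit unit and evaluation period might be extracted from raw PFM outputs and compared against an acceptance criterion. The output array, the 60-year period, and the criterion value are assumptions made only for illustration.
<syntaxhighlight lang="python">
import numpy as np

# Hypothetical illustration only: the output array, the 60-year evaluation period,
# and the acceptance criterion are assumptions, not values from RG 1.245 or any
# NRC-approved PFM code.

rng = np.random.default_rng(seed=3)
n_realizations = 200_000

# Stand-in for a PFM code output: simulated time to through-wall crack (TWC) in years;
# np.inf marks realizations in which no through-wall crack occurs.
has_twc = rng.random(n_realizations) < 1.0e-3
time_to_twc = np.where(has_twc, rng.uniform(1.0, 80.0, n_realizations), np.inf)

evaluation_period_years = 60.0
qoi = np.mean(time_to_twc <= evaluation_period_years)  # QoI: probability of TWC within 60 years

acceptance_criterion = 1.0e-2  # assumed probabilistic criterion for illustration
print(f"QoI = {qoi:.2e} (P[TWC within {evaluation_period_years:.0f} y]); "
      f"criterion = {acceptance_criterion:.0e}; met: {qoi < acceptance_criterion}")
</syntaxhighlight>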
2.4. Software Quality Assurance and Verification and Validation
In general, PFM software to be used in regulatory applications should be developed under the framework of an SQA plan and undergo V&V activities. The SQA and V&V activities should depend on the safety significance, complexity, past experience (previous use in regulatory applications), and status of previous approval for the PFM software. The applicant should follow its SQA program and V&V procedures with these concepts in mind.
The applicant should determine to which Table C-2 category its PFM software belongs. The applicant should consider discussing its choice of categorization with the NRC in pre-submittal meetings to ensure that an acceptable determination has been made. | The applicant should determine to which Table C-2 category its PFM software belongs. The applicant should consider discussing its choice of categorization with the NRC in pre-submittal meetings to ensure that an acceptable determination has been made. | ||
The applicant should perform and document SQA and V&V activities for the PFM software according to the guidelines in Table C-2. The NRC has approved applications using the following codes for a specific range of applications as of the publication of this RG: | The applicant should perform and document SQA and V&V activities for the PFM software according to the guidelines in Table C-2. The NRC has approved applications using the following codes for a specific range of applications as of the publication of this RG: | ||
o the latest version of FAVOR; see the FAVOR theory manual (Ref. 12) for the validated application range
o the latest version of xLPR; see the xLPR documentation (Ref. 21) for the validated application range
o the version of the SRRA (Structural Reliability and Risk Assessment) code approved in Ref. 22, for the range approved in the referenced safety evaluation.
There may be instances where a code was approved only for a specific application; such codes are considered approved for the same exact type of application.
Table C-2: SQA and V&V Code Categories
QV-1: Code used in NRC-approved application (a)
QV-1A: Exercised within previously validated range
* Demonstrate code applicability within the validated range.
* Describe features of the specific application where the code is validated and applicable (i.e., areas of known code capability).
QV-1B: Exercised outside of previously validated range
* Provide evidence for the applicability of the code to the specific application with respect to the areas of unknown code capability.
* Describe features of the specific application where the code has not been previously validated and applied (i.e., areas of unknown code capability).
QV-1C: Modified
* Give an SQA summary and V&V description for modified portions of the code.
* Demonstrate that the code was not broken as a result of changes.
* Make detailed documentation available for further review upon request (audit).
QV-2: Commercial off-the-shelf software designed for the specific purpose of the application (b)
* Demonstrate code applicability.
* Describe the software and its pedigree.
* Make software and documentation available for review upon request (audit).
QV-3: Custom code
* Summarize the SQA program and its implementation.
* Provide a basic description of the measures for quality assurance, including V&V of the PFM analysis code as applied in the subject report.
* For very simple applications, possibly provide the source code instead of standardized SQA and V&V.
* Include separate deterministic fracture mechanics analyses to support other validation results, as appropriate for a given application.
(a) As of the publication of this RG, PFM codes used in NRC-approved applications or having received general approval within a validated range include xLPR, FAVOR, and SRRA.
(b) Examples would include publicly available (for purchase or free) commercial software specifically to perform PFM analyses. Combinations of commercial off-the-shelf software may be acceptable (e.g., a finite-element software such as ABAQUS or ANSYS coupled with a probabilistic framework such as GoldSim or DAKOTA).
2.5. Models
The applicant should describe all the models implemented and used as part of the PFM analysis and software. Each model should be assessed independently and categorized as shown in Table C-3. The applicant should follow the submittal guidelines in Table C-3 for each model in the PFM software and/or analysis, based on the model category.
Table C-3: Submittal Guidelines for Models
M-1: Model from a code in category QV-1A within the same validated range
* Reference existing documentation for that model in the NRC-approved code, demonstrate that the current range of the model is within the previously approved and validated range, and demonstrate that the model functions as intended in the new software.
M-2: Model from a code in category QV-1B outside the validated range
* See the submittal guidelines for M-1, except demonstrate validity of the model for the new applicability range (document a comparison of model predictions for the entire new range to applicable supporting data, predictions made using alternative models, and/or using engineering judgment, optionally supported by quantitative goodness-of-fit analyses).
M-3: Model derived from a category M-1 or M-2 model
* See the submittal guidelines for M-2 and include a detailed description of changes to the M-1 or M-2 model, with justification for the validity of the new model.
M-4: Well-established model not previously part of an NRC-approved code
* Provide justification for model as being well-established by supporting references and engineering judgement.
* Describe gaps and limitations in the code capabilities for the analysis, combined with a strategy for mitigating identified gaps and communicating any remaining issues or risks.
* Describe the model(s) applied in the PFM analysis code in sufficient detail so a competent analyst familiar with the relevant subject area could independently implement the model(s) from the documentation alone. Model forms can either be theoretical, semiempirical, or empirical.
* Identify important uncertainties or conservatisms.
* Describe the computational expense of the model and how that might affect analysis choices.
M-5: First-of-a-kind model not yet published in a peer-reviewed journal
* See the submittal guidelines for M-4, and perform and document model sensitivity studies to understand trends in the model, as compared to expected model behavior and to the data used to develop the model, and describe model maturity and the status of the technical basis.
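As one hedged illustration of the quantitative goodness-of-fit analyses mentioned for category M-2, the sketch below compares predictions of an invented semiempirical model form against invented supporting data over an extended range and reports root-mean-square error and R-squared. Neither the model nor the data correspond to any validated fracture mechanics model.
<syntaxhighlight lang="python">
import numpy as np

# Hypothetical illustration only: the data points and the semiempirical model form
# y = a + b*ln(x) are invented and do not represent any validated fracture
# mechanics model or qualified database.

x_data = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])        # explanatory variable over the extended range
y_data = np.array([10.2, 14.1, 16.8, 19.2, 20.9, 22.4])  # invented supporting measurements

def model(x):
    """Assumed semiempirical model form for the extended applicability range."""
    return 10.0 + 7.0 * np.log(x)

residuals = y_data - model(x_data)

rmse = float(np.sqrt(np.mean(residuals**2)))
ss_res = float(np.sum(residuals**2))
ss_tot = float(np.sum((y_data - y_data.mean())**2))
r_squared = 1.0 - ss_res / ss_tot

print(f"RMSE = {rmse:.3f}, R^2 = {r_squared:.3f}")  # quantitative goodness-of-fit summary
</syntaxhighlight>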
2.6. Inputs
For each QoI, the applicant should categorize each input in the PFM software or analysis as shown in Table C-4. In Table C-4, knowledge refers to the depth of information available to prescribe either the deterministic inputs or the distributions on the uncertain inputs. Importance refers to the relative effect of the input on the QoI. To determine the input category, the applicant should consider the following:
o whether the input is deterministic or uncertain,
o how much knowledge is available about the input, and
o the input's importance with regard to the QoI.
Throughout the analysis, the applicant should continuously assess the relative importance of inputs on the QoI and revisit the assumptions and choices made for the most important inputs to confirm their validity. The applicant should also consider the use of sensitivity studies to show the impact (or lack thereof) of some of the key assumptions made for the inputs that are most important to the outcome of the analyses. | Throughout the analysis, the applicant should continuously assess the relative importance of inputs on the QoI and revisit the assumptions and choices made for the most important inputs to confirm their validity. The applicant should also consider the use of sensitivity studies to show the impact (or lack thereof) of some of the key assumptions made for the inputs that are most important to the outcome of the analyses. | ||
Table C-4: Categorization Based on Knowledge and the Importance of Inputs Used in the Analysis
Input Category | Low Knowledge, Deterministic | Low Knowledge, Uncertain | High Knowledge, Deterministic | High Knowledge, Uncertain
High Importance | I-4D | I-4R | I-3D | I-3R
Low Importance | I-2D | I-2R | I-1D | I-1R
For guidance on the documentation of inputs, the applicant should refer to Table C-5. The applicant should provide the information recommended in Table C-5 for each input based on each input's independent categorization.
Table C-5: Submittal Guidelines for Inputs
I-1D:
* List input value.
I-1R:
...
I-4R:
* See the submittal guidelines for I-3R.
* If there is a lack of data, justify the use of expert judgment.
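The following is a minimal sketch, assuming hypothetical input names, values, distributions, and Table C-4 category labels, of one way an applicant might record and sample the deterministic versus uncertain input characterization that Tables C-4 and C-5 ask to be documented.
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Hypothetical illustration only: input names, values, distributions, and the
# Table C-4 category labels below are illustrative assumptions, not prescriptions
# from RG 1.245.

inputs = {
    # name: (assumed Table C-4 category, specification)
    "yield_strength_MPa":  ("I-3R", stats.norm(loc=480.0, scale=25.0)),   # uncertain, documented distribution
    "flaw_density_per_m3": ("I-4R", stats.lognorm(s=0.8, scale=1.0e-4)),  # uncertain, limited knowledge
    "wall_thickness_mm":   ("I-1D", 220.0),                               # deterministic value
}

rng = np.random.default_rng(seed=7)   # record the generator and seed for reproducibility
n = 10_000

samples = {}
for name, (category, spec) in inputs.items():
    if hasattr(spec, "rvs"):          # uncertain input: sample its documented distribution
        samples[name] = spec.rvs(size=n, random_state=rng)
    else:                             # deterministic input: held fixed at its listed value
        samples[name] = np.full(n, float(spec))

for name, (category, _) in inputs.items():
    print(f"{name:22s} [{category}] mean = {samples[name].mean():.4g}")
</syntaxhighlight>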
2.7. Uncertainty Propagation
The applicant should document the methods used to propagate uncertainty through the PFM model such that analysis results may be reproduced. The applicant should determine the PFM analysis uncertainty propagation category, as shown in Table C-6. The applicant should follow the guidelines in Table C-6 to document how uncertainties are propagated in the PFM analysis. If the submittal presents several analyses, the applicant should determine the category for each analysis and document the uncertainty propagation for each analysis according to the guidelines in Table C-6.
Table C-6: Submittal Guidelines for Uncertainty Propagation
UP-1: Analysis does not employ a surrogate model
* Give the method for uncertainty propagation and describe the simulation framework. | |||
* If Monte Carlo sampling is used, describe the finalized sampling scheme and rationale for the sampling scheme, including sampling method, sample size, the pseudo-random number generation method, and the random seeds used. | * If Monte Carlo sampling is used, describe the finalized sampling scheme and rationale for the sampling scheme, including sampling method, sample size, the pseudo-random number generation method, and the random seeds used. | ||
* Describe the approach for maintaining separation of aleatory and epistemic uncertainties, if applicable. | * Describe the approach for maintaining separation of aleatory and epistemic uncertainties, if applicable. | ||
* If importance sampling is used to oversample important regions of the input space, justify the choice of importance distribution. | * If importance sampling is used to oversample important regions of the input space, justify the choice of importance distribution. | ||
UP-2: Analysis does employ a surrogate model
* See the submittal guidelines for UP-1 and describe the form of the surrogate model(s), any approximations or assumptions, the method used for fitting the surrogate, and the validation process for the surrogate model.
UP-2A: Surrogate model used for sensitivity analysis
* See the submittal guidelines for UP-2 and describe the features of the different surrogate models used.
UP-2B: Surrogate model is used for uncertainty propagation
* See the submittal guidelines for UP-2 and quantify the magnitude of error associated with the surrogate model approximation and include it as additional uncertainty in the estimation of the QoI.
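The following is a minimal sketch of the kind of sampling documentation called for under UP-1: a plain Monte Carlo propagation with an explicitly recorded random-number generator, seed, and sample size. The toy limit state, input distributions, and numerical values are invented for illustration and are not taken from any NRC-approved PFM code.
<syntaxhighlight lang="python">
import numpy as np

# Hypothetical illustration only: a toy limit state in which "failure" occurs when an
# applied stress intensity factor exceeds the sampled fracture toughness. The input
# distributions, values, and unit geometry factor are invented assumptions.

rng = np.random.default_rng(seed=20220101)  # UP-1: record the generator and seed
n_samples = 1_000_000                       # UP-1: record the sample size

# Sampled uncertain inputs (illustrative distribution choices)
toughness = rng.lognormal(mean=np.log(120.0), sigma=0.15, size=n_samples)     # MPa*sqrt(m)
flaw_depth_m = rng.lognormal(mean=np.log(0.02), sigma=0.40, size=n_samples)   # m
stress = rng.normal(loc=200.0, scale=30.0, size=n_samples)                    # MPa

# Simplified applied stress intensity factor (geometry factor of 1 assumed)
k_applied = stress * np.sqrt(np.pi * flaw_depth_m)  # MPa*sqrt(m)

p_failure = np.mean(k_applied > toughness)  # estimated QoI from plain Monte Carlo sampling
print(f"Estimated failure probability: {p_failure:.2e} from {n_samples} samples")
</syntaxhighlight>
If aleatory and epistemic uncertainties were treated separately, the single sampling loop above would typically be replaced by a nested (double-loop) scheme, with the epistemic samples drawn in the outer loop.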
2.8. Convergence
To assess the convergence of the QoI estimate, the applicant's documentation should demonstrate the convergence for any discretization used in the analysis (e.g., time step, spatial discretization), as well as statistical convergence based on the sample size and sampling method used in the probabilistic analysis. The primary goal should be to show that the conclusions of the analysis would not change significantly if the applicant used a reasonably more refined discretization or a larger sample size.
To demonstrate and document discretization convergence, the applicant should do the following: | To demonstrate and document discretization convergence, the applicant should do the following: | ||
* For PFM codes in category QV-1A, the applicant need not document discretization convergence, but analysts should nonetheless verify that discretization convergence is achieved.
* For cases where the use of a QV-1 code exercised outside of the validated range (i.e., QV-1B) may directly impact discretization convergence, verification should be documented.
* For new or modified codes (categories QV-1C, QV-2, and QV-3), the applicant should document the approach used for assessing discretization convergence and demonstrate and document that a more refined discretization does not significantly affect the outcome of the analysis.
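As one possible way to demonstrate the refinement check described above, the following sketch integrates a toy fatigue crack-growth model (Paris law with invented constants; not a model from the RG) at two cycle-step sizes and compares the resulting QoI; a small relative change under refinement supports a claim of discretization convergence.

```python
import numpy as np

# Toy crack-growth model: Paris law da/dN = C * (dK)^m, integrated with explicit Euler.
# All constants and the loading are hypothetical; only the refinement-check pattern matters.
C, m = 1.0e-11, 3.0                  # Paris-law constants (m/cycle with dK in MPa*sqrt(m))
d_sigma, Y = 100.0, 1.12             # cyclic stress range (MPa) and geometry factor
a0, total_cycles = 0.002, 2.0e5      # initial crack depth (m) and total number of cycles

def final_crack_depth(cycles_per_step):
    """Integrate crack growth using a fixed cycle increment (the 'discretization')."""
    a = a0
    for _ in range(int(total_cycles // cycles_per_step)):
        dK = Y * d_sigma * np.sqrt(np.pi * a)
        a += C * dK**m * cycles_per_step
    return a

coarse = final_crack_depth(cycles_per_step=2_000)   # 100 steps
fine   = final_crack_depth(cycles_per_step=200)     # 1,000 steps
rel_change = abs(fine - coarse) / fine
print(f"coarse: {coarse:.5f} m, fine: {fine:.5f} m, relative change: {rel_change:.2%}")
```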
To demonstrate and document statistical convergence, the applicant should follow the graded approach described in Table C-7. Figure C-2 illustrates the decision tree for the statistical convergence categories. | To demonstrate and document statistical convergence, the applicant should follow the graded approach described in Table C-7. Figure C-2 illustrates the decision tree for the statistical convergence categories. | ||
Figure C-2: Decision tree for statistical convergence categories. | Figure C-2: Decision tree for statistical convergence categories. | ||
Table C-7: Submittal Guidelines for Statistical Convergence

SC-1 a: [Acceptance criteria met with at least one order of magnitude margin] AND [no importance sampling AND no surrogate models used]
* No sampling uncertainty characterization recommended as long as the uncertainty is sufficiently small relative to the margin. b

SC-2A: [Acceptance criteria met with at least one order of magnitude margin] AND [use of importance sampling OR surrogate models OR both]
* Describe the approach used for assessing statistical convergence, with one method needed for sampling uncertainty characterization.
* Explain the approach used for characterizing sampling uncertainty.
* Justify why the sampling uncertainty is small enough for the intended purpose (i.e., why statistical convergence is sufficient for the intended purpose).
* Describe how sampling uncertainty is used in the interpretation of the results.

SC-2B: [Acceptance criteria met with at least one order of magnitude margin] AND [use of importance sampling OR surrogate models OR both] AND [separation of aleatory and epistemic uncertainties is implemented in the PFM code]
* See the submittal guidelines for SC-2A and distinguish between epistemic and aleatory means and standard deviations.

SC-3A: [Acceptance criteria met with less than one order of magnitude margin]
* See the submittal guidelines for SC-2A and provide two different methods for sampling uncertainty characterization.

SC-3B: [Acceptance criteria met with less than one order of magnitude margin] AND [separation of aleatory and epistemic uncertainties is implemented in the PFM code]
* See the submittal guidelines for SC-3A and give a sample size convergence analysis for both the aleatory and epistemic sample sizes.

a Data type may have an impact on the convergence category. Continuous outputs can be category SC-1, but binary outputs inherently must be category SC-2 or SC-3 unless epistemic and aleatory uncertainties are separated.
b Some assessment of uncertainty is necessary, even if qualitative, as long as the uncertainty itself is understood to be small.
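As a minimal sketch of the "two different methods for sampling uncertainty characterization" that category SC-3A asks for, the following compares an analytic binomial standard error with a replicate-seed estimate for a toy failure-probability calculation; the limit state and all numbers are invented for illustration and are not taken from the RG.

```python
import numpy as np

def estimate_pf(n_samples, seed):
    """Toy Monte Carlo failure-probability estimate (hypothetical limit state)."""
    rng = np.random.default_rng(seed)
    sigma = rng.lognormal(np.log(250.0), 0.15, n_samples)        # stress, MPa
    a     = rng.lognormal(np.log(0.020), 0.25, n_samples)        # crack depth, m
    K_Ic  = rng.normal(80.0, 10.0, n_samples)                    # toughness, MPa*sqrt(m)
    return np.mean(1.12 * sigma * np.sqrt(np.pi * a) >= K_Ic)

N = 50_000

# Method 1: analytic standard error of a binomial proportion, sqrt(p*(1-p)/N).
p_hat = estimate_pf(N, seed=1)
se_analytic = np.sqrt(p_hat * (1.0 - p_hat) / N)

# Method 2: replicate runs with independent seeds; spread of the resulting estimates.
replicates = np.array([estimate_pf(N, seed=s) for s in range(2, 12)])
se_replicates = replicates.std(ddof=1)

print(f"p_hat = {p_hat:.4f}")
print(f"sampling std. error: analytic = {se_analytic:.1e}, replicate-based = {se_replicates:.1e}")
# If the sampling uncertainty is small relative to the margin to the acceptance
# criterion, the estimate can be argued to be statistically converged.
```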
2.9. Sensitivity Analyses

In most cases, the applicant should perform sensitivity analyses to identify the inputs that drive the QoI uncertainty. The applicant should assess its PFM software and analysis to determine the sensitivity analysis category shown in Table C-8. The applicant should follow the guidelines in Table C-8 to document the details of sensitivity analyses. If the combination of PFM software and analysis belongs to category SA-1 in Table C-8, the NRC does not recommend performing sensitivity analyses.

If the submittal presents several PFM analyses, the applicant should determine the sensitivity analysis category for each PFM analysis and document sensitivity analyses for each PFM analysis according to the guidelines in Table C-8.
Table C-8: Submittal Guidelines for Sensitivity Analyses

SA-1: Previously approved code (QV-1A, QV-1B) with same QoI characteristic and same input parameters. b Sensitivity analysis needed? a No.
* Describe important input and measure of input importance from previous use.

SA-2: Previously approved code (QV-1A, QV-1B) with different QoI. Sensitivity analysis needed? Yes.
* Explain the methods used for sensitivity analysis, including any initial screening and model approximations and assumptions.
* State whether a local or global sensitivity analysis approach is used.
* Give the QoI used for the sensitivity analysis.
* For a global sensitivity analysis, describe the sampling scheme along with the rationale for selection, including the sampling technique, number of model realizations, and random seed for the model realizations.
* Provide the results of the sensitivity analysis, including the most important model inputs identified; a measure of the input importance, such as the variance explained by the most important inputs; and relevant graphical summaries of the sensitivity analysis results.

SA-3: Modified approved code (QV-1C) with limited independent variables (e.g., <5, determined on a case-by-case basis). Sensitivity analysis needed? Yes.
* Describe analyses, important input, and measure of input importance.

SA-4: Modified approved code (QV-1C) with many independent variables (e.g., >5, determined on a case-by-case basis). Sensitivity analysis needed? Yes.
* See the submittal guidelines for SA-2.

SA-5: First-of-a-kind code (QV-2, QV-3) with limited independent variables (e.g., <5, determined on a case-by-case basis). Sensitivity analysis needed? Yes.
* See the submittal guidelines for SA-3.

SA-6: First-of-a-kind code (QV-2, QV-3) with many independent variables (e.g., >5, determined on a case-by-case basis). Sensitivity analysis needed? Yes, with sub-model SA as appropriate.
* See the submittal guidelines for SA-2.
* Indicate how the sensitivity analysis results informed future uncertainty propagation for estimation of the QoI and associated uncertainty.
* State whether the results of the sensitivity analysis are consistent with the expected important inputs based on expert judgment.

a Local sensitivity analysis may be used as a screening step if completing a global sensitivity analysis with all inputs is not computationally feasible (as the cost of performing a global sensitivity analysis increases with the number of inputs). The results from local sensitivity analysis can help reduce the input space for a global sensitivity analysis, but local sensitivity analysis does have its risks in that it can miss important inputs if the input/output relationship is nonlinear. Sensitivity analysis should be performed unless there is a strong basis for what inputs are important (e.g., previous analyses, expert judgment, or it is obvious what inputs are important since it is a simple code).
b Inputs must remain the same because sensitivity is dependent on the input distributions.
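The following sketch shows one simple global screening measure of input importance, using Spearman rank correlations between sampled inputs and a continuous output for the same invented toy limit state used in the earlier sketches. This is an illustration only and is not a method prescribed by the RG; variance-based measures such as Sobol' indices are a more complete global approach.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 20_000

# Hypothetical uncertain inputs (toy limit state; all values invented).
inputs = {
    "stress":      rng.lognormal(np.log(250.0), 0.15, N),   # MPa
    "crack_depth": rng.lognormal(np.log(0.020), 0.25, N),   # m
    "toughness":   rng.normal(80.0, 10.0, N),                # MPa*sqrt(m)
}
# Continuous output used for the sensitivity ranking: margin to failure.
margin = inputs["toughness"] - 1.12 * inputs["stress"] * np.sqrt(np.pi * inputs["crack_depth"])

def ranks(x):
    # Rank transform (adequate here because the continuous samples have no ties).
    return x.argsort().argsort().astype(float)

# Spearman rank correlation of each input with the output as a simple importance measure.
for name, x in inputs.items():
    rho = np.corrcoef(ranks(x), ranks(margin))[0, 1]
    print(f"{name:12s} rank correlation with margin: {rho:+.2f}")
```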
2.10. Quantity of Interest Uncertainty Characterization

The applicant should characterize the uncertainty of the QoI to interpret the results of the analysis. In its description of the QoI uncertainty, the applicant should include a measure of the best estimate and uncertainty of the QoI, a graphical summary of the QoI uncertainty, and a detailed description of how the best estimate and its uncertainty were calculated. The applicant should also summarize the key uncertainties considered in the analysis, as well as any major assumptions (including conservatisms and simplifications) and assess their impact on the analysis conclusions.
The applicant should independently assess each QoI and determine the category for each applicable QoI, as shown in Table C-9. The applicant should follow the guidelines in Table C-9 to document the QoI uncertainty of each QoI, based on the category of each QoI. | The applicant should independently assess each QoI and determine the category for each applicable QoI, as shown in Table C-9. The applicant should follow the guidelines in Table C-9 to document the QoI uncertainty of each QoI, based on the category of each QoI. | ||
Table C-9: Submittal Guidelines for Output Uncertainty Characterization

O-1: Acceptance criteria met with at least one order of magnitude margin
* Give a measure of the best estimate and uncertainty in the QoI.
* Include a graphical display of the output uncertainty.
* Describe how the best estimate and its uncertainty were calculated, including a clear description of the types of uncertainty (e.g., input, sampling, epistemic) being summarized.
* Summarize key uncertainties considered in the analysis and any major assumptions, conservatisms, or simplifications that were included and assess (qualitative or quantitative) their effect on the analysis conclusions.

O-2A: Acceptance criteria met with less than one order of magnitude margin and a strong basis for input distributions and uncertainty classification
* See the submittal guidelines for O-1 and provide the reasoning behind a strong basis.

O-2B: [Acceptance criteria met with less than one order of magnitude margin] AND [no strong basis for input distributions or uncertainty classification, or both]
* See the submittal guidelines for O-1.
* Include a sensitivity analysis (if important inputs are unknown) and sensitivity studies for any inputs that do not have a strong basis.

O-3: O-1, O-2A, or O-2B and potential unknowns
* See the submittal guidelines for O-1 and provide the reasoning behind a strong basis, OR
* Include a sensitivity analysis (if important inputs are unknown) and sensitivity studies for any inputs that do not have a strong basis.
* Describe potential unknowns and their possible effect on analysis results.
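A minimal sketch of the O-1 items (a best estimate, an uncertainty measure, and how they were calculated), again using invented numbers and a toy margin-type QoI rather than anything from the RG; the bootstrap at the end separates the sampling uncertainty of a summary statistic from the input-driven spread of the QoI itself.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 20_000

# Hypothetical QoI sample from a PFM analysis (here: margin to failure, MPa*sqrt(m)).
sigma = rng.lognormal(np.log(250.0), 0.15, N)
a     = rng.lognormal(np.log(0.020), 0.25, N)
K_Ic  = rng.normal(80.0, 10.0, N)
qoi   = K_Ic - 1.12 * sigma * np.sqrt(np.pi * a)

# Best estimate and input-driven uncertainty of the QoI.
best_estimate = qoi.mean()
p05, p95 = np.percentile(qoi, [5, 95])

# Bootstrap standard error of the 5th percentile (sampling uncertainty of the summary).
boot = np.array([np.percentile(rng.choice(qoi, size=N, replace=True), 5) for _ in range(200)])
print(f"best estimate = {best_estimate:.1f}, 5th-95th percentile = [{p05:.1f}, {p95:.1f}]")
print(f"bootstrap std. error of the 5th percentile = {boot.std(ddof=1):.2f}")
```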
2.11. Sensitivity Studies

In most cases, the applicant should perform sensitivity studies to understand how analysis assumptions impact the results of the overall analysis, to show why some assumptions may or may not impact the results, and to understand new and complex codes, models, or phenomena, especially if there are large perceived uncharacterized uncertainties. The applicant should assess its PFM software and analysis to determine the sensitivity studies category shown in Table C-10. The applicant should follow the guidelines in Table C-10 to document the details of sensitivity studies. If the combination of PFM software and analysis belongs to category SS-1 in Table C-10, the staff does not recommend performing sensitivity studies.

If the submittal presents several PFM analyses, the applicant should determine the sensitivity studies category for each PFM analysis and document sensitivity studies for each PFM analysis according to the guidelines in Table C-10.
Table C-10: Submittal Guidelines for Sensitivity Studies

SS-1: Category QV-1A code with same QoI characteristic. a Sensitivity study needed? No.
* Summarize sensitivity studies conducted in prior approval.

SS-2: Category QV-1A code with different QoI characteristic. Sensitivity study needed? Limited, focused on inputs related to QoI.
* Summarize past sensitivity studies conducted in prior approval and current sensitivity studies.

SS-3: Category QV-1B or QV-1C code with limited independent variables (e.g., <5, determined on a case-by-case basis). Sensitivity study needed? Limited, focused on impact of modification.
* Summarize past and current sensitivity studies.

SS-4: Category QV-1B or QV-1C code with many independent variables (e.g., >5, determined on a case-by-case basis). Sensitivity study needed? Yes, focused on inputs related to QoI.
* Summarize past and current sensitivity studies.
* List the uncertain assumptions that are considered for sensitivity studies.
* State the impact and conclusion of each sensitivity study.
* Give the rationale for why certain assumptions were or were not considered for sensitivity studies.
* Describe how each sensitivity study is translated into model realizations and compare the study and the reference realization.
* List changes to the code and the QA procedure used.

SS-5: Category QV-2 or QV-3 code with limited independent variables (e.g., <5, determined on a case-by-case basis). Sensitivity study needed? Yes.
* See the submittal guidelines for SS-4.

SS-6: Category QV-2 or QV-3 code with many independent variables (e.g., >5, determined on a case-by-case basis). Sensitivity study needed? Yes, model and input studies.
* See the submittal guidelines for SS-4.

a Inputs must remain the same because sensitivity is dependent on the input distributions.
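As a sketch of how a single sensitivity study might be translated into model realizations and compared with the reference realization (per the SS-4 guidelines above), the example below reruns an invented toy analysis under one credible alternative assumption; the function name, the assumption varied, and all values are hypothetical.

```python
import numpy as np

def failure_probability(crack_depth_sigma, seed=3, n=50_000):
    """Toy PFM run; 'crack_depth_sigma' is the assumption varied in the sensitivity study."""
    rng = np.random.default_rng(seed)
    sigma = rng.lognormal(np.log(250.0), 0.15, n)
    a     = rng.lognormal(np.log(0.020), crack_depth_sigma, n)
    K_Ic  = rng.normal(80.0, 10.0, n)
    return np.mean(1.12 * sigma * np.sqrt(np.pi * a) >= K_Ic)

reference   = failure_probability(crack_depth_sigma=0.25)   # reference realization
alternative = failure_probability(crack_depth_sigma=0.40)   # credible alternative assumption
print(f"reference P(failure)   = {reference:.4f}")
print(f"alternative P(failure) = {alternative:.4f} (change: {alternative - reference:+.4f})")
# The submittal would state the impact of the alternative assumption and the
# conclusion drawn from it, consistent with the SS-4 guidelines above.
```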
D. IMPLEMENTATION

The NRC staff may use this RG as a reference in its regulatory processes, such as licensing, inspection, or enforcement. However, the NRC staff does not intend to use the guidance in this RG to support NRC staff actions in a manner that would constitute backfitting as that term is defined in 10 CFR 50.109, Backfitting, and 10 CFR 72.62, Backfitting, or as described in NRC Management Directive 8.4, Management of Backfitting, Forward Fitting, Issue Finality, and Information Requests (Ref. 23), nor does the NRC staff intend to use the guidance to affect the issue finality of an approval under 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants. The staff also does not intend to use the guidance to support NRC staff actions in a manner that constitutes forward fitting as that term is defined and described in Management Directive 8.4. If a licensee believes that the NRC is using this RG in a manner inconsistent with the discussion in this Implementation section, then the licensee may file a backfitting or forward fitting appeal with the NRC in accordance with the process in Management Directive 8.4.
GLOSSARY

acceptance criteria: Set of conditions that must be met to achieve success for the desired application.

aleatory uncertainty: Uncertainty based on the randomness of the nature of the events or phenomena that cannot be reduced by increasing the analyst's knowledge of the systems being modeled.

assumptions: A decision or judgment that is made in the development of a model or analysis.

best estimate: Approximation of a quantity based on the best available information. Models that attempt to fit data or phenomena as best as possible; that is, models that do not intentionally bound data for a given phenomenon or are not intentionally conservative or optimistic.

code: The computer implementation of algorithms developed to facilitate the formulation and approximation solution of a class of problems.

conservative analysis: An analysis that uses assumptions such that the assessed outcome is meant to be less favorable than the expected outcome.

convergence analysis: An analysis with the purpose of assessing the approximation error in the quantity of interest estimates to establish that conclusions of the analysis would not change solely due to sampling uncertainty.

correlation: A general term for interdependence between pairs of variables.

deterministic: A characteristic of decision-making in which results from engineering analyses not involving probabilistic considerations are used to support a decision. Consistent with the principles of determinism, which hold that specific causes completely and certainly determine effects of all sorts. Also refers to fixed model inputs.

distribution: A function specifying the values that the random variable can take and the likelihood they will occur.

epistemic uncertainty: The uncertainty related to the lack of knowledge or confidence about the system or model; also known as state-of-knowledge uncertainty. The American Society of Mechanical Engineers/American Nuclear Society PRA standard (Ref. 24) defines epistemic uncertainty as the uncertainty attributable to incomplete knowledge about a phenomenon that affects our ability to model it. Epistemic uncertainty is reflected in ranges of values for parameters, a range of viable models, the level of model detail, multiple expert interpretations, and statistical confidence. In principle, epistemic uncertainty can be reduced by the accumulation of additional information. (Epistemic uncertainty is sometimes also called modeling uncertainty.)

expert judgment: Information (or opinion) provided by one or more technical experts based on their experience and knowledge. Used when there is a lack of information, for example, if certain parameter values are unknown, or there are questions about phenomenology in accident progression. May be part of a structured approach, such as expert elicitation, but is not necessarily as formal. May be the opinion of one or more experts, whereas expert elicitation is a highly structured process in which the opinions of several experts are sought, collected, and aggregated in a very formal way.

global sensitivity analysis: The study of how the uncertainty in the output or quantity of interest of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input. The term global ensures that the analysis considers more than just local or one-factor-at-a-time effects. Hence, interactions and nonlinearities are important components of a global statistical sensitivity analysis.
important input variable: An input variable whose uncertainty contributes substantially to the uncertainty in the response.

input: Data or parameters that users can specify for a model; the output of the model varies as a function of the inputs, which can consist of physical values (e.g., material properties, tolerances) and model specifications (e.g., spatial resolution).

local sensitivity analysis: A sensitivity analysis that is relative to the location in the input space chosen and not for the entire input space.

model: A representation of a physical process that allows for prediction of the process behavior.

output: A value calculated by the model given a set of inputs.

parameter: A numerical characteristic of a population or probability distribution. More technically, the variables used to calculate and describe frequencies and probabilities.

probabilistic: A characteristic of an evaluation that considers the likelihood of events.

quantity of interest: A numerical characteristic of the system being modeled, the value of which is of interest to stakeholders, typically because it informs a decision. Can refer to either a physical quantity that is an output from a model or a given feature of the probability distribution function of the output of a deterministic model with uncertain inputs.

random variable: A variable, the values of which occur according to some specified probability distribution.

realization: The execution of a model for a single set of input parameter values.

risk-informed: A characteristic of decision making in which risk results or insights are used together with other factors to support a decision.

sampling: The process of selecting some part of a population to observe, so as to estimate something of interest about the whole population.

sampling uncertainty: The uncertainty in an estimate of a quantity of interest that arises due to finite sampling. Different sets of model realizations will result in different estimates. This type of uncertainty contributes to uncertainty in the true value of the quantity of interest and is often summarized using the sampling variance.

sensitivity analysis: The study of how uncertainty in the output of a model can be apportioned to different sources of uncertainty in the model input.

sensitivity studies: PFM analyses that are conducted under credible alternative assumptions.

simulation: The execution of a computer code to mimic an actual system. Typically comprises a set of model realizations.

software quality assurance: A planned and systematic pattern of all actions necessary to provide adequate confidence that a software item or product conforms to established technical requirements; a set of activities designed to evaluate the process by which the software products are developed or manufactured.

surrogate: A function that predicts outputs from a model as a function of the model inputs. Also known as response surface, metamodel, or emulator.
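To make the surrogate and validation ideas concrete (and to connect to category UP-2B in Table C-6, which asks that surrogate error be quantified and carried as additional uncertainty), here is a minimal sketch that fits a polynomial surrogate to outputs of a cheap stand-in model and quantifies its validation error; the stand-in model, the train/test split, and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Outputs from a cheap stand-in for an expensive PFM sub-model: K = Y*sigma*sqrt(pi*a).
a = rng.uniform(0.005, 0.040, 300)                   # crack depth, m
K = 1.12 * 250.0 * np.sqrt(np.pi * a)                # "true" model output, MPa*sqrt(m)

# Fit a quadratic polynomial surrogate on a training split, validate on the remainder.
train, test = np.arange(200), np.arange(200, 300)
coeffs = np.polyfit(a[train], K[train], deg=2)
K_hat = np.polyval(coeffs, a[test])

rmse = np.sqrt(np.mean((K_hat - K[test]) ** 2))
print(f"surrogate validation RMSE = {rmse:.3f} MPa*sqrt(m) "
      f"({100 * rmse / K[test].mean():.2f}% of the mean response)")
# Per category UP-2B, a validation error like this would be quantified and carried
# as additional uncertainty in the estimate of the QoI.
```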
uncertainty propagation: Quantifying the uncertainty of a model's responses that results from the propagation through the model of the uncertainty in the model's inputs.

validation: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model.

variance: The second moment of a probability distribution about its mean, defined as E[(X - μ)²], where μ is the first moment (mean) of the random variable X. A common measure of variability around the mean of a distribution.
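Written out explicitly (the sample estimator on the right is an addition for clarity and is not part of the glossary), the definition reads:

```latex
\operatorname{Var}(X) = E\!\left[(X-\mu)^{2}\right] = E[X^{2}] - \mu^{2},
\qquad
s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}.
```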
verification: The process of determining whether a computer program (code) correctly solves the mathematical-model equations. This includes code verification (determining whether the code correctly implements the intended algorithms) and solution verification (determining the accuracy with which the algorithms solve the mathematical-model equations for specified quantities of interest).
LIST OF FIGURES

Figure C-1: PFM analysis flowchart in support of regulatory submittals (corresponding sections of this guide are shown in parentheses when applicable)
Figure C-2: Decision tree for statistical convergence categories
LIST OF TABLES

Table C-1: Submittal Content Mapping to NUREG/CR-7278
Table C-2: SQA and V&V Code Categories
Table C-3: Submittal Guidelines for Models
Table C-4: Categorization Based on Knowledge and the Importance of Inputs Used in the Analysis
Table C-5: Submittal Guidelines for Inputs
Table C-6: Submittal Guidelines for Uncertainty Propagation
Table C-7: Submittal Guidelines for Statistical Convergence
Table C-8: Submittal Guidelines for Sensitivity Analyses
Table C-9: Submittal Guidelines for Output Uncertainty Characterization
Table C-10: Submittal Guidelines for Sensitivity Studies
REFERENCES 1

1. U.S. Nuclear Regulatory Commission, 10 CFR Part 50, Domestic Licensing of Production and Utilization Facilities, Washington, DC: U.S. NRC.
2. U.S. Nuclear Regulatory Commission, 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants, Washington, DC: U.S. NRC.
3. U.S. Nuclear Regulatory Commission, 10 CFR Part 71, Packaging and Transportation of Radioactive Material, Washington, DC: U.S. NRC.
4. U.S. Nuclear Regulatory Commission, 10 CFR Part 72, Licensing Requirements for the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste, Washington, DC: U.S. NRC.
5. U.S. Nuclear Regulatory Commission, NUREG-0800, Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition, ADAMS Accession No. ML042080088, Washington, DC: U.S. NRC.
6. L. Hund, J. Lewis, N. Martin, M. Starr, D. Brooks, A. Zhang, R. Dingreville, A. Eckert, J. Mullins, P. Raynaud, D. Rudland, D. Dijamco, and S. Cumblidge, NUREG/CR-7278, Technical Basis for the Use of Probabilistic Fracture Mechanics in Regulatory Applications, ADAMS Accession No. ML21257A237, Washington, DC: U.S. NRC.
7. U.S. Nuclear Regulatory Commission, RG 1.174, An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis, Washington, DC: U.S. NRC.
8. U.S. Nuclear Regulatory Commission, RG 1.175, An Approach for Plant-Specific, Risk-Informed Decisionmaking: Inservice Testing, Washington, DC: U.S. NRC.
9. U.S. Nuclear Regulatory Commission, RG 1.178, An Approach for Plant-Specific Risk-Informed Decisionmaking for Inservice Inspection of Piping, Washington, DC: U.S. NRC.
10. U.S. Nuclear Regulatory Commission, RG 1.200, An Approach for Determining the Technical Adequacy of Probabilistic Risk Assessment Results for Risk-Informed Activities, Washington, DC: U.S. NRC.
11. U.S. Nuclear Regulatory Commission, RG 1.201, Guidelines for Categorizing Structures, Systems, and Components in Nuclear Power Plants According to Their Safety Significance, Washington, DC: U.S. NRC.
12. P.T. Williams, T.L. Dickson, B.R. Bass, and H.B. Klasky, Fracture Analysis of Vessels - Oak Ridge FAVOR, v16.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations, ORNL/LTR-2016/309, Oak Ridge, TN, September 2016.
13. U.S. Nuclear Regulatory Commission, xLPR v2.1 Public Release Announcement, ADAMS Accession No. ML20157A120, Washington, DC: U.S. NRC.
14. Patrick A.C. Raynaud, RG 1.99 Revision 2 Update FAVOR Scoping Study, Technical Letter Report TLR-RES/DE/CIB-2020-09, October 2020, ADAMS Accession No. ML20300A551, Washington, DC: U.S. NRC.
23. U.S. Nuclear Regulatory Commission, Management Directive and Handbook 8.4, Management of Backfitting, Forward Fitting, Issue Finality, and Information Requests, Washington, DC: U.S. NRC, 2019.
24. American Society of Mechanical Engineers/American Nuclear Society, ASME/ANS RA-S-2008 (R2019), Standard for Level 1/Large Early Release Frequency Probabilistic Risk Assessment for Nuclear Power Plant Applications, New York, NY: ASME, 2008 (R2019).

1 Publicly available NRC published documents are available electronically through the NRC Library on the NRC's public Web site at http://www.nrc.gov/reading-rm/doc-collections/ and through the NRC's Agencywide Documents Access and Management System (ADAMS) at http://www.nrc.gov/reading-rm/adams.html. The documents can also be viewed online or printed for a fee in the NRC's Public Document Room (PDR) at 11555 Rockville Pike, Rockville, MD. For problems with ADAMS, contact the PDR staff at 301-415-4737 or (800) 397-4209; fax (301) 415-3548; or e-mail pdr.resource@nrc.gov. IAEA safety requirements and guides may be found at WWW.IAEA.Org/ or by writing the International Atomic Energy Agency, P.O. Box 100 Wagramer Strasse 5, A-1400 Vienna, Austria; telephone (+431) 2600-0; fax (+431) 2600-7; or e-mail Official.Mail@IAEA.Org. It should be noted that some of the international recommendations do not correspond to the NRC requirements, which take precedence over the international guidance.
U.S. NUCLEAR REGULATORY COMMISSION REGULATORY GUIDE 1.245, REVISION 0 Issue Date: January 2022 Technical Lead: Patrick Raynaud Written suggestions regarding this guide or development of new guides may be submitted through the NRCs public Web site in the NRC Library at https://nrcweb.nrc.gov/reading-rm/doc-collections/reg-guides/, under Document Collections, in Regulatory Guides, at https://nrcweb.nrc.gov/reading-rm/doc-collections/reg-guides/contactus.html.
Electronic copies of this RG, previous versions of RGs, and other recently issued guides are also available through the NRCs public Web site in the NRC Library at https://nrcweb.nrc.gov/reading-rm/doc-collections/reg-guides/, under Document Collections, in Regulatory Guides. This RG is also available through the NRCs Agencywide Documents Access and Management System (ADAMS) at http://www.nrc.gov/reading-rm/adams.html, under ADAMS Accession Number (No.) ML21334A158. The regulatory analysis may be found in ADAMS under Accession No. ML21034A261. The associated draft guide DG-1382 may be found in ADAMS under Accession No. ML21034A328, and the staff responses to the public comments on DG-1382 may be found under ADAMS Accession No ML21306A292.
PREPARING PROBABILISTIC FRACTURE MECHANICS SUBMITTALS A. INTRODUCTION Purpose This regulatory guide (RG) describes a framework to develop the contents of a licensing submittal that the staff of the U.S. Nuclear Regulatory Commission (NRC) considers acceptable when performing probabilistic fracture mechanics (PFM) analyses in support of regulatory applications.
Applicability This RG applies to nonreactor and reactor licensees that elect to use PFM as part of the technical basis for a licensing action. PFM could be used in a wide variety of applications to meet a wide range of regulations; consequently, it is not possible to provide a comprehensive list of such applications.
However, the staff anticipates that this RG could apply to nonreactor and reactor licensees subject to Title 10 of the Code of Federal Regulations (10 CFR) Part 50, Domestic Licensing of Production and Utilization Facilities (Ref. 1) ; 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants (Ref. 2); 10 CFR Part 71, Packaging and Transportation of Radioactive Material (Ref.
3); and 10 CFR Part 72, Licensing Requirements for the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste (Ref. 4).
Applicable Regulations This RG discusses acceptable ways to present PFM analyses in regulatory submittals to the NRC and can be used to demonstrate compliance with a wide range of regulations. The following is a list of regulations where PFM may be a useful tool for developing the technical basis for submittals. Potential uses of PFM in regulatory applications are not limited to the following list of regulations.
10 CFR Part 50, Domestic Licensing of Production and Utilization Facilities, applies to applicants for, and holders of, licenses for production and utilization facilities.
o 10 CFR 50.55a: Codes and standards.
o 10 CFR 50.60: Acceptance criteria for fracture prevention measures for light-water nuclear power reactors for normal operation.
RG 1.245 Revision 0, Page 2 o 10 CFR 50.61: Fracture toughness requirements for protection against pressurized thermal shock events.
o 10 CFR 50.61a: Alternate fracture toughness requirements for protection against pressurized thermal shock events.
o 10 CFR 50.66: Requirements for thermal annealing of the reactor pressure vessel.
o 10 CFR 50.69: Risk-informed categorization and treatment of structures, systems, and components for nuclear power reactors.
o Appendix G to Part 50: Fracture Toughness Requirements.
o Appendix H to Part 50: Reactor Vessel Material Surveillance Program Requirements.
10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants, applies to applicants for, and holders of, early site permits, standard design certifications, combined licenses, standard design approvals, and manufacturing licenses for nuclear power facilities.
o Appendix A to Part 52: Design Certification Rule for the U.S. Advanced Boiling Water Reactor.
o Appendix B to Part 52: Design Certification Rule for the System 80+ Design.
o Appendix C to Part 52: Design Certification Rule for the AP600 Design.
o Appendix D to Part 52: Design Certification Rule for the AP1000 Design.
o Appendix E to Part 52: Design Certification Rule for the ESBWR Design.
o Appendix F to Part 52: Design Certification Rule for the APR1400 Design.
10 CFR Part 71, Packaging and Transportation of Radioactive Material, provides requirements for packaging, preparation for shipment, and transportation of licensed material.
o 10 CFR 71.43: General standards for all packages.
o 10 CFR 71.45: Lifting and tie-down standards for all packages.
o 10 CFR 71.51: Additional requirements for Type B packages.
o 10 CFR 71.55: General requirements for fissile material packages.
o 10 CFR 71.64: Special requirements for plutonium air shipments.
o 10 CFR 71.71: Normal conditions of transport.
o 10 CFR 71.73: Hypothetical accident conditions.
o 10 CFR 71.74: Accident conditions for air transport of plutonium.
o 10 CFR 71.75: Qualification of special form radioactive material.
RG 1.245 Revision 0, Page 3 10 CFR Part 72, Licensing Requirements for the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste, provides requirements, procedures, and criteria for the issuance of licenses to receive, transfer, and possess power reactor spent fuel, power reactor-related Greater than Class C (GTCC) waste, and other radioactive materials associated with spent fuel storage in an independent spent fuel storage installation (ISFSI) and the terms and conditions under which the Commission will issue these licenses.
o 10 CFR 72.122: Overall requirements.
Related Guidance PFM is likely to be used to risk-inform licensing applications. Consequently, guidance documents such as the following related to risk-informed activities as well as probabilistic risk assessment (PRA) may be related to this RG:
NUREG-0800, Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition (SRP), Chapter 19, Severe Accidents, Section 19.2, Review of Risk Information Used To Support Permanent Plant-Specific Changes to the Licensing Basis:
General Guidance (Ref. 5), provides general guidance on applications that address changes to the licensing basis.
NUREG/CR-7278, Technical Basis for the Use of Probabilistic Fracture Mechanics in Regulatory Applications (Ref. 6)
RG 1.174, An Approach for Using Probabilistic Risk Assessment of Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis (Ref. 7).
RG 1.175, An Approach for Plant-Specific, Risk-Informed Decisionmaking: Inservice Testing (Ref. 8).
RG 1.178, An Approach for Plant-Specific Risk-Informed Decisionmaking for Inservice Inspection of Piping (Ref. 9).
RG 1.200, Acceptability of Probabilistic Risk Assessment Results for Risk-Informed Activities (Ref. 10).
RG 1.201, Guidelines for Categorizing Structures, Systems, and Components in Nuclear Power Plants According to Their Safety Significance (Ref. 11), discusses an approach to support the new rule established as 10 CFR 50.69, Risk-Informed Categorization and Treatment of Structures, Systems, and Components for Nuclear Power Reactors.
Purpose of Regulatory Guides The NRC issues RGs to describe methods that are acceptable to the staff for implementing specific parts of the agencys regulations, to explain techniques that the staff uses in evaluating specific issues or postulated events, and to describe information that will assist the staff with its review of applications for permits and licenses. Regulatory guides are not NRC regulations and compliance with them is not mandatory. Methods and solutions that differ from those set forth in RGs are acceptable if supported by a basis for the issuance or continuance of a permit or license by the Commission.
RG 1.245 Revision 0, Page 4 Paperwork Reduction Act This RG provides voluntary guidance for implementing the mandatory information collections in 10 CFR Parts 50, 52, 71, and 72 that are subject to the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et. seq.). These information collections were approved by the Office of Management and Budget (OMB),
approval numbers 3150-0011, 3150-0151, 3150-0132, and 3150-0008. Send comments regarding this information collection to the FOIA, Library and Information Collections Branch (T6-A10M), U.S.
Nuclear Regulatory Commission, Washington, DC 20555-0001, or by e-mail to Infocollects.Resource@nrc.gov, and to the OMB reviewer at: OMB Office of Information and Regulatory Affairs (3150-0011, 3150-0151, 3150-0132, and 3150-0008), Attn: Desk Officer for the Nuclear Regulatory Commission, 725 17th Street, NW Washington, DC20503; e-mail:
oira_submission@omb.eop.gov.
Public Protection Notification The NRC may not conduct or sponsor, and no person is required to respond to, a collection of information unless the document requesting or requiring the collection displays a currently valid OMB control number.
RG 1.245 Revision 0, Page 5 TABLE OF CONTENTS Purpose............................................................................................................................................ 1 Applicability.................................................................................................................................... 1 Applicable Regulations.................................................................................................................... 1 Related Guidance............................................................................................................................. 3 Purpose of Regulatory Guides......................................................................................................... 3 Paperwork Reduction Act................................................................................................................ 4 Public Protection Notification.......................................................................................................... 4 Reason for Issuance......................................................................................................................... 6 Background...................................................................................................................................... 6 Consideration of International Standards......................................................................................... 6 Regulatory Position C.1................................................................................................................... 8
- 1.
General Considerations............................................................................................................ 8 1.1.
Graded Approach........................................................................................................... 8 1.2.
Analytical Steps in a Probabilistic Fracture Mechanics Analysis................................. 8 Regulatory Position C.2................................................................................................................. 10
- 2.
Probabilistic Fracture Mechanics Analysis and Submittal Contents...................................... 10 2.1.
Regulatory Context...................................................................................................... 10 2.2.
Information Made Available to the NRC Staff with a Probabilistic Fracture Mechanics Submittal................................................................................................... 10 2.3.
Quantities of Interest and Acceptance Criteria............................................................ 11 2.4.
Software Quality Assurance and Verification and Validation..................................... 11 2.5.
Models......................................................................................................................... 14 2.6.
Inputs........................................................................................................................... 16 2.7.
Uncertainty Propagation.............................................................................................. 18 2.8.
Convergence................................................................................................................ 19 2.9.
Sensitivity Analyses.................................................................................................... 21 2.10. Quantity of Interest Uncertainty Characterization....................................................... 23 2.11. Sensitivity Studies....................................................................................................... 24
RG 1.245 Revision 0, Page 6 B. DISCUSSION Reason for Issuance The NRC developed this RG to provide guidance for the use of PFM in regulatory applications. It is intended to ensure that the staff guidance is clear with regard to the contents of PFM regulatory applications. The use of this RG is anticipated to increase the efficiency of reviews for regulatory applications that use PFM as a supporting technical basis by providing a set of common guidelines for reviewers and licensees.
Background
In recent years, the NRC has observed an increase in the number of applications using PFM as a technical basis. The heightened focus on PFM is partly due to the increased emphasis on risk-informed regulation, but also because plant aging and new degradation mechanisms can be difficult to address using traditionally very conservative deterministic fracture mechanics. The increased use of PFM has also been facilitated by improvements in computational capability and the increased availability of PFM codes, such as Fracture Analysis of VesselsOak Ridge (FAVOR) (Ref. 12), Extremely Low Probability of Rupture (xLPR) (Ref. 13), and others. Furthermore, the NRC has used probabilistic fracture mechanics methods in developing regulatory positions, such as the alternate pressurized thermal shock (PTS) rule at 10 CFR 50.61a, Alternate fracture toughness requirements for protection against pressurized thermal shock events, and the 2020 assessment (Ref. 14) of RG 1.99, Radiation Embrittlement of Reactor Vessel Materials, (Ref. 15) Revision 2, issued May 1988.
In 2018, the NRC published the technical letter report, Important Aspects of Probabilistic Fracture Mechanics Analyses (Ref. 16), to outline the important concepts for using PFM in support of regulatory applications and held a public meeting to discuss this technical letter report. Following the October 23, 2018, public meeting on Discussion of a graded approach for probabilistic fracture mechanics codes and analyses for regulatory applications, (Ref. 17) the Electric Power Research Institute (EPRI) developed a proposal on the minimum contents of a submittal that uses PFM as part of its technical basis. Some licensees have submitted licensing applications claiming to have followed the EPRI minimum requirements. However, in reviewing these submittals, the NRC has found that the minimum requirements in the EPRI proposal are not always clear and concise. Further, the EPRI document does not precisely define its guidance when the minimum requirements specified in the EPRI guidance are not sufficient, leading to ambiguity, inefficient reviews, and uncertainty in regulatory outcomes. Importantly, the NRC staff accounted for EPRIs proposal when developing this RG. In 2021, the NRC will publish, concurrently with this RG, NUREG/CR-7278, Technical Basis for the use of Probabilistic Fracture Mechanics in Regulatory Applications (Ref. 6), which constitutes the detailed technical basis for this RG.
The NRC has an approved methodology for risk-informed decision making for design-basis changes (Ref. 7), and PFM may be used as a tool within that framework. The purpose of PFM is to model the behavior and degradation of systems more accurately and, consequently, to draw more precise and accurate conclusions relative to performance criteria or design assumptions.
Consideration of International Standards

The International Atomic Energy Agency (IAEA) works with member states and other partners to promote the safe, secure, and peaceful use of nuclear technologies. The IAEA develops safety requirements and safety guides for protecting people and the environment from harmful effects of ionizing radiation. These requirements and guides provide a system of safety standards categories that
RG 1.245 Revision 0, Page 7 reflect an international perspective on what constitutes a high level of safety. In developing or updating RGs, the NRC has considered IAEA safety requirements, safety guides, and other relevant reports in order to benefit from the international perspectives, pursuant to the Commission's International Policy Statement (Ref. 18) and NRC Management Directive and Handbook 6.6 (Ref. 19).
The following IAEA safety requirements and guides were considered in the development/update of the RG:
International Atomic Energy Agency, Development and Application of Level 1 Probabilistic Safety Assessment for Nuclear Power Plants, IAEA Safety Standards Series No. SSG-3, IAEA, Vienna (2010) (Ref. 20).
RG 1.245 Revision 0, Page 8

C. STAFF REGULATORY GUIDANCE

This section describes methods, approaches, and information that the NRC staff considers acceptable for performing PFM analyses and preparing the associated documentation in support of regulatory applications. To enhance the efficiency of the NRC's review of PFM submittals, the staff recommends that applicants using the framework presented in this guidance document identify any deviations in their application and provide explanations for each deviation.
Regulatory Position C.1
1. General Considerations

1.1. Graded Approach

For regulatory submittals to the NRC that use PFM as part of their supporting technical basis, the level of detail associated with the analysis and documentation activities should scale with the complexity and safety significance of the application, as well as the complexity of the supporting analysis (including methods and analysis tools).
This RG provides a graded set of guidelines on analysis steps, analytical software quality assurance (SQA) and verification and validation (V&V), and levels of associated documentation. When followed, these guidelines provide an acceptable and practical framework for the content of PFM submittals and support effective and efficient NRC reviews of submittals containing PFM analyses and results.
1.2. Analytical Steps in a Probabilistic Fracture Mechanics Analysis

Applicants should follow the process charted in Figure C-1 when performing PFM analyses in support of regulatory applications. NUREG/CR-7278 describes these steps in detail, and Table C-1 shows the cross-referencing between the sections of this RG and the corresponding sections of NUREG/CR-7278.
RG 1.245 Revision 0, Page 9

Figure C-1: PFM analysis flowchart in support of regulatory submittals (corresponding sections of this guide are shown in parentheses when applicable)
RG 1.245 Revision 0, Page 10

Table C-1: Submittal Content Mapping to NUREG/CR-7278
RG Section | NUREG/CR-7278 Sections | Content
2.1 | 3.1.1 | Regulatory Context
2.2 | - | Information Made Available to NRC Staff
2.2.1 | 3.1.3 | PFM Software
2.2.2 | 3.1.3 | Supporting Documents
2.3 | 2.2.1 / 3.1.2 | Quantities of Interest and Acceptance Criteria
2.4 | 2.2.2 / 3.1.3 | SQA and V&V
2.5 | 2.2.3 / 3.1.3 | Models
2.6 | 3.2.1 / 3.2.2 / 3.3.1 / 3.4.1 | Inputs
2.7 | 3.3.1 | Uncertainty Propagation
2.8 | 3.3.2 | Convergence
2.9 | 3.3.3 | Sensitivity Analyses
2.10 | 3.3.4 | Output Uncertainty Characterization
2.11 | 3.4.1 / 3.4.2 | Sensitivity Studies

Regulatory Position C.2

2. Probabilistic Fracture Mechanics Analysis and Submittal Contents

Each subsection in this Regulatory Position relates to an item expected in a submittal. The content in each subsection is drawn, in large part, from the suggested minimum content for PFM submittals developed in EPRI's white paper (Ref. 17). Tables in each subsection provide guidance for different documentation expectations, identifying the circumstances under which specific information should be provided for a complete submittal. Submittals should be informed by the specific details and elements of each analysis and need not include all of the listed elements, although careful consideration should be applied before reaching that conclusion.
2.1. Regulatory Context

Regulatory submittals using PFM analyses should explain why a probabilistic approach is appropriate and how the probabilistic approach is used to demonstrate compliance with the regulatory criteria. When no specific regulatory acceptance criteria exist, the submittal should explain how the probabilistic approach informs the regulatory action and the demonstration of regulatory compliance. Applicants should be aware that the use of PFM in a regulatory submittal is only one aspect of what is required for risk-informed decision making.
2.2. Information Made Available to the NRC Staff with a Probabilistic Fracture Mechanics Submittal

Applicants should make information supporting the submittal available for review. The NRC encourages applicants to discuss the contents of their submittal with the agency during pre-submittal meetings (the timing of such meetings is left to the applicant, but scheduling them early in the application lifecycle can be beneficial) or other formal communications. Pre-submittal discussions should cover information that will be provided with the submittal, information that might be provided upon request, and information that may not be directly transmittable but might be reviewed under specific agreed-upon circumstances, such as an audit.
RG 1.245 Revision 0, Page 11

2.2.1 Probabilistic Fracture Mechanics Software

If the NRC staff determines that it is unable to perform independent confirmatory calculations or independent benchmarking of the PFM analyses, the applicant should ensure that an alternate approach is available to NRC reviewers. This will preferably be determined during preapplication meetings but can also be identified during the review. Such approaches may include one or more of the following options:
- Provide NRC reviewers with direct access to the PFM software executable program and the necessary user instructions to use the tools.
- Ensure the availability of analysts during NRC audits or NRC review meetings, such that the PFM submittal developers can run analysis cases as requested by NRC reviewers.
- Allow the NRC reviewers to submit analysis requests to the applicant and provide the NRC reviewers with the results of the analyses, possibly as part of an audit or review meeting.
2.2.2 Probabilistic Fracture Mechanics Software Quality Assurance and Verification and Validation Documents

The applicant should ensure that the SQA and V&V documentation for the PFM software is available for NRC review through in-person or virtual audits, if requested by the NRC.
2.3. Quantities of Interest and Acceptance Criteria

The quantities of interest (QoIs) to be compared to the acceptance criteria should be clearly defined and include the following:
- the units of measurement and time period,
- the relationship to the outputs of the PFM software,
- the acceptance criteria, and
- if the QoI is a probability in the extreme tails of the distribution, a description of how this affected the analysis choices.
The use of previously approved acceptance criteria (if already in existence for the specific application at hand) is encouraged but should be appropriately justified and explained. Specifically, the applicant should ensure that inherent assumptions and requirements of the source activity are respected, and that any apparent differences are reconciled. If there is no precedent for an acceptance criterion, the applicant should derive probabilistic acceptance criteria based on risk-informed decision-making principles in accordance with RG 1.200 and RG 1.174, if applicable, and describe the bases for the chosen acceptance criteria.
If the PFM submittal includes more than one QoI, the applicant should document the above steps and information for each QoI.
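For illustration only, the following sketch (written in Python, with hypothetical values, variable names, and an acceptance criterion that are not drawn from any particular application) shows how a probability-type QoI estimated from PFM realizations might be compared against an acceptance criterion, and why a QoI located in the extreme tail of the distribution affects sampling choices such as sample size or the use of importance sampling.

import numpy as np

rng = np.random.default_rng(seed=2022)

# Hypothetical stand-in for a PFM model: each realization is True if the
# simulated component fails within the evaluation period.
n_realizations = 200_000
failure_indicator = rng.random(n_realizations) < 1.0e-4

# QoI: probability of failure over the evaluation period (illustrative definition).
qoi_estimate = failure_indicator.mean()

# Hypothetical acceptance criterion for this illustration.
acceptance_criterion = 1.0e-3

print(f"Estimated QoI (probability of failure): {qoi_estimate:.2e}")
print(f"Acceptance criterion:                   {acceptance_criterion:.2e}")
print("Criterion met:", qoi_estimate < acceptance_criterion)

# If the QoI sits in the extreme tail (e.g., near 1e-6), a direct Monte Carlo
# estimate at this sample size would rest on only a handful of failures, which
# is why the sampling choices (sample size, importance sampling) should be
# described when the QoI is an extreme-tail probability.
print("Expected failures at this sample size if the true probability were 1e-6:",
      n_realizations * 1.0e-6)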
2.4. Software Quality Assurance and Verification and Validation

In general, PFM software to be used in regulatory applications should be developed under the framework of an SQA plan and undergo V&V activities. The SQA and V&V activities should depend on the safety significance, complexity, past experience (previous use in regulatory applications), and status of previous approval for the PFM software. The applicant should follow its SQA program and V&V procedures with these concepts in mind.

RG 1.245 Revision 0, Page 12
The applicant should determine to which Table C-2 category its PFM software belongs. The applicant should consider discussing its choice of categorization with the NRC in pre-submittal meetings to ensure that an acceptable determination has been made.
The applicant should perform and document SQA and V&V activities for the PFM software according to the guidelines in Table C-2. The NRC has approved applications using the following codes for a specific range of applications as of the publication of this RG:
- the latest version of FAVOR (see the FAVOR theory manual (Ref. 12) for the validated application range),
- the latest version of xLPR (see the xLPR documentation (Ref. 21) for the validated application range), and
- the version of the SRRA (Structural Reliability and Risk Assessment) code approved in Ref. 22, for the range approved in the referenced safety evaluation.
In some instances, a code was approved only for a specific application; such a code is considered approved only for that same type of application.
RG 1.245 Revision 0, Page 13

Table C-2: SQA and V&V Code Categories
Category | Description | Submittal Guidelines
QV-1 Code used in NRC-approved application a
QV-1A Exercised within previously validated range
- Demonstrate code applicability within the validated range.
- Describe features of the specific application where the code is validated and applicable (i.e., areas of known code capability).
QV-1B Exercised outside of previously validated range
- Provide evidence for the applicability of the code to the specific application with respect to the areas of unknown code capability.
- Describe features of the specific application where the code has not been previously validated and applied (i.e., areas of unknown code capability).
QV-1C Modified
- Give an SQA summary and V&V description for modified portions of the code.
- Demonstrate that the code functionality was not adversely affected by the changes.
- Make detailed documentation available for further review upon request (audit).
QV-2 Commercial off-the-shelf software designed for the specific purpose of the application b
- Demonstrate code applicability.
- Describe the software and its pedigree.
- Make software and documentation available for review upon request (audit).
QV-3 Custom code
- Summarize the SQA program and its implementation.
- Provide a basic description of the measures for quality assurance, including V&V of the PFM analysis code as applied in the subject report.
- For very simple applications, possibly provide the source code instead of standardized SQA and V&V.
- Include separate deterministic fracture mechanics analyses to support other validation results, as appropriate for a given application.
a As of the publication of this RG, PFM codes used in NRC-approved applications or having received general approval within a validated range include xLPR, FAVOR, and SRRA.
b Examples include publicly available (for purchase or free) commercial software designed specifically to perform PFM analyses. Combinations of commercial off-the-shelf software may be acceptable (e.g., a finite-element software such as ABAQUS or ANSYS coupled with a probabilistic framework such as GoldSim or DAKOTA).
RG 1.245 Revision 0, Page 14

2.5. Models

The applicant should describe all the models implemented and used as part of the PFM analysis and software. Each model should be assessed independently and categorized as shown in Table C-3. The applicant should follow the submittal guidelines in Table C-3 for each model in the PFM software and/or analysis, based on the model category.
RG 1.245 Revision 0, Page 15

Table C-3: Submittal Guidelines for Models
Category | Description | Submittal Guidelines
M-1 Model from a code in category QV-1A within the same validated range
- Reference existing documentation for that model in the NRC-approved code, demonstrate that the current range of the model is within the previously approved and validated range, and demonstrate that the model functions as intended in the new software.
M-2 Model from a code in category QV-1B outside the validated range
- See the submittal guidelines for M-1, except demonstrate validity of the model for the new applicability range (document a comparison of model predictions for the entire new range to applicable supporting data, predictions made using alternative models, and/or using engineering judgment, optionally supported by quantitative goodness-of-fit analyses).
M-3 Model derived from a category M-1 or M-2 model
- See the submittal guidelines for M-2 and include a detailed description of changes to the M-1 or M-2 model, with justification for the validity of the new model.
M-4 Well-established model not previously part of an NRC-approved code
- Provide justification for the model as being well established, supported by references and engineering judgment.
- Describe gaps and limitations in the code capabilities for the analysis, combined with a strategy for mitigating identified gaps and communicating any remaining issues or risks.
- Describe the model(s) applied in the PFM analysis code in sufficient detail that a competent analyst familiar with the relevant subject area could independently implement the model(s) from the documentation alone. Model forms can be theoretical, semiempirical, or empirical.
- Establish a basis for all significant aspects of the model(s). This may consist of raw data or published references. Document or reference any algorithms or numerical methods (e.g., root-finding, optimization) needed to implement the model(s). Discuss any significant assumptions, approximations, and simplifications made, including their potential impacts on the analysis.
- Identify important uncertainties or conservatisms.
- Describe the computational expense of the model and how that might affect analysis choices.
M-5 First-of-a-kind model not yet published in a peer-reviewed journal
- See the submittal guidelines for M-4, and perform and document model sensitivity studies to understand trends in the model, as compared to expected model behavior and to the data used to develop the model, and describe model maturity and the status of the technical basis.
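As an illustration of the kind of quantitative goodness-of-fit comparison mentioned for category M-2 models in Table C-3, the sketch below (hypothetical data, model form, and variable names) compares model predictions against supporting data over an extended applicability range and reports simple error metrics that an applicant might document.

import numpy as np

# Hypothetical supporting data over the extended applicability range
# (e.g., a degradation rate versus an applied loading parameter).
x_data = np.array([10., 20., 30., 40., 50., 60., 70., 80.])
y_data = np.array([1.2, 2.1, 3.3, 4.0, 5.2, 6.1, 6.8, 8.1])

def model_prediction(x):
    # Hypothetical semiempirical model being assessed for the new range.
    return 0.1 * x + 0.2

y_model = model_prediction(x_data)
residuals = y_data - y_model

# Simple goodness-of-fit summaries that could accompany the comparison.
rmse = np.sqrt(np.mean(residuals**2))
max_abs_error = np.max(np.abs(residuals))
r_squared = 1.0 - np.sum(residuals**2) / np.sum((y_data - y_data.mean())**2)

print(f"RMSE: {rmse:.3f}, max |error|: {max_abs_error:.3f}, R^2: {r_squared:.3f}")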
RG 1.245 Revision 0, Page 16

2.6. Inputs

For each QoI, the applicant should categorize each input in the PFM software or analysis as shown in Table C-4. In Table C-4, knowledge refers to the depth of information available to prescribe either the deterministic inputs or the distributions on the uncertain inputs. Importance refers to the relative effect of an input on the QoI. To determine the input category, the applicant should consider the following:
- whether the input is deterministic or uncertain,
- how much knowledge is available about the input, and
- the input's importance with regard to the QoI.
Throughout the analysis, the applicant should continuously assess the relative importance of inputs on the QoI and revisit the assumptions and choices made for the most important inputs to confirm their validity. The applicant should also consider the use of sensitivity studies to show the impact (or lack thereof) of some of the key assumptions made for the inputs that are most important to the outcome of the analyses.
Table C-4: Categorization Based on Knowledge and the Importance of Inputs Used in the Analysis
Input Category | Low Knowledge, Deterministic | Low Knowledge, Uncertain | High Knowledge, Deterministic | High Knowledge, Uncertain
High Importance | I-4D | I-4R | I-3D | I-3R
Low Importance | I-2D | I-2R | I-1D | I-1R

For guidance on the documentation of inputs, the applicant should refer to Table C-5. The applicant should provide the information recommended in Table C-5 for each input based on each input's independent categorization.
RG 1.245 Revision 0, Page 17

Table C-5: Submittal Guidelines for Inputs
Category | Submittal Guidelines
I-1D
- List input value.
I-1R
- List input distribution type and parameters, as well as sampling frequency (if applicable).
- If applicable, list uncertainty classification (aleatory or epistemic).
I-2D
- List input value.
- If there is a lack of data, justify the use of expert judgment.
I-2R
- List input distribution type and parameters, as well as sampling frequency (if applicable).
- If applicable, list uncertainty classification (aleatory or epistemic).
- If there is a lack of data, justify the use of expert judgment.
I-3D
- List input value.
- State the rationale for setting the input to a deterministic value.
- For each deterministic input, give the rationale (method and data) for the selection of its numerical value, along with any known conservatisms or non-conservatisms in that numerical value and the rationale for such conservatisms or non-conservatisms.
- Reference documents that contain the foundation for input choices.
- Explain the correlations between inputs and how they are modeled and verify that correlated inputs remain consistent and physically valid.
- Describe any sensitivity analyses/studies performed to show that the input or its classification does not have a significant effect on the QoI.
I-3R
- List input distribution type and parameters, as well as sampling frequency (if applicable).
- If applicable, list uncertainty classification (aleatory or epistemic) and give the corresponding rationale.
- For each uncertain input, describe both its distribution parameter values and its distributional form. Give the rationale (method and data) for selecting each distribution, including any known conservatisms or non-conservatisms in the specified input distributions and the rationale for the conservatism or non-conservatism. Detail the distributional fitting method, including interpolation, extrapolation, distribution truncation, and curve fitting.
- Reference documents that contain the foundation for input choices.
- Explain the correlations between inputs and how they are modeled and verify that correlated inputs remain consistent and physically valid.
- Describe any sensitivity analyses/studies performed to show that the input or its classification does not have a significant effect on the QoI.
I-4D
- See the submittal guidelines for I-3D.
- If there is a lack of data, justify the use of expert judgment.
I-4R
- See the submittal guidelines for I-3R.
- If there is a lack of data, justify the use of expert judgment.
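As a minimal sketch of how an uncertain input in a category such as I-3R or I-4R might be specified and checked, the example below (hypothetical distribution type, parameters, and truncation bounds) samples a truncated lognormal yield-strength input and verifies that a correlated input remains physically consistent across all samples.

import numpy as np

rng = np.random.default_rng(seed=1)
n_samples = 10_000

# Hypothetical uncertain input: yield strength, lognormal with truncation,
# documented by distribution type, parameters, and truncation bounds.
median_yield = 350.0                     # MPa (illustrative)
log_sigma = 0.08
lower_bound, upper_bound = 300.0, 420.0  # truncation bounds (illustrative)

yield_strength = rng.lognormal(mean=np.log(median_yield), sigma=log_sigma, size=n_samples)
# Resample out-of-bounds values so the distribution shape inside the bounds is preserved.
mask = (yield_strength < lower_bound) | (yield_strength > upper_bound)
while mask.any():
    yield_strength[mask] = rng.lognormal(np.log(median_yield), log_sigma, mask.sum())
    mask = (yield_strength < lower_bound) | (yield_strength > upper_bound)

# Correlated input: ultimate strength modeled as yield strength plus a positive offset,
# which keeps the physically required ordering (ultimate > yield) for every sample.
ultimate_strength = yield_strength + rng.normal(loc=120.0, scale=10.0, size=n_samples).clip(min=1.0)

assert np.all(ultimate_strength > yield_strength), "correlated inputs must stay physically valid"
print(f"Yield strength: mean {yield_strength.mean():.1f} MPa, "
      f"5th-95th percentiles {np.percentile(yield_strength, [5, 95])}")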
RG 1.245 Revision 0, Page 18

2.7. Uncertainty Propagation

The applicant should document the methods used to propagate uncertainty through the PFM model such that analysis results may be reproduced. The applicant should determine the PFM analysis uncertainty propagation category, as shown in Table C-6. The applicant should follow the guidelines in Table C-6 to document how uncertainties are propagated in the PFM analysis. If the submittal presents several analyses, the applicant should determine the category for each analysis and document the uncertainty propagation for each analysis according to the guidelines in Table C-6.
Table C-6: Submittal Guidelines for Uncertainty Propagation
Category | Description | Submittal Guidelines
UP-1 Analysis does not employ a surrogate model
- Give the method for uncertainty propagation and describe the simulation framework.
- If Monte Carlo sampling is used, describe the finalized sampling scheme and rationale for the sampling scheme, including sampling method, sample size, the pseudo-random number generation method, and the random seeds used.
- Describe the approach for maintaining separation of aleatory and epistemic uncertainties, if applicable.
- If importance sampling is used to oversample important regions of the input space, justify the choice of importance distribution.
UP-2 Analysis does employ a surrogate model
- See the submittal guidelines for UP-1 and describe the form of the surrogate model(s), any approximations or assumptions, the method used for fitting the surrogate, and the validation process for the surrogate model.
UP-2A Surrogate model used for sensitivity analysis
- See the submittal guidelines for UP-2 and describe the features of the different surrogate models used.
UP-2B Surrogate model is used for uncertainty propagation
- See the submittal guidelines for UP-2 and quantify the magnitude of error associated with the surrogate model approximation and include as additional uncertainty in the estimation of the QoI.
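To illustrate the kind of sampling scheme and aleatory/epistemic separation described in Table C-6 for a category UP-1 analysis, the following sketch uses a simple two-loop Monte Carlo scheme with a toy stand-in for a fracture mechanics model; the model form, parameter values, and sample sizes are hypothetical, and the seed and sampling scheme are made explicit because they are among the items a submittal should report.

import numpy as np

# Documented sampling scheme (illustrative values only).
N_EPISTEMIC = 100     # outer-loop samples (state-of-knowledge uncertainty)
N_ALEATORY = 2_000    # inner-loop samples (random variability)
SEED = 20220101       # reported so the analysis can be reproduced

rng = np.random.default_rng(SEED)

def toy_pfm_model(toughness, flaw_depth):
    # Stand-in for a fracture mechanics calculation:
    # "failure" when the flaw depth exceeds a toughness-dependent critical size.
    critical_depth = 0.05 * toughness
    return flaw_depth > critical_depth

epistemic_failure_probs = np.empty(N_EPISTEMIC)
for i in range(N_EPISTEMIC):
    # Epistemic: uncertain mean fracture toughness (limited knowledge).
    mean_toughness = rng.normal(loc=200.0, scale=15.0)
    # Aleatory: realization-to-realization variability in toughness and flaw depth.
    toughness = rng.normal(loc=mean_toughness, scale=20.0, size=N_ALEATORY)
    flaw_depth = rng.exponential(scale=2.0, size=N_ALEATORY)
    epistemic_failure_probs[i] = toy_pfm_model(toughness, flaw_depth).mean()

# Each outer-loop value is a conditional failure probability; the spread across
# the outer loop characterizes the epistemic uncertainty in the QoI.
print(f"Mean failure probability:     {epistemic_failure_probs.mean():.3e}")
print(f"5th/95th epistemic quantiles: {np.percentile(epistemic_failure_probs, [5, 95])}")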
RG 1.245 Revision 0, Page 19

2.8. Convergence

To assess the convergence of the QoI estimate, the applicant's documentation should demonstrate convergence for any discretization used in the analysis (e.g., time step, spatial discretization), as well as statistical convergence based on the sample size and sampling method used in the probabilistic analysis. The primary goal should be to show that the conclusions of the analysis would not change significantly if the applicant used a reasonably more refined discretization or a larger sample size.
To demonstrate and document discretization convergence, the applicant should do the following:
- For PFM codes in category QV-1A, the applicant need not document discretization convergence, but analysts should nonetheless verify that discretization convergence is achieved.
- For cases where the use of a QV-1 code exercised outside of the validated range (i.e., QV-1B) may directly impact discretization convergence, the applicant should document the verification.
- For new or modified codes (categories QV-1C, QV-2, and QV-3), the applicant should document the approach used for assessing discretization convergence and demonstrate and document that a more refined discretization does not significantly affect the outcome of the analysis.
To demonstrate and document statistical convergence, the applicant should follow the graded approach described in Table C-7. Figure C-2 illustrates the decision tree for the statistical convergence categories.
Figure C-2: Decision tree for statistical convergence categories.
RG 1.245 Revision 0, Page 20

Table C-7: Submittal Guidelines for Statistical Convergence
Category | Description | Submittal Guidelines
SC-1 a
[Acceptance criteria met with at least one order of magnitude margin] AND [no importance sampling AND no surrogate models used]
- No sampling uncertainty characterization recommended as long as the uncertainty is sufficiently small relative to the margin. b
SC-2A
[Acceptance criteria met with at least one order of magnitude margin] AND [use of importance sampling OR surrogate models OR both]
- Describe the approach used for assessing statistical convergence, with one method needed for sampling uncertainty characterization.
- Explain the approach used for characterizing sampling uncertainty.
- Justify why the sampling uncertainty is small enough for the intended purpose (i.e., why statistical convergence is sufficient for the intended purpose).
- Describe how sampling uncertainty is used in the interpretation of the results.
SC-2B
[Acceptance criteria met with at least one order of magnitude margin] AND [use of importance sampling OR surrogate models OR both] AND [separation of aleatory and epistemic uncertainties is implemented in the PFM code]
- See the submittal guidelines for SC-2A and distinguish between epistemic and aleatory means and standard deviations.
SC-3A
[Acceptance criteria met with less than one order of magnitude margin]
- See the submittal guidelines for SC-2A and provide two different methods for sampling uncertainty characterization.
SC-3B
[Acceptance criteria met with less than one order of magnitude margin] AND [separation of aleatory and epistemic uncertainties is implemented in the PFM code]
- See the submittal guidelines for SC-3A and give a sample size convergence analysis for both the aleatory and epistemic sample sizes.
a Data type may have an impact on the convergence category. Continuous outputs can be category SC-1, but binary outputs inherently must be category SC-2 or SC-3 unless epistemic and aleatory uncertainties are separated.
b Some assessment of uncertainty is necessary, even if qualitative, as long as the uncertainty itself is understood to be small.
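As one illustration of how sampling uncertainty might be characterized for statistical convergence (for example, as one of the methods reported under categories SC-2 or SC-3), the sketch below (hypothetical failure data) combines an analytical standard error with a bootstrap interval for a Monte Carlo failure-probability estimate and checks how the estimate changes as the sample size grows.

import numpy as np

rng = np.random.default_rng(seed=7)

TRUE_P = 2.0e-3   # hypothetical "true" failure probability for this toy study
n_samples = 50_000
failures = rng.random(n_samples) < TRUE_P   # stand-in for per-realization failure flags

p_hat = failures.mean()
# Analytical standard error of a binomial proportion.
std_error = np.sqrt(p_hat * (1.0 - p_hat) / n_samples)

# Bootstrap resampling as a second, independent characterization of sampling uncertainty.
boot = np.array([rng.choice(failures, size=n_samples, replace=True).mean()
                 for _ in range(1_000)])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

print(f"Estimate: {p_hat:.3e}  (standard error {std_error:.1e})")
print(f"Bootstrap 95% interval: [{ci_low:.3e}, {ci_high:.3e}]")

# Simple sample-size convergence check: the running estimate should stabilize
# well before the full sample size is reached.
for n in (5_000, 10_000, 25_000, 50_000):
    print(f"n = {n:6d}: running estimate {failures[:n].mean():.3e}")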
RG 1.245 Revision 0, Page 21

2.9. Sensitivity Analyses

In most cases, the applicant should perform sensitivity analyses to identify the inputs that drive the QoI uncertainty. The applicant should assess its PFM software and analysis to determine the sensitivity analyses category shown in Table C-8. The applicant should follow the guidelines in Table C-8 to document the details of sensitivity analyses. If the combination of PFM software and analysis belongs to category SA-1 in Table C-8, the NRC does not recommend performing sensitivity analyses.
If the submittal presents several PFM analyses, the applicant should determine the sensitivity analysis category for each PFM analysis and document sensitivity analyses for each PFM analysis according to the guidelines in Table C-8.
RG 1.245 Revision 0, Page 22

Table C-8: Submittal Guidelines for Sensitivity Analyses
Category | Description | Sensitivity Analysis Needed? a | Submittal Guidelines
SA-1 Previously approved code (QV-1A, QV-1B) with same QoI characteristic and same input parameters b
No
- Describe important input and measure of input importance from previous use.
SA-2 Previously approved code (QV-1A, QV-1B) with different QoI Yes
- Explain the methods used for sensitivity analysis, including any initial screening and model approximations and assumptions.
- State whether a local or global sensitivity analysis approach is used.
- Give the QoI used for the sensitivity analysis.
- For a global sensitivity analysis, describe the sampling scheme along with the rationale for selection, including the sampling technique, number of model realizations, and random seed for the model realizations.
- Provide the results of the sensitivity analysis, including the most important model inputs identified; a measure of the input importance, such as the variance explained by the most important inputs; and relevant graphical summaries of the sensitivity analysis results.
SA-3 Modified approved code (QV-1C) with limited independent variables (e.g., <5, determined on a case-by-case basis)
Yes
- Describe analyses, important input, and measure of input importance.
SA-4 Modified approved code (QV-1C) with many independent variables (e.g., >5, determined on a case-by-case basis)
Yes
- See the submittal guidelines for SA-2.
SA-5 First-of-a-kind code (QV-2, QV-3) with limited independent variables (e.g., <5, determined on a case-by-case basis)
Yes
- See the submittal guidelines for SA-3.
SA-6 First-of-a-kind code (QV-2, QV-3) with many independent variables (e.g., >5, determined on a case-by-case basis)
Yes, with sub-model SA as appropriate
- See the submittal guidelines for SA-2.
- Indicate how the sensitivity analysis results informed future uncertainty propagation for estimation of the QoI and associated uncertainty.
- State whether the results of the sensitivity analysis are consistent with the expected important inputs based on expert judgment.
a Local sensitivity analysis may be used as a screening step if completing a global sensitivity analysis with all inputs is not computationally feasible (as the cost of performing a global sensitivity analysis increases with the number of inputs). The results from local sensitivity analysis can help reduce the input space for a global sensitivity analysis, but local sensitivity analysis does have its risks in that it can miss important inputs if the input/output relationship is nonlinear. Sensitivity analysis should be performed unless there is a strong basis for what inputs are important (e.g., previous analyses, expert judgment, or it is obvious what inputs are important since it is a simple code).
b Inputs must remain the same because sensitivity is dependent on the input distributions.
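As a simple form of the global sensitivity analysis documentation described for categories SA-2 through SA-6, the sketch below (hypothetical inputs and response) samples the full input space and uses rank correlations as an importance measure; variance-based indices or other global methods could equally be reported.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(seed=42)
n_realizations = 5_000

# Hypothetical uncertain inputs sampled over their full ranges (global, not local).
inputs = {
    "flaw_depth":     rng.exponential(scale=2.0, size=n_realizations),
    "toughness":      rng.normal(loc=200.0, scale=20.0, size=n_realizations),
    "stress":         rng.normal(loc=150.0, scale=10.0, size=n_realizations),
    "wall_thickness": rng.uniform(15.0, 25.0, size=n_realizations),
}

# Stand-in response: a continuous margin-like quantity used as the QoI here.
qoi = (0.05 * inputs["toughness"] * inputs["wall_thickness"] / 20.0
       - inputs["flaw_depth"]
       - 0.01 * inputs["stress"])

# Spearman rank correlation between each input and the QoI as a simple global
# importance measure (captures monotonic, not necessarily linear, effects).
for name, values in inputs.items():
    rho, _ = spearmanr(values, qoi)
    print(f"{name:>14s}: |rank correlation| = {abs(rho):.2f}")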
RG 1.245 Revision 0, Page 23

2.10. Quantity of Interest Uncertainty Characterization

The applicant should characterize the uncertainty of the QoI to interpret the results of the analysis. In its description of the QoI uncertainty, the applicant should include a measure of the best estimate and uncertainty of the QoI, a graphical summary of the QoI uncertainty, and a detailed description of how the best estimate and its uncertainty were calculated. The applicant should also summarize the key uncertainties considered in the analysis, as well as any major assumptions (including conservatisms and simplifications), and assess their impact on the analysis conclusions.
The applicant should independently assess each QoI and determine the category for each applicable QoI, as shown in Table C-9. The applicant should follow the guidelines in Table C-9 to document the QoI uncertainty of each QoI, based on the category of each QoI.
Table C-9: Submittal Guidelines for Output Uncertainty Characterization
Category | Description | Submittal Guidelines
O-1 Acceptance criteria met with at least one order of magnitude margin
- Give a measure of the best estimate and uncertainty in the QoI.
- Include a graphical display of the output uncertainty.
- Describe how the best estimate and its uncertainty were calculated, including a clear description of the types of uncertainty (e.g., input, sampling, epistemic) being summarized.
- Summarize key uncertainties considered in the analysis and any major assumptions, conservatisms, or simplifications that were included and assess (qualitative or quantitative) their effect on the analysis conclusions.
O-2A Acceptance criteria met with less than one order of magnitude margin and a strong basis for input distributions and uncertainty classification
- See the submittal guidelines for O-1 and provide the reasoning behind a strong basis.
O-2B
[Acceptance criteria met with less than one order of magnitude margin]
and [no strong basis for input distributions or uncertainty classification, or both]
- See the submittal guidelines for O-1.
- Include a sensitivity analysis (if important inputs are unknown) and sensitivity studies for any inputs that do not have a strong basis.
O-3 O-1, O-2A, or O-2B and potential unknowns
- See the submittal guidelines for O-1 and provide the reasoning behind a strong basis.
- Describe potential unknowns and their possible effect on analysis results.
- Include a sensitivity analysis (if important inputs are unknown) and sensitivity studies for any inputs that do not have a strong basis.
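As a minimal sketch of the best estimate, uncertainty measure, and graphical summary described in Table C-9, the example below summarizes a set of hypothetical per-realization QoI values; the plotting calls are indicated only as comments.

import numpy as np

rng = np.random.default_rng(seed=3)

# Hypothetical per-realization QoI values (e.g., a probability of leakage
# computed for each epistemic sample).
qoi_samples = rng.lognormal(mean=np.log(3.0e-5), sigma=0.5, size=2_000)

best_estimate = np.mean(qoi_samples)
p05, p50, p95 = np.percentile(qoi_samples, [5, 50, 95])

print(f"Best estimate (mean):     {best_estimate:.2e}")
print(f"Median:                   {p50:.2e}")
print(f"5th-95th percentile band: [{p05:.2e}, {p95:.2e}]")

# A graphical summary such as a histogram or empirical CDF would typically
# accompany these numbers, for example:
# import matplotlib.pyplot as plt
# plt.hist(qoi_samples, bins=50)
# plt.xlabel("QoI"); plt.ylabel("Count"); plt.show()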
RG 1.245 Revision 0, Page 24

2.11. Sensitivity Studies

In most cases, the applicant should perform sensitivity studies to understand how analysis assumptions impact the results of the overall analysis, to show why some assumptions may or may not impact the results, and to understand new and complex codes, models, or phenomena, especially if there are large perceived uncharacterized uncertainties. The applicant should assess its PFM software and analysis to determine the sensitivity studies category shown in Table C-10. The applicant should follow the guidelines in Table C-10 to document the details of sensitivity studies. If the combination of PFM software and analysis belongs to category SS-1 in Table C-10, the staff does not recommend performing sensitivity studies.
If the submittal presents several PFM analyses, the applicant should determine the sensitivity studies category for each PFM analysis and document sensitivity studies for each PFM analysis according to the guidelines in Table C-10.
RG 1.245 Revision 0, Page 25

Table C-10: Submittal Guidelines for Sensitivity Studies
Category | Description | Sensitivity Study Needed? | Submittal Guidelines
SS-1 Category QV-1A code with same QoI characteristic a
No
- Summarize sensitivity studies conducted in prior approval.
SS-2 Category QV-1A code with different QoI characteristic Limited, focused on inputs related to QoI
- Summarize past sensitivity studies conducted in prior approval and current sensitivity studies.
SS-3 Category QV-1B or QV-1C code with limited independent variables (e.g., <5, determined on a case-by-case basis)
Limited, focused on impact of modification
- Summarize past and current sensitivity studies.
SS-4 Category QV-1B or QV-1C code with many independent variables (e.g., >5, determined on a case-by-case basis)
Yes, focused on inputs related to QoI
- Summarize past and current sensitivity studies.
- List the uncertain assumptions that are considered for sensitivity studies.
- State the impact and conclusion of each sensitivity study.
- Give the rationale for why certain assumptions were or were not considered for sensitivity studies.
- Provide the specific question(s) each sensitivity study is attempting to answer.
- Describe a reference realization.
- Describe how each sensitivity study is translated into model realizations and compare the study and the reference realization.
- List changes to the code and the QA procedure used.
SS-5 Category QV-2 or QV-3 code with limited independent variables (e.g., <5, determined on a case-by-case basis)
Yes
- See the submittal guidelines for SS-4.
SS-6 Category QV-2 or QV-3 code with many independent variables (e.g., >5, determined on a case-by-case basis)
Yes, model and input studies
- See the submittal guidelines for SS-4.
a Inputs must remain the same because sensitivity is dependent on the input distributions.
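To illustrate the comparison between a reference realization and a sensitivity study described in Table C-10, the sketch below reruns a toy analysis with one hypothetical assumption changed and reports its effect on the QoI; the model, the assumption, and all parameter values are illustrative only.

import numpy as np

def run_toy_analysis(inspection_interval_years, seed=11):
    # Stand-in for a full PFM analysis; the QoI is a failure probability.
    rng = np.random.default_rng(seed)
    flaw_depth = rng.exponential(scale=2.0, size=100_000)
    # Hypothetical effect of the assumption: longer intervals allow deeper flaws to survive.
    growth = 0.3 * inspection_interval_years
    return float(np.mean(flaw_depth + growth > 12.0))

# Reference realization: baseline assumption documented in the main analysis.
reference_qoi = run_toy_analysis(inspection_interval_years=10)

# Sensitivity study: credible alternative assumption.
alternative_qoi = run_toy_analysis(inspection_interval_years=20)

print(f"Reference QoI:   {reference_qoi:.3e}")
print(f"Alternative QoI: {alternative_qoi:.3e}")
print(f"Ratio (alternative / reference): {alternative_qoi / reference_qoi:.2f}")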
RG 1.245 Revision 0, Page 26

D. IMPLEMENTATION

The NRC staff may use this RG as a reference in its regulatory processes, such as licensing, inspection, or enforcement. However, the NRC staff does not intend to use the guidance in this RG to support NRC staff actions in a manner that would constitute backfitting as that term is defined in 10 CFR 50.109, Backfitting, 10 CFR 72.62, Backfitting, or as described in NRC Management Directive 8.4, Management of Backfitting, Forward Fitting, Issue Finality, and Information Requests, (Ref. 23) nor does the NRC staff intend to use the guidance to affect the issue finality of an approval under 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants. The staff also does not intend to use the guidance to support NRC staff actions in a manner that constitutes forward fitting as that term is defined and described in Management Directive 8.4. If a licensee believes that the NRC is using this RG in a manner inconsistent with the discussion in this Implementation section, then the licensee may file a backfitting or forward fitting appeal with the NRC in accordance with the process in Management Directive 8.4.
RG 1.245 Revision 0, Page 27

GLOSSARY

acceptance criteria Set of conditions that must be met to achieve success for the desired application.
aleatory uncertainty Uncertainty based on the inherent randomness of the events or phenomena being modeled, which cannot be reduced by increasing the analyst's knowledge of the systems being modeled.
assumptions Decisions or judgments that are made in the development of a model or analysis.
best estimate Approximation of a quantity based on the best available information. Also describes models that attempt to fit data or phenomena as well as possible; that is, models that do not intentionally bound data for a given phenomenon or are not intentionally conservative or optimistic.
code The computer implementation of algorithms developed to facilitate the formulation and approximate solution of a class of problems.
conservative analysis An analysis that uses assumptions such that the assessed outcome is meant to be less favorable than the expected outcome.
convergence analysis An analysis with the purpose of assessing the approximation error in the quantity of interest estimates to establish that conclusions of the analysis would not change solely due to sampling uncertainty.
correlation A general term for interdependence between pairs of variables.
deterministic A characteristic of decision-making in which results from engineering analyses not involving probabilistic considerations are used to support a decision. Consistent with the principles of determinism, which hold that specific causes completely and certainly determine effects of all sorts. Also refers to fixed model inputs.
distribution A function specifying the values that the random variable can take and the likelihood they will occur.
epistemic uncertainty The uncertainty related to the lack of knowledge or confidence about the system or model; also known as state-of-knowledge uncertainty. The American Society of Mechanical Engineers/American Nuclear Society PRA standard (Ref. 24) defines epistemic uncertainty as the uncertainty attributable to incomplete knowledge about a phenomenon that affects our ability to model it. Epistemic uncertainty is reflected in ranges of values for parameters, a range of viable models, the level of model detail, multiple expert interpretations, and statistical confidence. In principle, epistemic uncertainty can be reduced by the accumulation of additional information.
(Epistemic uncertainty is sometimes also called modeling uncertainty.)
expert judgment Information (or opinion) provided by one or more technical experts based on their experience and knowledge. Used when there is a lack of information, for example, if certain parameter values are unknown, or there are questions about phenomenology in accident progression. May be part of a structured approach, such as expert elicitation, but is not necessarily as formal. May be the opinion of one or more experts, whereas expert elicitation is a highly structured process in which the opinions of several experts are sought, collected, and aggregated in a very formal way.
RG 1.245 Revision 0, Page 28
global sensitivity analysis The study of how the uncertainty in the output or quantity of interest of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input. The term global ensures that the analysis considers more than just local or one-factor-at-a-time effects. Hence, interactions and nonlinearities are important components of a global statistical sensitivity analysis.
important input variable An input variable whose uncertainty contributes substantially to the uncertainty in the response.
input Data or parameters that users can specify for a model; the output of the model varies as a function of the inputs, which can consist of physical values (e.g., material properties, tolerances) and model specifications (e.g., spatial resolution).
local sensitivity analysis A sensitivity analysis that is performed relative to a chosen location in the input space rather than over the entire input space.
model A representation of a physical process that allows for prediction of the process behavior.
outputs A value calculated by the model given a set of inputs.
parameter A numerical characteristic of a population or probability distribution. More technically, the variables used to calculate and describe frequencies and probabilities.
probabilistic A characteristic of an evaluation that considers the likelihood of events.
quantity of interest A numerical characteristic of the system being modeled, the value of which is of interest to stakeholders, typically because it informs a decision. Can refer to either a physical quantity that is an output from a model or a given feature of the probability distribution function of the output of a deterministic model with uncertain inputs.
random variable A variable, the values of which occur according to some specified probability distribution.
realization The execution of a model for a single set of input parameter values.
risk-informed A characteristic of decision making in which risk results or insights are used together with other factors to support a decision.
sampling The process of selecting some part of a population to observe, so as to estimate something of interest about the whole population.
sampling uncertainty The uncertainty in an estimate of a quantity of interest that arises due to finite sampling. Different sets of model realizations will result in different estimates. This type of uncertainty contributes to uncertainty in the true value of the quantity of interest and is often summarized using the sampling variance.
sensitivity analysis The study of how uncertainty in the output of a model can be apportioned to different sources of uncertainty in the model input.
sensitivity studies PFM analyses that are conducted under credible alternative assumptions.
simulation The execution of a computer code to mimic an actual system. Typically, comprises a set of model realizations.
RG 1.245 Revision 0, Page 29
software quality assurance A planned and systematic pattern of all actions necessary to provide adequate confidence that a software item or product conforms to established technical requirements; a set of activities designed to evaluate the process by which the software products are developed or manufactured.
surrogate A function that predicts outputs from a model as a function of the model inputs. Also known as response surface, metamodel, or emulator.
uncertainty propagation Quantifying the uncertainty of a model's responses that results from the propagation through the model of the uncertainty in the model's inputs.
validation The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model.
variance The second moment of a probability distribution, defined as E[(X - μ)^2], where μ is the first moment (mean) of the random variable X. A common measure of variability around the mean of a distribution.
verification The process of determining whether a computer program (code) correctly solves the mathematical-model equations. This includes code verification (determining whether the code correctly implements the intended algorithms) and solution verification (determining the accuracy with which the algorithms solve the mathematical-model equations for specified quantities of interest).
RG 1.245 Revision 0, Page 30

LIST OF FIGURES
Figure C-1: PFM analysis flowchart in support of regulatory submittals (corresponding sections of this guide are shown in parentheses when applicable)........................................................................................ 9
Figure C-2: Decision tree for statistical convergence categories............................................................... 19
RG 1.245 Revision 0, Page 31

LIST OF TABLES
Table C-1: Submittal Content Mapping to NUREG/CR-7278.................................................................. 10
Table C-2: SQA and V&V Code Categories............................................................................................. 13
Table C-3: Submittal Guidelines for Models............................................................................................. 15
Table C-4: Categorization Based on Knowledge and the Importance of Inputs Used in the Analysis...... 16
Table C-5: Submittal Guidelines for Inputs............................................................................................... 17
Table C-6: Submittal Guidelines for Uncertainty Propagation.................................................................. 18
Table C-7: Submittal Guidelines for Statistical Convergence................................................................... 20
Table C-8: Submittal Guidelines for Sensitivity Analyses........................................................................ 22
Table C-9: Submittal Guidelines for Output Uncertainty Characterization............................................... 23
Table C-10: Submittal Guidelines for Sensitivity Studies......................................................................... 25
RG 1.245 Revision 0, Page 32

REFERENCES
1 U.S. Nuclear Regulatory Commission, 10 CFR Part 50: Domestic Licensing of Production and Utilization Facilities, Washington, DC: U.S. NRC.
2 U.S. Nuclear Regulatory Commission, 10 CFR Part 52: Licenses, Certifications, and Approvals for Nuclear Power Plants, Washington, DC: U.S. NRC.
3 U.S. Nuclear Regulatory Commission, 10 CFR Part 71: Packaging and Transportation of Radioactive Material, Washington, DC: U.S. NRC.
4 U.S. Nuclear Regulatory Commission, 10 CFR Part 72: Licensing Requirements for the Independent Storage of Spent Nuclear Fuel, High-Level Radioactive Waste, and Reactor-Related Greater Than Class C Waste, Washington, DC: U.S. NRC.
5 U.S. Nuclear Regulatory Commission, NUREG-0800: Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition, Agencywide Documents Access and Management System (ADAMS) Accession No. ML042080088, Washington, DC, U.S. NRC.
6 L. Hund, J. Lewis, N. Martin, M. Starr, D. Brooks, A. Zhang, R. Dingreville, A. Eckert, J. Mullins, P. Raynaud, D. Rudland, D. Dijamco, and S. Cumblidge, NUREG/CR-7278: Technical Basis for the Use of Probabilistic Fracture Mechanics in Regulatory Applications, ADAMS Accession No. ML21257A237, Washington, DC: U.S. NRC.
7 U.S. Nuclear Regulatory Commission, RG-1.174: An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis, Washington, DC, USA: U.S. NRC.
8 U.S. Nuclear Regulatory Commission, RG-1.175: An Approach for Plant-Specific, Risk-Informed Decisionmaking: Inservice Testing, Washington, DC: U.S. NRC.
9 U.S. Nuclear Regulatory Commission, RG-1.178: An Approach for Plant-Specific Risk-Informed Decisionmaking for Inservice Inspection of Piping, Washington, DC: U.S. NRC.
10 U.S. Nuclear Regulatory Commission, RG-1.200: An Approach for Determining the Technical Adequacy of Probabilistic Risk Assessment Results for Risk-Informed Activities, Washington, DC, USA: U.S. NRC.
1 Publicly available NRC published documents are available electronically through the NRC Library on the NRCs public Web site at http://www.nrc.gov/reading-rm/doc-collections/ and through the NRCs Agencywide Documents Access and Management System (ADAMS) at http://www.nrc.gov/reading-rm/adams.html. The documents can also be viewed online or printed for a fee in the NRCs Public Document Room (PDR) at 11555 Rockville Pike, Rockville, MD. For problems with ADAMS, contact the PDR staff at 301-415-4737 or (800) 397-4209; fax (301) 415-3548; or e-mail pdr.resource@nrc.gov.
IAEA safety requirements and guides may be found at WWW.IAEA.Org/ or by writing the International Atomic Energy Agency, P.O. Box 100 Wagramer Strasse 5, A-1400 Vienna, Austria; telephone (+431) 2600-0; fax (+431) 2600-7; or e-mail Official.Mail@IAEA.Org. It should be noted that some of the international recommendations do not correspond to the NRC requirements which take precedence over the international guidance.
RG 1.245 Revision 0, Page 33

11 U.S. Nuclear Regulatory Commission, RG-1.201: Guidelines for Categorizing Structures, Systems, and Components in Nuclear Power Plants According to Their Safety Significance, Washington, DC: U.S. NRC.
12 P.T. Williams, T.L. Dickson, B.R. Bass, and H.B. Klasky, Fracture Analysis of Vessels - Oak Ridge FAVOR, v16.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations, ORNL/LTR-2016/309, Oak Ridge, TN, September 2016.
13 U.S. Nuclear Regulatory Commission, xLPR v2.1 Public Release Announcement, ADAMS Accession No. ML20157A120, Washington, DC: U.S. NRC.
14 Patrick A.C. Raynaud, RG 1.99 Revision 2 Update FAVOR Scoping Study, Technical Letter Report TLR-RES/DE/CIB-2020-09, October 2020, ADAMS Accession No. ML20300A551, Washington, DC: U.S. NRC.
15 U.S. Nuclear Regulatory Commission, RG-1.99: Radiation Embrittlement of Reactor Vessel Materials, Washington, DC: U.S. NRC.
16 P. Raynaud, M. Kirk, M. Benson and M. Homiack, "TLR-RES/DE/CIB-2018-01: Important Aspects of Probabilistic Fracture Mechanics Analyses," U.S. Nuclear Regulatory Commission, Washington, DC, 2018.
17 Palm, N., White Paper on Suggested Content for PFM Submittals to the NRC, BWRVIP 2019-016, ADAMS Accession No. ML19241A545, Electric Power Research Institute, Palo Alto, CA, 2019.
18 U.S. Nuclear Regulatory Commission, International Policy Statement, ADAMS Accession No. ML14132A317, Washington, DC: U.S. NRC.
19 U.S. Nuclear Regulatory Commission, Management Directive and Handbook 6.6, Regulatory Guides, ADAMS Accession No. ML16083A122, Washington, DC: U.S. NRC.
20 International Atomic Energy Agency, Development and Application of Level 1 Probabilistic Safety Assessment for Nuclear Power Plants, IAEA Safety Standards Series No. SSG-3, IAEA, Vienna (2010).
21 Matthew Homiack, Giovanni Facco, Michael Benson, Marjorie Erickson, and Craig Harrington, NUREG-2247: Extremely Low Probability of Rupture Version 2 Probabilistic Fracture Mechanics Code, ADAMS Accession No. ML21225A736, Washington, DC, U.S. NRC.
22 U.S. Nuclear Regulatory Commission, Safety Evaluation Report related to Westinghouse Owners Group application of Risk-Informed Methods to Piping Inservice Inspection (Topical Report WCAP-14572, Revision 1), December 1998, Washington, DC: U.S. NRC.
23 U.S. Nuclear Regulatory Commission, Management Directive and Handbook 8.4: Management of Backfitting, Forward Fitting, Issue Finality, and Information Requests, U.S. NRC, Washington, DC, USA, 2019.
24 American Society of Mechanical Engineers/American Nuclear Society, ASME/ANS RA-S-2008 (R2019): Standard for Level 1 / Large Early Release Frequency Probabilistic Risk Assessment for Nuclear Power Plant Applications, New York, NY: ASME, 2008 (R2019).