ML043170314

Draft Regulatory Guide DG-1130: Proposed Revision 2 of Regulatory Guide 1.152, Criteria for Use of Computers in Safety Systems of Nuclear Power Plants
Person / Time
Issue date: 12/31/2004
From: Aggarwal S
Office of Nuclear Regulatory Research
Contact: Aggarwal S, (301) 415-6005
References
DG-1130 RG-1.152, Rev 2



U.S. NUCLEAR REGULATORY COMMISSION
OFFICE OF NUCLEAR REGULATORY RESEARCH
Division 1
December 2004

DRAFT REGULATORY GUIDE DG-1130
(Proposed Revision 2 of Regulatory Guide 1.152)

Contact: Satish K. Aggarwal, (301) 415-6005

CRITERIA FOR USE OF COMPUTERS IN SAFETY SYSTEMS OF NUCLEAR POWER PLANTS

A. INTRODUCTION

This regulatory guide describes a method that the staff of the U.S. Nuclear Regulatory Commission (NRC) deems acceptable for complying with the NRC's regulations for promoting high functional reliability and design quality for the use of computers(1) in safety systems of nuclear power plants. Specifically, General Design Criterion (GDC) 21, Protection System Reliability and Testability, of Appendix A, General Design Criteria for Nuclear Power Plants, to Title 10, Part 50, Domestic Licensing of Production and Utilization Facilities, of the Code of Federal Regulations (10 CFR Part 50), requires, among other things, that protection systems (or safety systems) be designed for high functional reliability commensurate with the safety functions to be performed. Criterion III, Design Control, of Appendix B, Quality Assurance Criteria for Nuclear Power Plants and Fuel Reprocessing Plants, to 10 CFR Part 50, requires, among other things, that quality standards be specified and that design control measures be provided for verifying or checking the adequacy of design.

(1) For the purposes of this regulatory guide, the term computer means a system that includes computer hardware, software, firmware, and interfaces.

This regulatory guide is being issued in draft form to involve the public in the early stages of the development of a regulatory position in this area.

It has not received staff review or approval and does not represent an official NRC staff position.

Public comments are being solicited on this draft guide (including any implementation schedule) and its associated regulatory analysis or value/impact statement. Comments should be accompanied by appropriate supporting data. Written comments may be submitted to the Rules and Directives Branch, Office of Administration, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001. Comments may be submitted electronically through the NRC's interactive rulemaking Web page at http://www.nrc.gov/what-we-do/regulatory/rulemaking.html.

Copies of comments received may be examined at the NRC Public Document Room, 11555 Rockville Pike, Rockville, MD. Comments will be most helpful if received by February 11, 2005.

Requests for single copies of draft or active regulatory guides (which may be reproduced) or for placement on an automatic distribution list for single copies of future draft guides in specific divisions should be made to the U.S. Nuclear Regulatory Commission, Washington, DC 20555, Attention: Reproduction and Distribution Services Section; by fax to (301) 415-2289; or by email to Distribution@nrc.gov. Electronic copies of this draft regulatory guide are available through the NRC's interactive rulemaking Web page (see above); the NRC's public Web site under Draft Regulatory Guides in the Regulatory Guides document collection of the NRC's Electronic Reading Room at http://www.nrc.gov/reading-rm/doc-collections/;

and the NRC's Agencywide Documents Access and Management System (ADAMS) at http://www.nrc.gov/reading-rm/adams.html, under Accession No. ML043170314. Note, however, that the NRC has temporarily suspended public access to ADAMS so that the agency can complete security reviews of publicly available documents and remove potentially sensitive information. Please check the NRC's Web site for updates concerning the resumption of public access to ADAMS.

This regulatory guide also contains the staff's regulatory position on the Standard Criteria for Digital Computers in Safety Systems of Nuclear Power Generating Stations,(2) which the Nuclear Power Engineering Committee of the Institute of Electrical and Electronics Engineers (IEEE) has promulgated as IEEE Std 7-4.3.2-2003. The NRC staff has collaborated in the development of IEEE Std 7-4.3.2-2003 to ensure that the guidance provided by the consensus standard is consistent with the NRC's regulations. This standard evolved from IEEE Std 7-4.3.2-1993 and reflects advances in digital technology. It also represents a continued effort by IEEE to support the specification, design, and implementation of computers in safety systems of nuclear power plants. In addition, IEEE Std 7-4.3.2-2003 specifies computer-specific requirements to supplement the criteria and requirements of IEEE Std 603-1998, Standard Criteria for Safety Systems for Nuclear Power Generating Stations.

Regulatory guides are issued to describe to the public methods that the NRC staff considers acceptable for use in implementing specific parts of the agency's regulations, to explain techniques that the staff uses in evaluating specific problems or postulated accidents, and to provide guidance to applicants. Regulatory guides are not substitutes for regulations, and compliance with regulatory guides is not required. Regulatory guides are issued in draft form to solicit public comment and involve the public in developing the agency's regulatory positions.

Draft regulatory guides have not received complete staff review; therefore, they do not represent official NRC staff positions.

This draft regulatory guide contains information collections that are covered by the requirements of 10 CFR Part 50, which the Office of Management and Budget (OMB) approved under OMB control number 3150-0011. The NRC may neither conduct nor sponsor, and a person is not required to respond to, an information collection request or requirement unless the requesting document displays a currently valid OMB control number.

(2) IEEE publications may be purchased from the IEEE Service Center, 445 Hoes Lane, Piscataway, NJ 08854.


B. DISCUSSION

Instrumentation and control (I&C) system designs that use computers in safety systems make extensive use of advanced technology (i.e., equipment and design practices).

These designs are expected to be significantly and functionally different from current designs, and may include the use of microprocessors, digital systems and displays, fiber optics, multiplexing, and different isolation techniques to achieve sufficient independence and redundancy.

With the introduction of digital systems into plant safety system designs, concerns have emerged regarding the possibility that a design error in the software in redundant channels of a safety system could lead to common-cause or common-mode failure of the safety system function. Conditions may exist under which some form of diversity may be necessary to provide additional assurance beyond that provided by the design and quality assurance (QA) programs that incorporate software QA and verification and validation (V&V). The design techniques of functional diversity, design diversity, diversity in operation, and diversity within the four echelons of defense in depth (provided by the reactor protection, engineered safety features actuation, control, and monitoring I&C systems) can be applied as defense against common-cause failures. Manual operator actuations of safety and nonsafety systems are acceptable, provided that the necessary diverse controls and indications are available to perform the required function under the associated event conditions and within the acceptable time.

The justification for equipment diversity, or for the diversity of related system software such as a real-time operating system, must extend to equipment components to ensure that actual diversity exists. For example, different manufacturers might use the same processor or license the same operating system, thereby incorporating common failure modes. Claims of diversity based only on the use of different manufacturers are therefore insufficient without consideration of these common elements.

With respect to software diversity, experience indicates that independence of failure modes may not be achieved in cases where multiple versions of software are developed from the same software requirements. Other considerations, such as functional and signal diversity, that lead to different software requirements form a stronger basis for diversity.

Some safety system designs may use computers that were not specifically designed for nuclear power plant applications. Clause 5.4.2 of IEEE Std 7-4.3.2-2003 provides general guidance for commercial grade dedication. Annex C to this standard provides useful information on providing confidence that an existing commercial computer is of sufficiently high quality and reliability to be used in a safety system.

IEEE Std 7-4.3.2-2003 does not provide guidance regarding security measures for computer-based system equipment and software systems. Consequently, the NRC has modified this draft regulatory guide to include Regulatory Positions 2.1 - 2.9, which provide specific guidance concerning computer-based (cyber) safety system security.


Clause 5.9 of IEEE Std 7-4.3.2-2003, Control of Access, refers to the applicable requirements in IEEE Std 603-1998 and states, "The design shall permit the administrative control of access to safety system equipment. These administrative controls shall be supported by provisions within the safety systems, by provision in the generating station design, or by a combination thereof." For digital computer-based systems, controls of both physical and electronic access to system software and data should be provided to prevent changes by unauthorized personnel. Controls should address access via network connections and via maintenance equipment. Additionally, the design of the plant data communication systems should ensure that the systems do not present an electronic path by which unauthorized personnel can change plant software or display erroneous plant status information to the operators. Annex E to IEEE Std 7-4.3.2-2003 provides useful information for establishing communication independence of plant equipment and systems.

Computer-based systems must be secure from electronic vulnerabilities, as well as from physical vulnerabilities, which have been well addressed. Security of computer-based system software relates to the ability to prevent unauthorized, undesirable, and unsafe intrusions throughout the life cycle of the safety system. Computer-based systems are secure from electronic vulnerabilities if unauthorized access and use of those systems is prevented.

The security of computer-based systems is established through (1) designing into the systems the security features that will meet user security requirements, (2) developing the systems without undocumented code (e.g., back door coding, viruses, worms, Trojan horses, and bomb codes), and (3) installing and maintaining those systems in accordance with the user's security program.

Regulatory Positions 2.1 - 2.9 (presented in Section C of this draft regulatory guide) provide specific guidance concerning safety system security. The effectiveness of the security functions implemented in the software safety system should be confirmed during verification and validation (V&V) and in the configuration management of the safety system software in each lifecycle phase.

In addition to the aspects discussed in Section C of this draft regulatory guide, IEEE Std 7-4.3.2-2003 includes seven informative annexes. As discussed below, the NRC has not endorsed Annexes B - F:

(a) Annex A, Mapping of IEEE Std 603-1998 to IEEE Std 7-4.3.2-2003, does not provide any guidance or requirements.

(b) Annex B, Diversity Requirements Determination, is not endorsed by the NRC because it provides inadequate guidance. Branch Technical Position (BTP) HICB-19, Guidance for Evaluation of Defense-in-Depth and Diversity in Digital Computer-Based Instrumentation and Control Systems, in NUREG-0800, Standard Review Plan, Section 7, Instrumentation and Controls, provides additional guidance.

(c) Annex C, Dedication of Existing Commercial Computers, is not endorsed by the NRC because it provides inadequate guidance. Adequate guidance is available in EPRI TR-106439, Guideline on Evaluation and Acceptance of Commercial Grade Digital Equipment for Nuclear Safety Applications, which the NRC has endorsed.


(d) Annex D, Identification and Resolution of Hazards, provides general information regarding the use of qualitative or quantitative fault tree analysis (FTA) and failure modes and effects analysis (FMEA) techniques throughout the system development life cycle. The staff agrees that FTA and FMEA are well-known techniques for analyzing potential hazards; however, this annex is not endorsed because it provides inadequate guidance concerning the use of FTA and FMEA. Guidance is provided in Branch Technical Position HICB-14, Guidance on Software Reviews for Digital Computer-Based Instrumentation and Control Systems.

(e) Annex E, Communication Independence, is not endorsed by the NRC because it provides insufficient guidance. Additional guidance is provided in Appendix 7.0-A, Review Process for Digital Instrumentation and Control Systems, Appendix 7.1-C, Guidance for Evaluation of Conformance to IEEE Std 603, and Section 7.9, Data Communication Systems, in NUREG-0800.

(f) Annex F, Computer Reliability, describes an approach for measuring the reliability of digital computers used in safety systems. The NRC does not endorse the concept of quantitative reliability goals as a sole means of meeting its regulations for reliability of digital computers used in safety systems. The NRC's acceptance of the reliability of computer systems is based on deterministic criteria for both hardware and software. Quantitative reliability determination, using a combination of analysis, testing, and operating experience, can provide an added level of confidence in the reliable performance of the computer systems (an illustrative numerical sketch follows this list).

(g) Annex G, Bibliography, provides the references used in the standard. The bibliography provides sufficient detail to enable users to obtain further information regarding specific areas of the standard.
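With regard to item (f) above, the following is a minimal illustrative sketch (in Python) of how failure-free demand testing can add quantitative confidence to an otherwise deterministic reliability case. It is not NRC-endorsed acceptance criteria and is not part of this guide's regulatory position; the function name and example demand count are hypothetical.

    # Illustrative sketch only: an upper confidence bound on per-demand failure
    # probability after n independent, failure-free test demands. Quantitative
    # figures of this kind supplement, and do not replace, deterministic criteria.
    def upper_bound_failure_prob(n_demands: int, confidence: float = 0.95) -> float:
        """Solve (1 - p)**n_demands = 1 - confidence for p."""
        if n_demands <= 0:
            raise ValueError("n_demands must be positive")
        return 1.0 - (1.0 - confidence) ** (1.0 / n_demands)

    # Example: 3,000 failure-free demands bound the per-demand failure probability
    # at roughly 1E-3 with 95% confidence (approximately 3/n, the "rule of three").
    print(f"{upper_bound_failure_prob(3000):.1e}")  # ~1.0e-03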


C. REGULATORY POSITION

1. Functional and Design Requirements

Conformance with the requirements of IEEE Std 7-4.3.2-2003, Standard Criteria for Digital Computers in Safety Systems of Nuclear Power Generating Stations, is a method that the NRC staff has deemed acceptable for satisfying the NRC's regulations with respect to high functional reliability and design requirements for computers used in safety systems of nuclear power plants, with the exception that the use of barriers, as proposed in Clause 5.6(a) of IEEE Std 7-4.3.2, is not acceptable to the NRC as a means of ensuring independence between safety functions and nonsafety functions on the same computer.

However, Clause 5.6(b) of IEEE Std 7-4.3.2 requires that, in the absence of such barriers, all software on a safety-related computer must be developed in accordance with IEEE Std 7-4.3.2-2003 and IEEE Std 603. This approach is acceptable to the NRC for meeting its existing regulatory requirements for addressing independence between safety software and nonsafety software residing on the same computer.

2. Security

This regulatory position uses the waterfall lifecycle phases as a framework for describing specific digital safety system security guidance. Lifecycles other than the waterfall lifecycle may be used. The digital safety system development process should address potential security vulnerabilities in each phase of the digital safety system lifecycle.

The typical waterfall lifecycle consists of the following phases:

  • Concepts
  • Requirements
  • Design
  • Implementation
  • Test
  • Installation and Checkout
  • Operation
  • Maintenance
  • Retirement

The lifecycle phase-specific security requirements should be commensurate with the risk and magnitude of the harm resulting from unauthorized access, use, disclosure, disruption, or destruction of the digital safety system.
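The following is a minimal illustrative sketch (in Python, not part of the regulatory position) of one way a licensee's planning tooling might record phase-specific security activities against the lifecycle phases listed above. The activity entries are hypothetical examples; in practice they would be derived from the risk analyses described in Regulatory Positions 2.1 - 2.9.

    # Illustrative sketch only: lifecycle phases from the list above, with
    # hypothetical example security activities attached to some of them.
    from enum import Enum

    class LifecyclePhase(Enum):
        CONCEPTS = "Concepts"
        REQUIREMENTS = "Requirements"
        DESIGN = "Design"
        IMPLEMENTATION = "Implementation"
        TEST = "Test"
        INSTALLATION_AND_CHECKOUT = "Installation and Checkout"
        OPERATION = "Operation"
        MAINTENANCE = "Maintenance"
        RETIREMENT = "Retirement"

    SECURITY_ACTIVITIES = {
        LifecyclePhase.CONCEPTS: ["Perform security risk analysis",
                                  "Allocate security functions to hardware and software"],
        LifecyclePhase.REQUIREMENTS: ["Specify security functional and performance requirements"],
        LifecyclePhase.TEST: ["Test each security feature",
                              "Scan for undocumented or malicious code"],
        LifecyclePhase.OPERATION: ["Monitor and record access",
                                   "Perform periodic security audits"],
    }

    # Print the planned security activities in lifecycle order.
    for phase in LifecyclePhase:
        for activity in SECURITY_ACTIVITIES.get(phase, []):
            print(f"{phase.value}: {activity}")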

The user should establish a security quality assurance program and a security configuration management program as part of its security program. The security quality assurance program and security configuration management program can be incorporated into the existing quality assurance and configuration management programs.

The Quality Assurance organization should conduct periodic audits to determine the effectiveness of the digital safety system security program.

Regulatory Positions 2.1 - 2.9 describe digital safety system security activities and recommendations for the individual phases of the waterfall lifecycle.


2.1 Concepts Phase

In the concepts phase, the user and developer should delineate safety system security features that should be implemented to meet the desired system security capabilities. During this activity, the system architecture is selected and the desired safety system security functional capabilities are allocated to hardware, software, and user interface components.

The user and developer should perform security risk analyses to identify potential security vulnerabilities in the relevant phases of the system and software life cycle.

The results of the analysis should be used to establish security requirements for the system (hardware and software).

Remote access to the safety system software functions or data from outside the technical environment of the plant (e.g., from the administrative or engineering buildings or from outside the plant) that involves a potential security threat to safety functions should not be implemented.

2.2 Requirements Phase

2.2.1 System Features

The users and developers should define the security functional and performance requirements; interfaces external to the system; and the requirements for qualification, human factors engineering, data definitions, user documentation for the software and hardware, installation and acceptance, user operation and execution, and user maintenance.

The security requirements are part of the overall system requirements. Therefore, the V&V process of the overall system should ensure the correctness, completeness, accuracy, testability, and consistency of the system software and hardware system requirements, which include security requirements.

Requirements specifying the use of pre-developed software (e.g., reuse software and commercial off-the-shelf software) should minimize the vulnerability of the safety system (e.g., by minimizing the number of pre-developed software functions used by the safety system to the extent necessary or using existing security functions of the pre-developed software).

2.2.2 Development Activities

The developer should delineate its security policies to ensure that the developed products (hardware and software) do not contain undocumented code (e.g., back door coding), malicious code (e.g., intrusions, viruses, worms, Trojan horses, or bomb codes), or other unwanted and undocumented functions or applications.


2.3 Design Phase

2.3.1 System Features

The safety system software security requirements identified in the system requirements specification should be translated into specific design configuration items in the software design description. The safety system software security design configuration items should address control over (1) access to the software functions, (2) use of safety system services, (3) data communication with other systems, and (4) the list of personnel who may access and use the system.

Design configuration items that incorporate pre-developed software into the safety system should be specified such that the security vulnerability of the safety system is minimized.

Access control should be based on the results of risk analyses. The results of the analyses may require more complex access control, such as a combination of knowledge (e.g., a password), property (e.g., a key or smart card), and personal features (e.g., fingerprints), rather than just a password.
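The following is a minimal illustrative sketch (in Python, not part of the regulatory position) of a combined-factor access check of the kind described above. The class, function, and parameter names are hypothetical, and the number of required factors would follow from the system-specific risk analyses.

    # Illustrative sketch only: access is granted when enough independent
    # factors (knowledge, property, personal feature) have been verified.
    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        password_ok: bool     # knowledge factor verified
        smart_card_ok: bool   # property factor verified
        fingerprint_ok: bool  # personal-feature factor verified

    def grant_access(request: AccessRequest, required_factors: int = 2) -> bool:
        """Grant access only if at least required_factors independent factors
        were verified; the required number follows from the risk analysis."""
        verified = sum([request.password_ok, request.smart_card_ok, request.fingerprint_ok])
        return verified >= required_factors

    # Example: a password plus a smart card satisfies a two-factor requirement.
    print(grant_access(AccessRequest(password_ok=True, smart_card_ok=True, fingerprint_ok=False)))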

2.3.2 Development Activities

The developer should delineate the standards and procedures that will conform to the applicable security policies to ensure that the system design products (hardware and software) do not contain undocumented code (e.g., back door coding), malicious code (e.g., intrusions, viruses, worms, Trojan horses, or bomb codes), or other unwanted or undocumented functions or applications.

2.4 Implementation Phase

In the software implementation phase, the system design is transformed into code, database structures, and related machine-executable representations. The implementation activity addresses software coding and testing, including the incorporation of reused software products.

2.4.1 System Features

The developer should ensure that the security design configuration item transformations from the system design specification are correct, accurate, and complete.


2.4.2 Development Activities

The developer should implement security procedures and standards to protect against tampering with the developed software. The developer's standards and procedures should include testing, including scanning, to guard against undocumented code or malicious code that might (1) allow unauthorized access or use of the system or (2) cause systems to behave outside the system requirements. There should be provisions against the incorporation of hidden functions in the application development software or the system software that could support potential unauthorized access. If such provisions cannot be implemented for pre-developed software, the use of such software should be justified in light of the potential security threats.

The user and developer should review the possibility of deliberate modification of software to cause erroneous software behavior triggered by certain time or data constraints (e.g., viruses, worms, and Trojan horses).
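As one simple illustration of the tamper-protection and scanning measures discussed in this subsection, the following sketch (in Python, not part of the regulatory position) compares a cryptographic hash of an installed software image against a baseline recorded under configuration management. The file path and baseline digest shown are hypothetical placeholders.

    # Illustrative sketch only: detect tampering by comparing a SHA-256 digest
    # of the installed image with the configuration-managed baseline digest.
    import hashlib

    def sha256_of_file(path: str) -> str:
        """Return the SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_image(path: str, baseline_digest: str) -> bool:
        """True only if the installed image matches the baseline digest."""
        return sha256_of_file(path) == baseline_digest

    # Example call with hypothetical values:
    # verify_image("/opt/safety_app/image.bin", "9f86d081884c7d659a2feaa0c55ad015...")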

2.5 Test Phase

The objective of testing software security functions is to ensure that the software security requirements and system security requirements allocated to software are validated by execution of integration, system, and acceptance tests. Testing includes software testing, software integration testing, software qualification testing, system integration testing, and system qualification testing.

2.5.1 System Features

The security requirements and configuration items are part of the overall system requirements and design configuration items. Therefore, testing security design configuration items is just one element of the overall system testing. The user and developer should test each system security feature to verify that the implemented system does not increase the risk of security vulnerabilities.

2.5.2 Development Activities

The developer should perform testing and scanning to ensure that the developed products (i.e., hardware and software) do not contain undocumented code (e.g., back door coding), malicious code (e.g., intrusions, viruses, worms, Trojan horses, or bomb codes), or other unwanted and undocumented functions or applications. Additionally, the developer should audit the configuration management processes to ensure that the software is developed in accordance with the appropriate configuration management procedures and standards.

2.6 Installation and Checkout Phase

During installation and checkout, the safety system is installed and tested in the target environment. The system user should perform an acceptance review and test the safety system security features. The objective of installation and checkout security testing is to verify and validate the correctness of the safety system security features in the target environment.


2.6.1 System Features

The user should ensure that the system features enable the user to perform post-installation testing of the system to verify and validate that the security requirements have been incorporated into the system appropriately.

2.6.2 Development Activities

A user or licensee should have a comprehensive digital system security program.

The security policies, standards, and procedures should ensure that installation of the digital system will not compromise the security of the digital system, other systems, or the plant.

This may require the user to perform a security assessment, which includes a risk assessment, to identify the potential security vulnerabilities caused by installation of the digital system. The risk assessment should include an evaluation of new security constraints in the system; an assessment of the proposed system changes and their impact on system security; and an evaluation of operating procedures for correctness and usability. The results of this assessment should provide a technical basis for establishing certain security levels for the systems and the plant.

2.7 Operation Phase

The operation lifecycle process involves the use of the safety system by the end user in its intended operational environment.

The user should monitor and record access and use of the system to ensure that its digital system security policies are implemented properly. The monitoring should include real-time monitoring and periodic audits. The type of monitoring is determined by the risk analyses performed in earlier lifecycle phases. The audit should include the security of any equipment that is connected to the system for maintenance.
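As a minimal illustrative sketch (in Python, not part of the regulatory position) of the access monitoring and recording described above, the following records access events to an append-only log for later periodic audit. The record fields, example values, and log location are hypothetical; real-time monitoring and retention requirements would follow from the risk analyses performed in earlier lifecycle phases.

    # Illustrative sketch only: append a timestamped access record to an
    # append-only audit log for later periodic review.
    import json
    import time

    def record_access_event(user_id: str, action: str, source: str,
                            log_path: str = "access_audit.log") -> None:
        """Append one access record, as a JSON line, to the audit log."""
        event = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "user": user_id,
            "action": action,   # e.g., "login" or "parameter_change"
            "source": source,   # e.g., "local_console" or "maintenance_laptop"
        }
        with open(log_path, "a", encoding="utf-8") as log_file:
            log_file.write(json.dumps(event) + "\n")

    # Example: record a maintenance connection so it appears in the periodic audit.
    record_access_event("tech042", "connect", "maintenance_laptop")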

The user should evaluate the impact of safety system changes in the operating environment on safety system security; assess the effect on safety system security of any proposed changes; evaluate operating procedures for compliance with the intended use; and analyze security risks affecting the user and the system. The user should evaluate new security constraints in the system; assess proposed system changes and their impact on system security; and evaluate operating procedures for correctness and usability.


2.8 Maintenance Phase

The maintenance phase is activated when the user changes the system or associated documentation. These changes may be categorized as follows:

  • Modifications (i.e., corrective, adaptive, or perfective changes)
  • Migration (i.e., the movement of software to a new operational environment)
  • Retirement (i.e., the withdrawal of active support by the operation and maintenance organization, partial or total replacement by a new system, or installation of an upgraded system)

System modifications may be derived from requirements specified to correct errors (corrective), to adapt to a changed operating environment (adaptive), or to respond to additional user requests or enhancements (perfective).

2.8.1 Maintenance Activities

Modifications of the safety system should be treated as development processes and should be verified and validated as described above. Security functions should be assessed as described in the above regulatory positions, and should be revised (as appropriate) to reflect requirements derived from the maintenance process.

When migrating software, the user should verify that the migrated software meets the safety system security requirements. The maintenance process should continue to conform to existing safety system security requirements unless those requirements are to be changed as part of the maintenance activity.

2.8.2 Quality Assurance

If the safety system security functions were not previously verified and validated using a level of effort commensurate with the safety system security functional requirements, and appropriate documentation is not available or adequate, the user should determine whether the missing or incomplete documentation should be generated. In making this determination, the user should take the minimum safety system security functional requirements into consideration.

The user should establish a security configuration management program as part of its security program. The security configuration management program may be incorporated into the existing configuration management program.

2.8.3 Incident Response

The user should develop an incident response and recovery plan for responding to digital system security incidents (e.g., intrusions, viruses, worms, Trojan horses, or bomb codes).

The plan should be developed to address various loss scenarios and undesirable operations of plant digital systems, including possible interruptions in service due to the loss of system resources, data, facility, staff, and/or infrastructure. The plan should define contingencies for ensuring minimal disruption to critical services in these instances.
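The following is a minimal illustrative sketch (in Python, not part of the regulatory position) of how a plan's loss scenarios might be mapped to contingency actions. The scenario names and actions are hypothetical examples only.

    # Illustrative sketch only: a skeleton mapping loss scenarios to planned
    # contingency actions for an incident response and recovery plan.
    RESPONSE_PLAN = {
        "loss_of_system_resources": [
            "Switch to qualified backup processing equipment",
            "Notify the on-call I&C engineer and shift supervisor",
        ],
        "suspected_malicious_code": [
            "Isolate the affected equipment from maintenance connections",
            "Restore software from configuration-managed baseline media",
        ],
        "loss_of_data_or_records": [
            "Recover from the most recent verified backup",
            "Document the gap and assess the impact on safety functions",
        ],
    }

    def contingencies(scenario: str) -> list:
        """Return the planned contingency actions for a given loss scenario."""
        return RESPONSE_PLAN.get(scenario, ["Escalate per the site incident procedure"])

    # Example lookup for a hypothetical scenario name:
    print(contingencies("suspected_malicious_code"))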


2.8.4 Audits and Assessments

The user should perform periodic computer system security self-assessments and audits, which are key components of a good security program. The user should assess proposed safety system changes and their impact on safety system security; evaluate anomalies that are discovered during operation; assess migration requirements; and re-perform V&V tasks to ensure that vulnerabilities have not been introduced into the plant environment.

2.9 Retirement Phase

In the retirement lifecycle phase, the user should assess the effect of replacing or removing the existing safety system security functions from the operating environment.

The user should include in the scope of this assessment the effect on safety and nonsafety system interfaces of removing the system security functions. The user should document the methods by which a change in the safety system security functions will be mitigated (e.g., replacement of the security functions, isolation from other safety systems and user interactions, or retirement of the safety system interfacing functions).

3. Referenced Standards

Clause 2 of IEEE Std 7-4.3.2-2003 references several industry codes and standards. If a referenced standard has been separately incorporated into the NRC's regulations, licensees and applicants must comply with the standard as set forth in the regulations. If the referenced standard has been endorsed by the NRC staff in a regulatory guide, the standard constitutes an acceptable method of meeting a regulatory requirement as described in the regulatory guide. If a referenced standard has been neither incorporated into the NRC's regulations nor endorsed in a regulatory guide, licensees and applicants may consider and use the information in the referenced standard, if appropriately justified, consistent with regulatory practice.


D. IMPLEMENTATION

The purpose of this section is to provide information to applicants and licensees regarding the NRC staff's plans for using this draft regulatory guide. No backfitting is intended or approved in connection with the issuance of this guide.

The NRC has issued this draft guide to encourage public participation in its development.

Except when an applicant or licensee proposes or has previously established an acceptable alternative method for complying with specified portions of the NRC's regulations, the methods to be described in the active guide will reflect public comments and will be used in evaluating (1) submittals in connection with applications for construction permits, design certifications, operating licenses, and combined licenses for use of computers in safety systems, and (2) submittals from operating reactor licensees who voluntarily propose to initiate safety system modifications that have a clear nexus with this guidance.

REGULATORY ANALYSIS

Background

With the introduction of computers in safety systems, concerns have arisen over the possibility that the use of computer software could result in a common-mode failure.

Because of these concerns, the NRC staff has placed significant emphasis on defense-in-depth against propagation of common-mode failures within and between functions.

The two principal factors for defense against common-mode failures are quality and diversity.

Each postulated common-mode failure should be analyzed using best-estimate methods to address vulnerabilities to common-mode failures. Design qualification and quality assurance programs are intended to provide protection against design deficiencies and manufacturing errors. The guidelines in IEEE Std 603-1998 and IEEE Std 7-4.3.2-2003 should be applied to the development of digital computer systems for purposes of developing high-quality hardware and software.

1. Problem

IEEE Std 7-4.3.2-1993 was endorsed by Revision 1 of Regulatory Guide 1.152 in January 1996. The development processes for computer systems continue to evolve. The revision of this standard (IEEE Std 7-4.3.2-2003) represents a continued effort by IEEE to support the specification, design, and implementation of computers in safety systems. The regulatory guide should, therefore, be revised to reflect the current state of the technology.

2. Objective

The objective of the regulatory action is to update NRC guidance for the use of computers in safety systems and to provide guidance on safety system security.


3. Technical Approach

Issuing a regulatory guide is consistent with the NRC policy of evaluating the latest versions of national consensus standards in terms of their suitability for endorsement by regulatory guides. This regulatory guide endorses the guidance of IEEE Std 7-4.3.2-2003 with a minor exception. As such, this guide provides a standardized approach so that the nuclear industry and the NRC staff may have a common understanding of the criteria for the use of computers in safety systems.

IEEE Std 7-4.3.2-2003 includes the following significant changes:

(a) The Software Quality Metrics clause was added. The industry practice is moving toward the use of software quality metrics to assure, monitor, and improve software quality, in addition to the verification and validation (V&V) that has traditionally been applied.

(b) The Qualification of Existing Commercial Computers clause was expanded to provide additional guidance that addresses the move toward the use of more commercial hardware and software in safety systems.

(c) The Software Tools clause was revised to address expanded use of software tools and methods.

(d) The Verification and Validation clause was revised to reference IEEE Std 1012-1998.

(e) The Software Configuration Management clause was expanded to provide additional guidance by identifying the key requirements for configuration management for safety system software using the guidance in IEEE Std 828-1998 and IEEE Std 1042-1987.

(f) A Software Project Risk Management clause was added to provide additional guidance consistent with IEEE Std 1540-2001 on risk management, and IEEE Std 12207.0-1996 on software lifecycle processes.

(g) A Fault Detection and Self-Diagnostics clause was added to discuss features that are unique to software and computer systems.

(h) The Identification clause was expanded to include software-specific requirements by extending the IEEE Std 603-1998 identification requirements to software.

(i) Annex C, Dedication of Existing Commercial Computers, was updated to more completely address issues associated with commercial off-the-shelf software (COTS).

(j) Annex D, Identification and Resolution of Hazards, was revised to represent current practices and processes for hazards analysis.

In addition, the staff has provided guidance specific to computer-based (cyber) safety system security.


4. Conclusion

The NRC should revise Regulatory Guide 1.152, since this action should enhance the licensing process. The staff has concluded that the proposed action will reduce unnecessary burden on both the NRC and its licensees, and it will result in an improved process for the use of computers in safety systems. Furthermore, the staff sees no adverse effects associated with revising Regulatory Guide 1.152. Use of this revision by the licensees of currently operating nuclear power plants is entirely optional and voluntary.

BACKFIT ANALYSIS

As described in 10 CFR 50.109(c), this draft revision of Regulatory Guide 1.152 does not require a backfit analysis because the use of this revision by the licensees of currently operating nuclear power plants is entirely voluntary.
