IP 69020 Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design at Non-Power Production and Utilization Facilities
ML24264A205
Issue date: 03/25/2025
From: Jared Nadel, NRC/NRR/DANU/UNPO
References: CN 25-005

Issue Date: 03/25/25 1

NRC INSPECTION MANUAL                                                DANU

INSPECTION PROCEDURE 69020 APPENDIX M

INSPECTION OF DIGITAL INSTRUMENTATION AND CONTROL (DI&C) SYSTEM/SOFTWARE DESIGN AT NON-POWER PRODUCTION AND UTILIZATION FACILITIES

Effective Date: March 25, 2025

PROGRAM APPLICABILITY: IMC 2550

69020.M-01 INSPECTION OBJECTIVES

01.01 To determine if work and related activities associated with safety-related digital instrumentation and control (DI&C) system/software at non-power production and utilization facilities (NPUFs) are being performed in accordance with regulatory requirements, the licensing basis, specifications, drawings, and work procedures.

01.02 To determine if the applicant/licensee's system for preparing, reviewing, and maintaining records relative to safety-related DI&C system/software is functioning properly and to determine if the records reflect work accomplishment consistent with specifications and procedures.

01.03 To verify the as-built condition of safety-related DI&C system/software meets the specified design requirements, specifications, and drawings.

01.04 To determine that the implementation of the quality assurance program (QAP) related to work activities for safety-related DI&C system/software is effective and to verify that deviations from requirements are appropriately resolved.

69020.M-02 INSPECTION REQUIREMENTS 02.01 For the safety-related DI&C system/software activities selected for inspection, determine if appropriate and adequate procedures in the following areas are compatible with the QAP and prescribe adequate methods to meet the specifications:

a. Planning Phase
b. Requirements Phase
c. Design & Implementation Phase
d. Integration Phase
e. Validation and Test Phase
f. Installation Phase

02.02 Determine if the applicant/licensee has an established audit program (including plans, procedures, and audit schedule) for assessing the adequacy of work control functions and requirements for DI&C system/software activities. Determine if examination, inspection, and test personnel associated with performing tests and inspections of DI&C system/software activities are qualified and/or certified to perform their assigned work.

02.03 Determine if the following safety-related DI&C system/software activities are being controlled and accomplished in accordance with the requirements of the documents reviewed in Section 02.01, above:

a. Planning Phase
b. Requirements Phase
c. Design & Implementation Phase
d. Integration Phase
e. Validation and Test Phase
f. Installation Phase

02.04 Review the documentation generated for the safety-related DI&C system/software activities. Determine if the applicant/licensee/contractor system for documenting safety-related work is functioning properly. Records should be complete; reviewed by quality control, engineering personnel, or a designee; and readily retrievable.

a. Planning Phase
b. Requirements Phase
c. Design & Implementation Phase
d. Integration Phase
e. Validation and Test Phase
f. Installation Phase

g. nonconformance/deviation record(s)
h. training/qualification records of craft and quality inspection personnel (as required)
i. configuration management records

69020.M-03 INSPECTION GUIDANCE

General Guidance

Inspectors should review the facility description in the safety analysis report (SAR) or equivalent and be familiar with the safety-related DI&C system/software activities at the site. The purpose of these as-built inspections is to verify that the assumptions and critical attributes reviewed during the licensing review process remain valid; that the design was appropriately translated into construction specifications; that the licensee/applicant constructed the facility in accordance with these specifications; and that any changes made to the design described in the SAR comply with the licensee's configuration management program.

Inspectors should also be familiar with the licensee's QAP and use IP 69021, Inspections of Quality Assurance Program Implementation During Construction of Non-Power Production and Utilization Facilities, to perform vertical slice inspections as described in the body of this IP.

Inspectors should complete this appendix by inspecting the attributes it lists for DI&C system/software activities and the associated attachment(s), with a focus on safety-related items (and services).

Inspectors should contact the applicant/licensee prior to the on-site inspection to help determine which DI&C system/software activities are to be inspected. Observation of in-progress construction of the DI&C system/software activities is desirable but not required. If necessary, inspectors may select completed DI&C system/software activities for inspection. Inspectors should not attempt to inspect all of the DI&C system/software activities associated with the facility but may expand the inspection if significant concerns with the applicant/licensee's control of DI&C system/software activities arise.

Inspectors should collect applicant/licensee procedures and completion records in advance, if possible. If unable to review these documents in advance of the on-site inspection, the inspector should notify the licensee that these documents, and any other relevant documents, should be available when the inspector(s) arrive at the site.

Before review in this area, applicable portions of the license, amendment, or application should be reviewed to determine licensee commitments relative to construction and inspection requirements. Inspectors should determine which versions of the industry codes and standards the licensee has committed to in the license.

Inspectors should choose one or more safety-related DI&C system/software activities and review the areas listed in 02.01 through 02.04 to the extent practical, using their judgment to determine which areas to concentrate on if time is limited. However, inspectors should gain an understanding of the applicant/licensee's program to the extent necessary to determine if the licensee conforms to regulatory requirements. Not all items in the inspection requirements section will be applicable or required in all situations for all safety-related structures, systems, and components.

General Guidance - Inspection Planning and Preparation for DI&C System/Software Inspection

The actual planning and scheduling of the DI&C inspections depends on the licensee's design development schedule and associated milestones. The guidance contained herein is intended to mirror a typical development life cycle. Inspections should not be planned until the completion of life cycle phases by the licensee can be anticipated and expected completion dates can be confirmed.

The development of safety-related DI&C systems and software should progress in accordance with a formally defined life cycle. Although life cycle activities may differ between licensees, all share certain characteristics. The staff's inspection and acceptance of digital safety system and software functions is based upon:

a. confirmation that acceptable plans were prepared to control software development activities;
b. evidence that the plans were implemented in the software development life cycles; and
c. evidence that the process produced acceptable design outputs.

Generic inspection attributes and criteria for each DI&C software life cycle phase are provided within Attachments 1 through 6 of this Appendix. Not all DI&C life cycle phases may be inspected, because some may not apply to each licensee's development program/process. The goal of this inspection activity is to examine the governing documents and samples of activities that demonstrate the implementation of these documents, in order to provide a comprehensive inspection of the licensee's DI&C development process as delineated in the licensing basis.


a. Gather pertinent information and consider the following inspection planning and scheduling issues:
1. importance/prioritization of activities
2. concurrent inspections to be conducted using other IPs
3. status and disposition of previous NRC findings
4. licensee-documented responses to applicable Generic Letters, Bulletins, Regulatory Issue Summaries, and Information Notices
5. commitments made in the COL pertaining to digital system/software development activities
6. technical attributes that should be the focus of the inspection
b. Contact the licensee for information needed to prepare the inspection plan, for example:
1. status of DI&C development activities, planned activities and schedule (used to focus inspection and determine inspection sample)
2. identification of individuals assigned key positions and functions described by the licensee's Software Quality Assurance (QA) and Verification and Validation (V&V) program
3. availability of licensee personnel during the period tentatively scheduled for the inspection
4. changes to applicable Software QA or V&V program since any previous NRC licensing and/or inspection activity (e.g., policy, personnel, program description, implementing documents)
c. Conduct the inspection in accordance with this Appendix and its associated attachments. The inspection should focus on safety-related requirements of the digital I&C systems, including redundancy, independence between safety-significant and non-safety-related digital systems, independence of data communications, deterministic performance of trip and actuation functions, design simplicity (i.e., no unneeded features implemented in safety systems), etc.

03.01 Inspection Requirement 02.01

a. Review construction specifications related to safety-related DI&C system/software activities and determine if the specified technical requirements conform to the commitments contained in the licensing basis. Review DI&C system/software procedures and verify that they specify provisions for adequate on-site engineering direction, are appropriate and adequate with respect to procurement and use of materials, specify adequate control of hold points, and provide adequate controls for design changes and incorporation of design changes into as-built drawings.
b. Review the IPs and compare them with the requirements in the applicable codes and construction specifications. The evaluation should indicate whether adequate quality-related IPs are established and based on appropriate criteria and, further, whether the results of the licensee's inspection will be transmitted to responsible quality assurance and management personnel.

c. Procedures should be reviewed to ensure that technical requirements in the licensing document are reflected in construction specifications, drawings, work instructions, and work procedures. Verify that specifications, drawings, work instructions and IPs have been established that will assure the technical adequacy of the following activities pertaining to DI&C system/software. Verify that these documents comply with licensee commitments. Areas to review should include the following:
1. Planning Phase. See Attachment 1, Inspection Guide for DI&C System/Software Life Cycle - Planning Phase.

2. Requirements Phase. See Attachment 2, Inspection Guide for DI&C System/Software Life Cycle - Requirements Phase.

3. Design & Implementation Phase. See Attachment 3, Inspection Guide for DI&C System/Software Life Cycle - Design & Implementation Phase.

4. Integration Phase. See Attachment 4, Inspection Guide for DI&C System/Software Life Cycle - Integration Phase.

5. Validation and Test Phase. See Attachment 5, Inspection Guide for DI&C System/Software Life Cycle - Validation & Test Phase.

6. Installation Phase. See Attachment 6, Inspection Guide for DI&C System/Software Life Cycle - Installation Phase.

03.02 Inspection Requirement 02.02

a. Review the applicant/licensee's established audit program (including plans, procedures, and audit schedule) for assessing the adequacy of work control functions and requirements for safety-related DI&C system/software construction activities.
b. Review the audit program to verify that examination, inspection, and test personnel associated with performing tests and inspections of DI&C system/software construction activities are qualified and/or certified to perform their assigned work.
c. Verify that records establish that the required audits were performed and that deficiencies identified during audits were appropriately resolved.

03.03 Inspection Requirement 02.03

Determine if the following applicable safety-related DI&C system/software activities are being controlled and accomplished in accordance with the requirements of the documents reviewed in Section 02.01, above:

1. Planning Phase. See Attachment 1, Inspection Guide for DI&C System/Software Life Cycle - Planning Phase.

2. Requirements Phase. See Attachment 2, Inspection Guide for DI&C System/Software Life Cycle - Requirements Phase.

3. Design & Implementation Phase. See Attachment 3, Inspection Guide for DI&C System/Software Life Cycle - Design & Implementation Phase.

4. Integration Phase. See Attachment 4, Inspection Guide for DI&C System/Software Life Cycle - Integration Phase.

5. Validation and Test Phase. See Attachment 5, Inspection Guide for DI&C System/Software Life Cycle - Validation & Test Phase.

6. Installation Phase. See Attachment 6, Inspection Guide for DI&C System/Software Life Cycle - Installation Phase.

7. Configuration management. For the activities observed under Inspection Requirement 02.03 (in-process installation, completed work, as-built verification, and testing), verify that, if changes occurred during these construction activities, the applicant/licensee properly controlled and documented the changes for engineering review, approval, and subsequent incorporation into the final as-built drawings. Verify that these actions were completed in accordance with the applicant/licensee's procedures and QAP.

03.04 Inspection Requirement 02.04

Determine if the applicant's, licensee's, or contractor's system for documenting safety-related DI&C system/software work is functioning properly. Review the documentation generated for DI&C system/software activities. Record-keeping should reflect the actual conditions encountered in the field and provide adequate documentation of work and inspections. Determine if records are being maintained, reviewed, and approved, as specified. Records should include sufficient detail to document work associated with DI&C system/software; be complete; be reviewed by quality control inspectors and/or engineering personnel; and be readily retrievable. Review a sample of the following records:

1. Planning Phase. See Attachment 1, Inspection Guide for DI&C System/Software Life Cycle - Planning Phase.

2. Requirements Phase. See Attachment 2, Inspection Guide for DI&C System/Software Life Cycle - Requirements Phase.

3. Design & Implementation Phase. See Attachment 3, Inspection Guide for DI&C System/Software Life Cycle - Design & Implementation Phase.

4. Integration Phase. See Attachment 4, Inspection Guide for DI&C System/Software Life Cycle - Integration Phase.

5. Validation and Test Phase. See Attachment 5, Inspection Guide for DI&C System/Software Life Cycle - Validation & Test Phase.

6. Installation Phase. See Attachment 6, Inspection Guide for DI&C System/Software Life Cycle - Installation Phase.

7. Nonconformance/Deviation Records

(a) Records include the current status of these items. Nonconformance reports include the status of corrective action or resolution (e.g., determine if adequate corrective action is being taken when test results are not within tolerance or acceptance criteria).

(b) The sample size and diversification of selection should be sufficient to determine if the system used to handle and control nonconformance issues is working in an effective manner. The effectiveness of the QAP in this area can be determined, in part, by how adequately and promptly the root causes of nonconforming activities are identified and corrected.

(c) For the inspection, review and evaluate a sampling of reports applicable to nonconformances or deviations in DI&C system/software installation, testing, and performance. Determine if:

(1) Records are complete and promptly reviewed by qualified personnel.

(2) Records have been routinely processed, evaluated in a timely manner, and controlled through established channels for resolution of the root cause as well as the immediate problem.

(3) Records are properly identified and stored, indicate current status, and can be retrieved in a reasonable time.

(4) Nonconformance reports include the status of corrective action or resolution, and adequate justification is provided for use-as-is disposition.

8. Training/Qualification Records of Craft and Quality Inspection Personnel. Determine if records establish that quality inspection personnel are adequately qualified for their assigned duties and responsibilities and that craft personnel have been trained in their assigned tasks.

Verify that a program has been established for ensuring that craft, examination, and inspection personnel associated with safety-related DI&C system/software activities are trained and qualified to perform their assigned duties. Verify that the program includes:

(a) The proper use of installation DI&C system/software.

(b) The proper handling, supporting, and protection of electric systems and components stored in place.

(c) The system of craft and inspection personnel qualification records meets stated requirements and is being maintained in a current status.

(d) The records are sufficient to reasonably support qualification in terms of certification, experience, proficiency, training, testing, etc., as applicable.

(e) Responsible licensee/contractor organizations have acted to independently authenticate the record material.

9. Configuration Management Records. Review and evaluate a selected sample of configuration management records, and determine if:

(a) Records associated with design and field changes, as well as related work and IP changes, reflect that timely review and evaluation of design and field change documents have been performed by qualified personnel.

(b) Records of periodic inspections ensure that only the most recent approved documents, including design changes, were used in the field.

(c) Design changes are subject to adequate design control, including consideration of the impact of the change on the overall design and on as-built records.

(d) Records of nonconformances to design requirements include preparation of a nonconformance report even if the nonconformance is resolved through the design-change process.

69020.M-04 RESOURCE ESTIMATE

The inspection effort for the appendices, or sections of the appendices, and the inspection samples applicable to a specific facility should be in the range of 80-160 hours. Inspection preparation, including review of the licensing basis, safety analysis report (SAR), and applicable codes and standards, is not included in this estimate.

69020.M-05 PROCEDURE COMPLETION

This inspection procedure appendix is complete when the applicable attachments to this appendix have been inspected for DI&C system/software activities. Refer to Section 69020-05, Procedure Completion, of IP 69020, Inspection of Safety Related Items (and Services) During Construction of Non-Power Production and Utilization Facilities, for details on what constitutes a completed inspection sample. Inspectors are not expected to complete every activity in the appendices of this IP. Instead, inspectors should prioritize inspection activities based on: 1) the importance of the activity to safety, 2) the availability of the on-site activity at the time of the inspection, and 3) available inspection resources. An appendix to this IP need not be completed if there are no safety-related items (or services) covered by that appendix at an NPUF.

69020.M-06 REFERENCES

Refer to licensing basis requirements for applicable codes and standards for each facility.

IEEE Std. 730, "IEEE Standard for Software Quality Assurance Plans"
IEEE Std. 828, "IEEE Standard for Software Configuration Management Plans"
IEEE Std. 829, "IEEE Standard for Software Test Documentation"
IEEE Std. 830, "IEEE Recommended Practice for Software Requirements Specifications"
IEEE Std. 1008, "IEEE Standard for Software Unit Testing"
IEEE Std. 1012, "IEEE Standard for Software Verification and Validation Plans"
IEEE Std. 1028, "IEEE Standard for Software Reviews and Audits"
IEEE Std. 1074, "IEEE Standard for Developing Software Life Cycle Processes"
IEEE Std. 1228, "IEEE Standard for Software Safety Plans"
IP 88200, Inspection of Instrumentation and Control Systems at Fuel Cycle Facilities, Appendix J
IP 52003, Digital Instrumentation and Control Modification Inspection

END

Attachments:
Attachment 1: Inspection Guide for DI&C System/Software Life Cycle - Planning Phase
Attachment 2: Inspection Guide for DI&C System/Software Life Cycle - Requirements Phase
Attachment 3: Inspection Guide for DI&C System/Software Life Cycle - Design & Implementation Phase
Attachment 4: Inspection Guide for DI&C System/Software Life Cycle - Integration Phase
Attachment 5: Inspection Guide for DI&C System/Software Life Cycle - Validation & Test Phase
Attachment 6: Inspection Guide for DI&C System/Software Life Cycle - Installation Phase
Attachment 7: Revision History for IP 69020 Appendix M

Attachment 1: Inspection Guide for DI&C System/Software Life Cycle - Planning Phase

69020.M.A1-01 INSPECTION OBJECTIVES

01.01 Appendix Inspection Objectives: As listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

01.02 Attachment Inspection Objectives: Verify that the licensee's DI&C development process Planning Phase documents are consistent with design commitments.

69020.M.A1-02 INSPECTION REQUIREMENTS

Appendix inspection requirements are listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A1-03 INSPECTION GUIDANCE

General Guidance

A digital system/software development life cycle provides definition for a deliberate, disciplined, and quality development process. Implementation of this process should result in a quality DI&C system and supporting software. Verification of this process should confirm, by evaluation against applicable standards and criteria, that the licensee and vendor procedures and plans are sufficient to accomplish this goal.

The Planning Phase activities will provide documents that will be used to oversee the DI&C development project as it progresses from one Life Cycle Phase to the next. As committed to in the licensing basis, compliance with RG 1.173 and IEEE Std. 1074, Developing Software Life Cycle Processes, means that mandatory activities are performed, requirements designated as "shall" are met, and all inputs, outputs, activities, and pre- and post-conditions mentioned by IEEE Std. 1074 are accounted for in the licensee's/applicant's life cycle model.

The documents resulting from the Planning Phase include the following minimum set; additional documents may be required by the development organization as part of its standard business procedures. Note that software life cycle Plans for Operations, Maintenance, and Training are not included; these elements are not considered part of the digital I&C DAC envelope and can be covered through DI&C system as-built inspection.

Software Management Plan (SMP)

Software Quality Assurance Plan (SQAP)

Software Configuration Management Plan (SCMP)

Software Verification and Validation Plan (SVVP)

Software Safety Plan (SSP)

Software Development Plan (SDP)

Software Integration Plan (SIntP)

Software Installation Plan (SInstP)

Software Test Plan (STP)

Generally, these Planning documents include management characteristics, implementation characteristics, and resource characteristics. Not all specific characteristics occur for every Plan. Management characteristics for each Plan should include a stated Purpose, identify Organizational and Oversight responsibilities, and account for risk and security management.

Implementation characteristics should include Process Metrics as well as guidance on Procedure Control and Recordkeeping. Resource characteristics should include details of Special Tools utilized in the development process, Personnel resources and qualification, and the Standards used to meet regulatory requirements. Inspection should focus on those aspects of the Plans which can impact the safety and quality of the resulting DI&C system/software.

The inspectable attributes identified in this attachment were compiled from many of the references listed in this procedure. This inspection procedure verifies commitments made in the licensing basis.

Sample Size

Inspection of DI&C will typically rely on selection of a sample of attributes for verification. Given the importance of the various Life Cycle Plans in defining and detailing the quality design and development process expected for safety-related DI&C systems/software, inspection of a larger representative sample of attributes associated with each of the Planning Phase documents is appropriate. The initial sample size is left to the inspector's discretion based on review of inspection source documentation and may be modified based on inspection issues and findings identified.

Specific Guidance

In addition to the guidance below, refer to Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design, for additional specific inspection guidance.

a. Inspection of Software Management Plan (SMP)
1. Verify that the SMP addresses the following specific management aspects of the software development project, as committed to in the licensing basis:

(a) Organizational structure is defined. Responsibilities are known and documented, and a management structure exists to keep the SMP up to date through a configuration control process.

(b) Oversight of vendors. The SMP should describe the interaction between the licensee and system/software vendors, the extension of quality assurance (QA) requirements to vendors, and what checks and audits the licensee will perform and their impact.

(c) Independence between the software development group and the QA group, system/software safety group, and verification and validation (V&V) group. If independence aspects are described in the planning documents of these organizations, such as the V&V Plan, Safety Plan, or QA Plan, the SMP should provide a pointer to those plans.

(d) Personnel responsible for various items have the experience, training, and qualifications to perform those duties.

2. Verify that the SMP includes the following key attributes, as committed to in the licensing basis:

(a) Project schedule includes time allotted for review (management, V&V, etc.) and audit.

(b) Project work products and deliverables are adequately defined.

(c) Responsibilities are documented and communicated to the development organization.

(d) Project constraints that may have an impact on safety are identified.

(e) Known risk factors are identified.

(f) Required reports and technical documents are identified.

(g) Training requirements are known and documented.

(h) Internal review and audit processes are identified.

b. Inspection of Software Quality Assurance Plan (SQAP)
1. Many aspects of software quality are described in the various life cycle Plans. These include the Configuration Management Plan, the Software Safety Plan, and the Software Verification and Validation Plan.

The SQAP should comply with the requirements of 10 CFR Part 50, Appendix B, and the licensee's approved QA program. The SQAP should typically: 1) identify which QA procedures are applicable to specific software processes; 2) identify particular methods chosen to implement QA procedural requirements; and 3) augment and supplement the QA program as needed for software. Verify that the SQAP addresses the following, as committed to in the licensing basis:

(a) management tasks
(b) documentation
(c) recordkeeping
(d) standards, practices, conventions
(e) reviews and audits
(f) problem reporting and corrective action
(g) control of tools, techniques, and methodologies
(h) supplier (vendor) control
(i) version control
(j) audit trails

2. Verify that the SQAP includes the following key attributes, as committed to in the licensing basis:

(a) SQAP specifies which software products are covered by the Plan.

(b) Project elements (organizations) that interact with the QA organization are listed.

(c) The organization engaged in software QA activities is independent of the development organization, including cost and schedule.

(d) Life Cycle development phases that will be subject to QA oversight are listed.

(e) Required QA tasks are listed and described.

(f) Conflict resolution among organizations is described.

(g) Required software documents are listed.

(h) Required reviews and audits are listed.

(i) The method by which each review and audit will be carried out is described.

(j) SQAP includes provisions to assure that problems will be documented and corrected.

c. Inspection of Software Configuration Management Plan (SCMP)
1. Verify that the SCMP addresses the following specific activities, as committed to in the licensing basis:

(a) Production/development baselines are identified and established.

(b) Review, approval, and control of changes is defined.

(c) Tracking and reporting of changes is defined.

(d) Audits and reviews of the evolving products are established.

(e) Control of interface documentation is defined.

2. Verify that the SCMP includes the following key attributes, as committed to in the licensing basis:

(a) Product interfaces that have to be supported within the project are identified.

(b) The required capabilities of the staff needed to perform SCM activities are defined.

(c) The responsibilities for processing baseline changes are defined.

(d) The SCMP specifies who is responsible for each SCM activity.

(e) The organizational interfaces that affect the SCM process are identified.

(f) SCM activities that will be coordinated with other project activities are described.

(g) The SCMP describes how phase-specific SCM activities will be managed during the different life cycle phases.

(h) Specific procedures exist to manage the change process.

(i) Audit procedures are defined.

(j) The configuration identification scheme matches the structure of the software product.

(k) The SCMP specifies which items will be placed under configuration control (configuration items (CIs)).

(l) The SCMP describes the authority of the Configuration Control Board (CCB).

(m) CCB authority is sufficient to control safety-related changes to the CI baseline.

(n) The SCMP requires the CCB to assess the safety impact of change requests.

(o) Provisions are included for auditing the SCM process.

(p) The SCMP provides for periodic reviews and audits of the configuration baseline, including physical audits of the baseline.

(q) The SCMP provides for audits of suppliers and subcontractors, if such are used.

(r) The SCMP accounts for all assets, including backup and recovery software.

d. Inspection of Software Verification & Validation Plan (SVVP)
1. Verify that the SVVP addresses the following specific activities, as committed to in the licensing basis:

(a) Management of Life Cycle V&V. The major portion of the V&V Plan will describe the methods by which V&V will be carried out through the life of the development project. In general, the following activities should be required for each phase of the life cycle:

(1) Identify the V&V tasks for the life cycle phase.
(2) Identify the methods that will be used to perform each task.
(3) Specify the source and form for each input item required for each task.
(4) Specify the purpose, target, and form for each output item required for each task.
(5) Specify the schedule for each V&V task.
(6) Identify the resources required for each task.
(7) Identify the risks and assumptions associated with each V&V task.
(8) Identify the organizations or individuals responsible for performing each V&V task.

(b) Requirements Phase V&V. The V&V Plan should describe how the various V&V tasks will be carried out for the following:

(1) Software Requirements Traceability Analysis
(2) Software Requirements Evaluation (Report)
(3) Software Requirements Interface Analysis
(4) System Test Plan Generation
(5) Acceptance Test Plan Generation

(c) Design & Implementation Phase V&V. The V&V Plan should describe how the various V&V tasks will be carried out for the following:

(1) Software Design Traceability Analysis
(2) Software Design Evaluation (Report)
(3) Software Design Interface Analysis
(4) Test Plan Generation
(5) Source Code Traceability Analysis
(6) Source Code Evaluation
(7) Source Code Interface Analysis
(8) Source Code Documentation Analysis

(d) Integration Phase V&V. The V&V Plan should describe how the various V&V tasks will be carried out for the following:

(1) Integration Test Procedure Generation
(2) Integration Test Procedure Execution

(e) Validation & Test Phase V&V. The V&V Plan should describe how the various V&V tasks will be carried out for the following:

(1) Acceptance Test Procedure Generation
(2) System Test Procedure Execution
(3) Acceptance Test Procedure Execution

(f) Installation Phase V&V. The V&V Plan should describe how the various V&V tasks will be carried out for the following:

(1) Installation Configuration Audit
(2) Final V&V Report Generation

2. Verify that the SVVP includes the following key attributes, as committed to in the licensing basis:

(a) SVVP references the SMP and/or SQAP.
(b) Specific elements of the higher-level plans are addressed in the SVVP.
(c) Scope of the V&V effort is defined.
(d) SVVP describes the V&V organization, including its relationship to the development organization (should be independent).
(e) Schedule is defined that provides enough time for V&V activities to be carried out. The V&V organization should have freedom from scheduling constraints or impact.
(f) Tools, techniques, and methods to be used in the V&V process are defined. SVVP identifies the method of handling anomalies encountered during each activity.
(g) V&V schedule and resource requirements are described in detail.
(h) SVVP identifies the responsibilities, line organization, and reporting requirements for the V&V organization.
(i) SVVP defines a procedure for management review of the V&V process.
(j) A process is developed for periodic assessment and updating of V&V procedures and tools.
(k) A procedure is defined for correlating V&V results with management and technical review documents.
(l) SVVP is coordinated with project planning documents to ensure early availability of the planning documents for the V&V effort.
(m) SVVP explicitly defines and describes the following V&V tasks, as a minimum:

(1) audits
(2) regression testing and analysis
(3) security assessment
(4) testing evaluation
(5) documentation evaluation

3. Verify that the SVVP includes the following life cycle phase-specific attributes, as committed to in the licensing basis:
(a) Requirements Activities

(1) Concept documentation, software requirements specification (SRS), interface requirements, hazards analysis, and user documentation will be complete prior to beginning the V&V requirements analysis.
(2) SVVP explicitly defines the activities required during the requirements analysis.
(3) SVVP requires the performance of a software requirements traceability analysis.
(4) SVVP requires that the SRS be evaluated for performance issues.
(5) SVVP requires that a system test plan and an acceptance test plan be generated during the requirements phase.

(b) Design & Implementation Activities

(1) SVVP requires the generation and dissemination of anomaly reports.
(2) SVVP explicitly defines the activities required during the design and implementation phase.
(3) SVVP requires the performance of a design traceability analysis that traces elements of the detailed design and coding to elements of the software requirements.
(4) SVVP requires a design evaluation (report).
(5) SVVP requires a design interface analysis.
(6) SVVP requires that the software design document be evaluated against hardware requirements, operator requirements, and software interface requirements documentation.
(7) SVVP requires that a software component test plan, an integration test plan, and a test design be generated for use in later testing.
(8) SVVP requires that the source code be evaluated for correctness, consistency, completeness, accuracy, readability, safety, and testability.
(9) SVVP requires generation and use of test cases to help ensure the adequacy of test coverage.
(10) SVVP requires the generation of test cases for software component, integration, system, and acceptance testing.

(c) Integration, Validation & Test, and Installation Activities

(1) SVVP explicitly defines the activities required during the integration and validation analysis and testing.
(2) SVVP requires testing requirements that are sufficiently detailed to ensure a very low probability of error during operation, and describes the consequences if errors were to occur.
(3) SVVP explicitly defines the activities required during the installation analysis and testing.
(4) SVVP requires the performance of an installation configuration audit.
(5) SVVP requires the generation of a final report.

f. Inspection of Software Safety Plan (SSP)

1. Verify, consistent with commitments in the licensing basis, that the SSP addresses the following documentation that will be required as part of the software safety program:

(a) Results of all safety analyses
(b) Information on suspected or verified safety problems
(c) Results of audits performed on software safety program activity
(d) Results of safety tests carried out on the software system
(e) Records of training provided to software safety personnel and software development personnel

2. Verify that the SSP includes the following key attributes, as committed to in the licensing basis:

(a) Software safety organization is described and authority defined; authority sufficient to enforce compliance with safety requirements and practices.
(b) SSP provides a mechanism for defining safety requirements, performing software safety analysis tasks, and testing safety-critical features of the DI&C system.
(c) SSP describes what safety-related documents will be produced during the development life cycle; contents sufficient to ensure that known safety concerns are addressed in the appropriate places within the development life cycle.
(d) SSP identifies the safety-related records that will be generated, maintained, and preserved.
(e) SSP specifies the process of approving and controlling software tool use.
(f) SSP provides a means to ensure that safety-critical software developed by a subcontractor meets the requirements of the software safety program.

g. Inspection of Software Development Plan (SDP)
1. Verify, consistent with commitments in the licensing basis, the following:

(a) SDP defines the tasks that are a part of each life cycle phase.
(b) SDP defines life cycle phase inputs and outputs, including review, verification, and validation of those outputs.
(c) SDP lists the international, national, industry, and company standards and guidelines, including regulatory guides, that will be followed, and whether or not these standards and guidelines have previously been approved by the NRC staff.

2. Verify that the SDP includes the following key attributes, as committed to in the licensing basis:

(a) Technical standards that will be followed are listed.
(b) Technical milestones are listed.
(c) Milestones are consistent with the schedule provided in the SMP.
(d) Technical documents that should be produced are listed.
(e) Technical documents are consistent with those listed in the SMP.
(f) Milestones, baselines, reviews, and signoffs are listed for each document.
(g) Audit reports document that the SDP is being followed.

h. Inspection of Software Integration Plan (SIntP)
1. Verify, consistent with commitments in the licensing basis, the following:

(a) SIntP describes the general strategy for integrating the software modules together into one or more programs.
(b) SIntP integration strategy includes integrating the various software modules together to form single programs.
(c) SIntP integration strategy includes integrating the software with the hardware and instrumentation, and testing the resulting integrated product.
(d) SIntP integration strategy includes regression testing.

2. Verify that the SIntP includes the following key attributes, as committed to in the licensing basis:

(a) SIntP specifies the levels of integration required.
(b) SIntP is consistent with the software design specification.
(c) SIntP describes each step of the integration process.
(d) SIntP describes the environment that will be used to perform and test each integration step.
(e) Software and hardware tools that will be used to integrate the computer system are listed.
(f) SIntP includes instructions on how to carry out integration steps.
(g) SIntP includes a contingency plan in case the integration fails.
(h) SIntP includes a requirement for configuration control of the completed product.

i. Inspection of Software Installation Plan (SInstP)

1. Verify that the SInstP includes the following key attributes, as committed to in the licensing basis:

(a) General procedures for installing the software product are described.
(b) Materials required are listed in an Installation Package.
(c) Complete step-by-step procedures exist for installation in the operational environment.
(d) Expected results from each installation step are described.
(e) Known installation error conditions and recovery procedures are described.
(f) Installation Plan is fully tested.

2. Verify that the SInstP exhibits the following characteristics, as committed to in the licensing basis:

(a) Includes a description of the System operating environment.
(b) Includes a description of the organization responsible for installation, their responsibilities, and interfaces with other organizations.
(c) Includes acceptance criteria to determine success/failure of the installation effort.
(d) Includes a description of procedures for System, combined hardware/software, and software installation.
(e) Includes requirements for functional checks of the installation, including checks for correct version of the application software and source code on the correct hardware (platform).
(f) Includes requirement for adequate safety function testing.
(g) Includes requirement that installation tools be qualified to a degree commensurate with the safety significance of the installed software.

j. Inspection of Software Test Plan (STP)

1. Note - Attributes of the STP may also be verified as part of the Validation and Test Phase activities in Attachment 5. Verify that the STP includes the following key attributes, as committed to in the licensing basis:

(a) General testing for the software, including unit, integration, factory and site acceptance, and installation testing, is described.
(b) Testing responsibilities have been assigned to appropriate personnel groups (test group and V&V group).
(c) Provisions are included for re-test in the event of test failures.
(d) Provisions are included for full testing after any software modifications.
(e) Provisions are included for the V&V group to have oversight of final System (hardware and software) testing.

2. Verify that the STP exhibits the following characteristics, as committed to in the licensing basis:

(a) Testing organization and interfaces are described.
(b) Scope of testing activities is described.
(c) Test scope includes secure development and operational environment (SDOE) testing strategy.
(d) Acceptance criteria are included to determine the success or failure of testing activities.
(e) Procedures for testing each software item are identified.
(f) Methods for synchronizing procedures and test cases with the software design are described.
(g) Procedures for tracking problem reports and their resolution are described.
(h) Software test record keeping requirements are described.

69020.M.A1-04 RESOURCE ESTIMATE

Appendix resource estimate listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A1-05 PROCEDURE COMPLETION

Appendix procedure completion listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A1-06 REFERENCES

Appendix references listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

END

Attachment 2: Inspection Guide for DI&C System/Software Life Cycle - Requirements Phase

69020.M.A2-01 INSPECTION OBJECTIVES

01.01 Appendix inspection objectives listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

01.02 Attachment Inspection Objectives: Verify that the licensee's DI&C development Requirements Phase process and documentation are consistent with design commitments.

69020.M.A2-02 INSPECTION REQUIREMENTS

Appendix inspection requirements listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A2-03 INSPECTION GUIDANCE

General Guidance

The scope and activities associated with the Requirements Phase result in a complete description of what the Digital I&C System (hardware and software) should accomplish and establish functional requirements for the system software. Note: any reference to the Digital I&C System implies both the hardware and software elements. Activities include:

- translation of functional and regulatory requirements to DI&C system(s) requirements
- define and document I&C System (hardware and software) requirements
- define, document, prioritize, and integrate software requirements
- define and document software interface and performance requirements
- Requirements Safety Analysis
- Requirements Verification

The product of these activities will be documented in a Software Requirements Specification (SRS). Inspection will verify that System and Software Requirements were developed in accordance with applicable codes, standards, and regulations. Inspection should focus on critical requirements essential for robust I&C system (hardware and software) design, including Redundancy, Independence (both physical and data communication), Deterministic performance, and design Simplicity. The SRS is an essential part of the record of the design of safety-related system software and is a design input to the remainder of the software development process. Software requirements serve as the design bases for the software to be developed.

Sample Size

Inspection of DI&C will typically rely on selection of a sample of attributes for verification.

Inspection of Requirements Phase activities will be accomplished through verification of selected attributes for a representative sample of critical system/software requirements (typically 10-20 percent of critical requirements). Traceability (through thread audit techniques) should be verified for the same representative sample of critical requirements. Sample size should be expanded at the inspector's discretion if inspection issues or findings are identified.

Specific Guidance

In addition to the guidance below, refer to Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design, for additional specific inspection guidance.

a. Digital I&C (DI&C) System Requirements
1. DI&C System requirements will form the basis for the DI&C System design specification. Verify the following attributes for the DI&C system, as committed to in the licensing basis:

(a) Each DI&C System requirement is separately identified.
(b) DI&C System requirements satisfy applicable codes, standards, and regulatory requirements.
(c) DI&C System requirements are correctly derived from originating or source (facility) requirements (for functionality and performance) and safety requirements (from the facility Safety Analysis).
(d) Critical DI&C System safety considerations are identified.
(e) Critical DI&C System physical and SDOE considerations are identified.
(f) Critical DI&C System boundary value testing requirements are identified.
(g) DI&C System Source Requirements Lists (or equivalent index) are developed and requirements traceability is defined/documented.

Note - Requirements Traceability Matrix (or equivalent tracking/traceability tool) should show, as a minimum, each requirement, the source of the requirement, the life cycle phases that are utilized by the development project, the testing to be applied to each requirement, and the associated requirement item identification. This will allow the matrix to be reviewed in order to assure that each requirement is addressed by the output products of each phase. The matrix will allow life cycle phase-to-phase and end-to-end review. The inspector should be able to trace requirements through the development life cycle, forward and backward. The traceability matrix can also be used to assist in impact assessment as requirements change, and provides documentation that safety requirements and licensing commitments are met.

2. Verify traceability of functional, regulatory and design basis (source) requirements to DI&C system requirements.

(a) Using the Traceability Matrix (or equivalent traceability tool), select a representative sample of DI&C system requirements and trace them backwards to the source requirements from which they were derived.
(b) Using the Traceability Matrix (or equivalent traceability tool), select a sample of source requirements and trace them forward to verify that the DI&C system requirements were correctly derived. Where possible, select source requirements that provide more than one system requirement (output), and trace the outputs accordingly.
(c) Verify that forward traceability also includes an output to a test and validation process. Test and validation will ultimately confirm that requirements have been met.

Note - tracing activities are designed to identify any source requirements not adequately addressed by the System requirements, System requirements not meeting the intent of source requirements, and any missing System requirements.
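The backward and forward tracing steps above can also be sketched programmatically. The following is a minimal illustration only, using an invented matrix structure and invented requirement identifiers (they do not come from any licensing basis or actual tool):

```python
# Hypothetical traceability matrix: each DI&C system requirement records the
# source (facility) requirements it was derived from and its assigned tests.
# All identifiers are invented for illustration.
trace_matrix = {
    "SYS-001": {"sources": ["FAC-010"], "tests": ["ST-001"]},
    "SYS-002": {"sources": ["FAC-010", "FAC-011"], "tests": ["ST-002"]},
    "SYS-003": {"sources": ["FAC-012"], "tests": []},  # no test yet
}
source_requirements = {"FAC-010", "FAC-011", "FAC-012", "FAC-013"}

# Backward trace: every system requirement must cite known source requirements.
for req_id, links in trace_matrix.items():
    unknown = [s for s in links["sources"] if s not in source_requirements]
    assert not unknown, f"{req_id} cites unknown source(s): {unknown}"

# Forward trace: a source requirement with no derived system requirement
# indicates a missing system requirement.
covered = {s for links in trace_matrix.values() for s in links["sources"]}
orphans = source_requirements - covered
print("Source requirements with no system requirement:", sorted(orphans))

# Test-output trace: flag system requirements with no test and validation output.
untested = [r for r, links in trace_matrix.items() if not links["tests"]]
print("System requirements with no assigned test:", untested)
```

In this toy sample the checks surface FAC-013 as an orphan source requirement and SYS-003 as a system requirement lacking a confirming test, which is exactly the kind of gap the manual tracing activities are designed to identify.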

b. Software Requirements
1. The DI&C software requirements are derived from the DI&C system requirements.

Verify that software requirements exhibit the following attributes, as committed to in the licensing basis:

(a) Software functional requirements are individually identified.
(b) Software functional requirements are unambiguously stated.
(c) Software functional requirements specify what the software should do.
(d) Each software functional requirement is verifiable through inspection or testing of the completed software product (includes application software and source code).
(e) Software requirements are defined in a Software Requirements Specification (SRS).

2. Verify that the SRS lists and defines the following software interface requirements, as committed to in the licensing basis:

(a) Interfaces between software systems (software-to-software) are defined.
(b) External interfaces to software (user, hardware) are specified.
(c) Communication protocols are defined, including error detection methods.
(d) Input/Output (I/O), with validation methods of sensors/actuators, is adequately described.
(e) Instrumentation parameters and values are adequately described, and conversion algorithms are defined.

3. Verify that the SRS lists and defines the following software performance attributes, as committed to in the licensing basis:

(a) Static performance requirements (number of terminals, simultaneous users, etc.) are adequately described.
(b) Timing requirements are described and specified numerically.
(c) All performance requirements are individually identified.
(d) Each performance requirement is testable.

4. Verify traceability of DI&C system requirements to software requirements.

(a) Using the Traceability Matrix (or equivalent traceability tool), select a representative sample of critical software requirements and trace them backwards to the DI&C System requirements from which they were derived.
(b) Using the Traceability Matrix (or equivalent traceability tool), select a sample of DI&C System requirements and trace them forward to verify that the software requirements were correctly derived. Where possible, select system requirements that provide more than one software requirement (output), and trace the outputs accordingly.
(c) Verify that the selected software requirements are included in software test documentation.

Note - each identifiable requirement in an SRS should be traceable backwards to the system requirements and the design bases or regulatory requirements that it satisfies. Each identifiable requirement should be written so that it is also "forward traceable" to subsequent design outputs, e.g., from SRS to design, implementation, integration, and validation stages of the development project. Forward traceability to all documents resulting from the SRS includes verification and validation materials.

For example, a forward trace should exist from each requirement in the SRS to the specific test used to confirm that the requirement has been met.

Tracing activities are designed to identify any system requirements not adequately addressed by the software requirements, software requirements not meeting the intent of system requirements, and any missing software requirements.
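The forward trace from each SRS requirement to a confirming test, described in the note above, can be checked mechanically once the test documentation is indexed by the requirements each test verifies. A sketch under invented identifiers (any real inspection would use the licensee's actual SRS and test documentation):

```python
# Invented sample: sampled SRS requirements and test documentation.
srs_requirements = ["SRS-100", "SRS-101", "SRS-102"]
test_documentation = {
    "TC-01": {"verifies": ["SRS-100"]},
    "TC-02": {"verifies": ["SRS-101", "SRS-100"]},
}

# Invert the test documentation: requirement -> tests that confirm it.
tests_for = {r: [] for r in srs_requirements}
for test_id, tc in test_documentation.items():
    for req in tc["verifies"]:
        tests_for.setdefault(req, []).append(test_id)

# Forward trace: every sampled requirement must have at least one
# confirming test; anything left over is a traceability finding.
missing = [r for r in srs_requirements if not tests_for[r]]
print("Requirements lacking a confirming test:", missing)
```

Here SRS-102 surfaces as a requirement with no confirming test, illustrating the "SRS to specific test" trace the note calls for.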

c. Requirements Phase Documentation
1. Verify that development processes supporting Requirements Phase activities are documented, as committed to in the licensing basis. Verify the following:

(a) Audit requirements are specified.
(b) System and Software Requirements Review process is defined and implemented.
(c) Configuration Management process (change control, version control, audit trails, etc.) is defined and implemented.
(d) Independent Verification and Validation (IV&V) process is defined and implemented.

2. Verify the following documentation exists for the Requirements Phase activities, as committed to in the licensing basis:

(a) Audit Report and any Condition Reports for issues, nonconformances, and audit findings.
(b) Documentation of corrective action stemming from Condition Reports.
(c) Requirements Configuration Management Report; verify that configuration change orders are communicated to the IV&V and Test organizations.
(d) Requirements Verification (IV&V) Report; verify that the report confirms that there is 100% verification of software requirements by the IV&V organization.

d. Requirements Safety Analysis
1. A Safety Analysis encompassing Requirements Phase activities should be performed and documented. The analysis should determine which software requirements are critical to system safety, that all safety requirements imposed by the DI&C System design have been correctly addressed in the SRS, and that no additional hazards have been created by the SRS. Verify the following attributes and results of the Safety Analysis, as committed to in the licensing basis:

(a) DI&C system safety requirements are correctly included in the SRS.
(b) Each DI&C system safety requirement can be traced to one or more software requirements.
(c) Each software requirement can be traced to one or more system requirements.
(d) There are no missing or inconsistently specified DI&C system functions or software requirements (from the selected sample).
(e) No new hazards have been introduced to the process.
(f) Requirements that can affect safety have been identified.
(g) Safety issues and resolutions are documented.

2. Analyses of software safety requirements should be performed to determine the impact of incorrectly developed or incorporated requirements on safety-related software. Verify that the following elements have been considered:

(a) Diverse Requirements for the DI&C system and software are included based on a Diversity and Defense-in-Depth (D3) analysis.
(b) Critical/Non-Critical Software Requirement discriminators and criteria, based on a criticality analysis that determines the impact of a failure to meet software requirements, are included.
(c) Requirements Traceability, as a mechanism to determine compliance with DI&C system specifications, is included.
(d) Specific Requirements are included to ensure deterministic behavior of the DI&C system.
(e) Specific Requirements are included to address Secure Development and Operational Environment (per RG 1.152) vulnerabilities.

69020.M.A2-04 RESOURCE ESTIMATE

Appendix resource estimate listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A2-05 PROCEDURE COMPLETION

Appendix procedure completion listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A2-06 REFERENCES

Appendix references listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

END

Attachment 3: Inspection Guide for DI&C System/Software Life Cycle - Design & Implementation Phase

69020.M.A3-01 INSPECTION OBJECTIVES

01.01 Appendix inspection objectives listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

01.02 Attachment Inspection Objectives: Verify that the licensee's DI&C development Design and Implementation Phase process and documentation are consistent with commitments.

69020.M.A3-02 INSPECTION REQUIREMENTS

Appendix inspection requirements listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A3-03 INSPECTION GUIDANCE

General Guidance

The follow-on to development of software requirements is the design of the software system. Activities include the documentation, analysis, and review of the various software design processes. Design control products/documents (Design Outputs) resulting from these activities include:

- Hardware and Software Architecture Descriptions
- Software Design Specification
- Code Listings

The software design activities translate the software requirements specifications into a hardware/software architecture specification and a software design specification. Reviewing the software architecture of the digital I&C system allows the staff to understand how the high-level coded functions of the system interact to accomplish the design function. Review of the Software Design Specification enables the staff to ensure that the software code accurately reflects the software requirements.

Sample Size

Inspection of DI&C will typically rely on selection of a sample of attributes for verification.

Inspection of Design Phase activities will be accomplished through verification of attributes for a representative sample of critical design elements (typically 10-20 percent of design elements).

Traceability (through thread audit techniques) should be verified for the same representative sample of critical design elements. Sample size should be expanded at the inspector's discretion if inspection issues or findings are identified.

Specific Guidance

In addition to the guidance below, refer to Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design, for additional specific inspection guidance.

a. System Architecture
1. Verify that a Hardware/Software Architecture Description (or equivalent document) has been prepared, as committed to in the licensing basis. The Architecture Description should be correct, consistent, unambiguous, and verifiable. All major hardware and all major software processes should be included. Verify that the Architecture Description includes mapping of software to hardware, mapping of logical and physical communication paths, and descriptors for independent software elements.
2. Verify that the Architecture Description addresses the following attributes, as committed to in the licensing basis:

(a) Design architecture shows how the various hardware elements are connected together.
(b) Independent software elements are shown in the design architecture. This includes:

(1) Processes, which perform computations
(2) Files and databases, which store information
(3) I/O messages, which receive and transmit information
(4) Displays and human/machine interface (HMI)
(5) Communication, which moves information among processes, files and databases, I/O channels, and HMI

(c) Design architecture shows how the various software elements are logically connected together.
(d) Design architecture shows how each software element is mapped to a hardware element.
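The architecture mapping checks above can be illustrated with a toy architecture description. Element names and the mapping are hypothetical, invented purely to show the shape of the check:

```python
# Hypothetical architecture description: software elements mapped to hardware.
software_elements = {
    "trip_logic":   {"type": "process",  "hardware": "CPU-A"},
    "setpoint_db":  {"type": "database", "hardware": "CPU-A"},
    "operator_hmi": {"type": "display",  "hardware": "HMI-1"},
}
hardware_elements = {"CPU-A", "CPU-B", "HMI-1"}

# Every software element must map to a known hardware element.
unmapped = [name for name, e in software_elements.items()
            if e["hardware"] not in hardware_elements]
assert not unmapped, f"software elements with no hardware mapping: {unmapped}"

# Useful inverse view for the inspector: which software runs on each hardware element.
per_hw = {}
for name, e in software_elements.items():
    per_hw.setdefault(e["hardware"], []).append(name)
print(per_hw)
```

An Architecture Description that cannot support this kind of complete software-to-hardware mapping would not satisfy the attributes listed above.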

b. Software Design Specification (SDS)
1. The SDS should depict exactly how the software requirements will be implemented in software modules and programs. It should be correct, complete, internally consistent, unambiguous, verifiable, and testable. Each Design Element in the SDS should be traceable to one or more specific software requirements. Verify that each Design Element exhibits the following attributes:

(a) Name of the Design Element
(b) Type (system, subsystem, module, database, file, data structure, screen display, message, program, or process)
(c) Purpose/Function of the Design Element (why the element exists in the design)
(d) Detailed description of the ways in which the Design Element interacts with other design elements

A list of all requirements implemented by the Design Element may also be included. This is used to verify that the design will implement the requirements, and only the requirements. Some elements may implement more than one requirement, while some requirements may need several elements for a successful implementation.

2. Select a representative sample of Design Elements from the SDS. Verify that the Design Elements address the following attributes, as committed to in the licensing basis:

(a) Every software requirement listed in the Software Requirement Specification (SRS) can be traced to one or more specific design elements that implement the requirement.

(b) Every design element can be traced to one or more specific software requirements that the design element implements.

(c) Documentation exists to demonstrate that there are no unintended functions in the design.

(d) Static and dynamic structures exhibit minimal connections between design elements.

(e) Safety-critical functions are separated from normal operating functions, with well-defined interfaces between them.

(f) If any of the following concepts are used in the design, adequate justification is given for their use:

(1) Floating point arithmetic
(2) Interrupts, except for periodic timer interrupts
(3) Multi-processing on a single processor
(4) Dynamic memory management
(5) Event-driven communications between processes
(6) Time-critical subroutines

(g) Inputs to each software module are checked for validity.

(h) Auto swap-over to backup software is built into the design.

(i) For commercial-off-the-shelf (COTS) software that is part of the design, non-required features can be disabled.
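The attribute above on checking module inputs for validity is often implemented as a range and type check at each module boundary. A minimal illustrative sketch is shown below; the input names and range limits are hypothetical examples, not values from any actual plant design.

```python
# Illustrative sketch of module-boundary input validation.
# Input names and range limits are hypothetical examples only.

VALID_RANGES = {
    "coolant_temp_degC": (0.0, 350.0),
    "pressure_kPa": (0.0, 17000.0),
}

def validate_input(name: str, value: float) -> bool:
    """Accept the input only if it is a number inside its declared range."""
    if name not in VALID_RANGES:
        return False  # unknown inputs are rejected, not passed through
    low, high = VALID_RANGES[name]
    return isinstance(value, (int, float)) and low <= value <= high
```

A design exhibiting this attribute would reject out-of-range or unrecognized inputs at the module boundary rather than propagating them into safety-critical processing.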

3. Verify traceability of software requirements to SDS design elements.

(a) Using the Traceability Matrix (or equivalent tool), select a representative sample of software Design Elements and trace them backwards to the software requirements from which they were derived.

(b) Using the Traceability Matrix (or equivalent tool), select a representative sample of Design Elements and trace them forward to verify they were properly coded in the detailed design.

(c) Trace the selected software design element to the test(s) that will be used to confirm that the element has been correctly designed.

Note: Tracing activities are designed to identify any software requirements not adequately addressed by the design elements, design elements not meeting the intent of software requirements, and any missing design elements.
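The backward and forward tracing described above amounts to checking a requirement-to-element mapping in both directions. A minimal sketch follows; the requirement and design-element identifiers (SR-1, DE-1, ...) are hypothetical, not from any real SDS.

```python
# Illustrative bidirectional traceability check between software
# requirements and design elements. All identifiers are hypothetical.

# Map each design element to the requirements it implements.
trace = {
    "DE-1": ["SR-1"],
    "DE-2": ["SR-2", "SR-3"],
    "DE-3": [],  # a design element with no requirement basis
}
all_requirements = {"SR-1", "SR-2", "SR-3", "SR-4"}

# Backward trace: every design element must derive from some requirement.
unjustified_elements = [de for de, reqs in trace.items() if not reqs]

# Forward trace: every requirement must be implemented by some element.
implemented = {r for reqs in trace.values() for r in reqs}
unimplemented_requirements = sorted(all_requirements - implemented)
```

A finding in either direction (an unjustified element or an unimplemented requirement) corresponds to the gaps the Note above says tracing is designed to identify.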

c. Detailed Design & Implementation (Code Development)

1. Implementation consists of the translation of the completed software design into code and data stores. The risks involved in writing the code are that the design may not be correctly implemented in the code, or that coding errors may introduce additional hazards. Verify the following, as committed to in the licensing basis:

(a) Application software and system code is compiled in a Code Listing.

(b) All system code is documented.

2. Verify that, as the Code Listing was developed, the following Code functional characteristics were considered, as committed to in the licensing basis:

(a) Accuracy requirements were considered.

(b) The system is coded such that corrupted data will not have an adverse impact on the system.

(c) No new hazards or security threats are introduced by the Code.

(d) The Code is written such that system execution and timing are deterministic.
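The corrupted-data attribute above is commonly addressed by an integrity check on stored or transmitted data before use. The sketch below uses a simple additive checksum purely for illustration; an actual design would use whatever integrity mechanism (e.g., a CRC) the SDS specifies, and the record contents here are hypothetical.

```python
# Illustrative integrity check on a data record before use.
# The additive checksum and record contents are hypothetical examples;
# real designs typically specify a CRC or similar mechanism in the SDS.

def checksum(payload: bytes) -> int:
    """Simple additive checksum over the payload, modulo 256."""
    return sum(payload) % 256

def accept_record(payload: bytes, stored_checksum: int) -> bool:
    """Reject the record (fail safe) if the stored checksum does not match."""
    return checksum(payload) == stored_checksum
```

Rejecting a record on mismatch, rather than processing it, is what keeps corrupted data from having an adverse impact on the system.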

d. Design Phase Documentation

1. Verify that development processes supporting Design Phase activities are documented, as committed to in the licensing basis. Verify the following:

(a) Audit requirements are specified.

(b) Design Review process is defined and implemented.

(c) Configuration Management (change control) process is defined and implemented.

(d) Independent Verification and Validation (IV&V) process is defined and implemented.

2. Verify the following documentation exists for the Design Phase activities, as committed to in the licensing basis:

(a) Audit report and any Condition Reports for issues, nonconformances, and audit findings.

(b) Documentation of corrective actions stemming from Condition Reports.

(c) Design Configuration Management Report; verify that configuration change orders are communicated to the IV&V and Test organizations.

(d) Design Verification (IV&V) Report; verify that the report confirms 100% verification of design elements and correlation to code elements by the IV&V organization.

e. Design Safety Analysis

1. The Design Safety Analysis verifies that the design correctly and consistently incorporates the system safety requirements, identifies safety-critical software design elements, and detects errors that might result in violations of the system safety requirements. The analysis should ensure that all safety-critical requirements have been included in the design, and that no new design features are developed that have no basis in the requirements. The analysis should identify any conditions that would prevent safety-critical processing from being accomplished. Select a sample from the following attributes and verify that they have been included in the Safety Analysis, as committed to in the licensing basis:

(a) Logic

(1) Equations and algorithms in the software design correctly implement safety-critical requirements.
(2) Control logic in the software design completely and correctly implements the safety-critical requirements.
(3) Control logic correctly implements error handling, off-normal processing, and emergency processing requirements. The system should be recoverable (e.g., from a power loss).
(4) Design logic is such that design elements that are not considered safety-critical cannot adversely affect the operation of the safety-critical design elements. The system should be fail-safe.

(b) Data

(1) Safety-critical data items are identified by type, unit, range, and error bounds.
(2) Analog inputs (from field devices) are properly scaled.
(3) Design elements that can change a safety-critical data item are known.
(4) No interrupt will change the value of a safety-critical data item in an unanticipated manner.

(c) Interface

(1) Control linkages between design elements are correctly and consistently designed.
(2) All parameters passed between design elements are consistent in type, structure, physical units, and direction (input/output).
(3) No safety-critical data item is used before being initialized.

(d) Constraints

(1) Design constraints listed in the SRS have been followed in the design.
(2) Known external limitations on the design have been recognized and included in the design (includes hardware limitations, instrumentation limitations, operation of the system equipment, etc.).
(3) Design meets timing and sizing requirements.
(4) Equations and algorithms work across the complete range of input data item values.
(5) Equations and algorithms provide sufficient accuracy and response times as specified in the SRS.

2. A Code Safety Analysis should be performed and documented. The analysis should determine that the code correctly implements the software design, does not violate any safety requirements, and that no additional hazards have been created by the coding activity. Select a sample from the following attributes and verify that they have been included in the Code Safety Analysis, as committed to in the licensing basis:

(a) Logic

(1) Code logic correctly implements the safety-critical design criteria.
(2) Design equations and algorithms are correctly implemented in the code.
(3) Error handling design is correctly implemented in the code.
(4) Off-normal and emergency operations design is correctly implemented in the code.
(5) Non-critical code cannot adversely impact the function, timing, and reliability of safety-critical code.
(6) Interrupts included in the code will not take precedence over, or prevent the execution of, safety-critical code modules.
(7) Software failures and compensatory measures are implemented.
(8) Overrides and bypasses are implemented.

(b) Data

(1) Definition and use of data items in the code are consistent with the software design.
(2) No safety-critical data item can have its value changed in an unanticipated manner, or by an unanticipated module.
(3) No interrupt can destroy safety-critical data items.

(c) Interface

(1) Parameters passed between code modules are analyzed for consistency, including typing, structure, physical units, and number and order of parameters.
(2) Direction of parameters is consistent, both internally in the code, and externally with the software design.
(3) External interfaces are evaluated for correct format of messages, content, timing, and consistency.

(d) Constraints

(1) Adequate memory space is allocated for the safety-critical code and data structures. Consider normal, off-normal, and emergency operating modes.
(2) Actual timing of events in the code is consistent with the timing analysis performed as part of the software design.
(3) All timing requirements are met.

3. Failure Analysis

A Failure Modes and Effects Analysis (FMEA) or other failure analysis should be performed to ensure that single failure requirements associated with system safety analyses and assumptions are confirmed. The FMEA should include system architecture to ensure that key design principles of redundancy and independence have been incorporated and that single failure requirements are met (refer to NRC RG 1.53, Single Failure Criterion). The FMEA attributes are incorporated from IEEE Std 352, Reliability Analysis. Verify the following from the FMEA or equivalent failure analysis:

(a) Credible failure modes for the system/software were identified.

(b) Impact (failure effect) on the system/software was evaluated.

(c) Provisions to compensate for failure(s) are identified and included.
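The three FMEA attributes above (failure mode, failure effect, compensating provision) can be pictured as columns of a table in which no cell may be left blank. A minimal sketch follows; the failure modes, effects, and compensations listed are hypothetical examples, not entries from any actual analysis.

```python
# Illustrative FMEA record structure covering the three verified attributes:
# failure mode, failure effect, and compensating provision.
# All entries are hypothetical examples.

fmea = [
    {"mode": "Sensor input out of range",
     "effect": "Channel declared inoperable",
     "compensation": "Redundant channel; 2-out-of-3 voting logic"},
    {"mode": "Watchdog timer expires",
     "effect": "Processor halt detected",
     "compensation": "Outputs driven to fail-safe state"},
]

# Every identified failure mode must have an evaluated effect and a
# compensating provision; any blank cell is a gap in the analysis.
incomplete = [row["mode"] for row in fmea
              if not row.get("effect") or not row.get("compensation")]
```

An inspector's verification of attributes (a) through (c) corresponds to confirming that no row of such a table is incomplete.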

69020.M.A3-04 RESOURCE ESTIMATE

Appendix resource estimate listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A3-05 PROCEDURE COMPLETION

Appendix procedure completion listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A3-06 REFERENCES

Appendix references listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

END

Appendix M: Attachment 4 - Inspection Guide for DI&C System/Software Life Cycle - Integration Phase

69020.M.A4-01 INSPECTION OBJECTIVES

01.01 Appendix inspection objectives listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

01.02 Attachment Inspection Objectives: Verify that the licensee's DI&C development Integration Phase process and documentation are consistent with design commitments.

69020.M.A4-02 INSPECTION REQUIREMENTS

Appendix inspection requirements listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A4-03 INSPECTION GUIDANCE

General Guidance

Integration consists of the activities that combine the various software and hardware components into a single system. Software integration consists of three major phases:

Integrating the various software modules together to form single executable programs
Integrating the single programs with the hardware and instrumentation
Testing the resulting integrated product

The Software Requirements and the Software Detailed Design should be analyzed to determine the order for combining software components into an overall system. The Integration Plan should include the tools, techniques, and methodologies needed to perform the integration.

System Build Documents (SBDs) are generally needed to verify that the programs actually delivered and installed are the programs that underwent the V&V process and were tested. Any future maintenance, modifications, or updates will require that the maintainers know which version of the programming to modify. As a result, the System Build Documents are closely tied to the configuration management program.

Sample Size

Inspection of DI&C will typically rely on selection of a sample of attributes for verification. Inspection of Integration Phase activities will be accomplished through verification of attributes for a representative sample of integrated builds (typically 10-20 percent of builds). Additionally, traceability should be conducted for the same sample of integrated builds (software subsystem or software system) backward to the code elements contained in the build, and forward to the software field installation. Sample size should be expanded at the inspector's discretion if inspection issues or findings are identified.

Specific Guidance

In addition to the guidance below, refer to Appendix N, Inspection of Digital Instrumentation and Control (DI&C), for additional specific inspection guidance.
a. System/Software Integration

1. SBDs should be developed. The SBDs describe precisely how the system hardware and software components are combined into an operational system. The SBDs are a Design Output of the Integration Activities. From a configuration control standpoint, the items included in the SBD should be sufficient to show that the programming listed in the build documentation is identified by version, revision, and date, and that the version and revision was tested and found appropriate. Verify the following, as committed to in the licensing basis:

(a) SBDs are developed.

(b) SBDs are complete.

(c) Software build procedures are specified.

(d) SBDs include all required software units, including code and data that are part of the build.

(e) Hardware and software component names and versions are described.

(f) Locations of software and hardware components are described. Consideration is given to EMI/RFI and anti-virus protection of hardware and software components.

(g) Methods by which the hardware components are connected together and to the sensors, actuators, and terminals are described.

(h) There is documented evidence that the system made ready for verification was built in accordance with the SBD.

(i) Software listed in the SBD is identified by version, revision, and date.

(j) Software version and revision listed are the same version and revision tested.

(k) The SBDs are consistent with the software specifications, as described in the SRS, software design description, and software code.

(l) Consistent and uniform terminology, notation, and definitions are used throughout the SBD.

(m) The SBDs specify methods to detect incorrectly built software releases.

(n) The SBDs identify all errors and anomalies discovered during software build activities.
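The version and revision attributes above reduce to comparing what is installed against the build manifest, component by component. A minimal sketch follows; the component names, versions, and revisions are hypothetical examples, not drawn from any actual SBD.

```python
# Illustrative check that installed software matches the System Build
# Document (SBD) by version and revision. All entries are hypothetical.

sbd_manifest = {
    "trip_logic": ("2.1", "B"),
    "display_hmi": ("1.4", "A"),
}
installed = {
    "trip_logic": ("2.1", "B"),
    "display_hmi": ("1.3", "A"),  # mismatch: older version installed
}

# Any component whose installed (version, revision) differs from the SBD
# entry is an incorrectly built or incorrectly installed release.
mismatches = sorted(name for name, ver in installed.items()
                    if sbd_manifest.get(name) != ver)
```

A non-empty mismatch list is exactly the condition the SBD's detection methods (attribute (m) above) are intended to surface.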

2. An integration strategy should be developed to incorporate procedures, tools, and methods for managing the system/software integration effort. The strategy may be detailed in the Software Integration Plan. Verify the following, as committed to in the licensing basis:

(a) Acceptance criteria are used to determine, measure, and analyze the success or failure of the integration effort.

(b) Integration instructions provide the technical guidance needed to carry out each integration step.

(c) Procedures for the integration steps describe the input items (hardware, instrumentation, software, and data) for the steps.

(d) Procedures describe the integration process for each step.

(e) Procedures list the outputs of each integration step.

(f) Contingency procedures/strategies are in place for an incomplete integration.

(g) Procedures exist for delivering the completed integration product (System or Subsystem) to the configuration management organization.

(h) Procedures exist for delivering the completed integration product (System or Subsystem) to the V&V organization for integration testing.

(i) Integration tools are qualified for use.

(j) Methods and controls are in place for software integration.

(k) Methods and controls are in place for hardware/software integration.

(l) Methods and controls are in place for system integration (if multiple vendors are involved).

(m) Procedures and documentation describe all integration testing to be conducted and expected test results. Consideration is given to Black Box/White Box testing.

b. Integration Phase Documentation

1. Verify that development processes supporting Integration Phase activities are documented, as committed to in the licensing basis. Verify the following:

(a) Audit requirements are specified.

(b) Design Review process is defined and implemented.

(c) Configuration Management (change control) process is defined and implemented.

(d) Independent Verification and Validation (IV&V) process is defined and implemented.

2. Verify the following documentation exists for the Integration Phase activities, as committed to in the licensing basis:

(a) Integration Phase Audit Report.

(b) Integration Phase condition reports and associated corrective actions.

(c) Integration Phase Configuration Management Report.

(d) Integration Phase IV&V Report; verify that the report confirms 100% verification of SBDs and integration elements by the V&V organization.

c. Integration Safety Analysis

1. An Integration Safety Analysis should be performed and documented. The analysis should determine that the complete system does not violate any safety requirements and that no additional hazards have been created by the integration activity. Verify the following, as committed to in the licensing basis:

(a) Procedures exist for hazard analysis of the integration effort.

(b) Hazard analysis concludes that the integrated system does not introduce new hazards to the developmental effort.

(c) Procedures exist for risk assessment of the integration effort.

2. The risk of an incorrect integration activity is that the system will not operate as intended, and that this will not be discovered until actual operation, possibly during an emergency. Verifying that the integration activity has been successfully completed is part of the V&V inspection, analysis, and test activities. Verify the following, as committed to in the licensing basis:

(a) Integration V&V activities demonstrate that all unit and subsystem tests required by the SVVP were successfully completed.

(b) Anomalies or errors found during integration tests are resolved and documented. Final integration tests should be completed and documented.

(c) Reports are consistent with the Software Integration Plan.

69020.M.A4-04 RESOURCE ESTIMATE

Appendix resource estimate listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A4-05 PROCEDURE COMPLETION

Appendix procedure completion listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A4-06 REFERENCES

Appendix references listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

END

Appendix M: Attachment 5 - Inspection Guide for DI&C System/Software Life Cycle - Validation & Test Phase

69020.M.A5-01 INSPECTION OBJECTIVES

01.01 Appendix inspection objectives listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

01.02 Attachment Inspection Objectives: Verify that the licensee's DI&C development Validation and Test Phase process and documentation are consistent with design commitments.

69020.M.A5-02 INSPECTION REQUIREMENTS

Appendix inspection requirements listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A5-03 INSPECTION GUIDANCE

General Guidance

The DI&C system validation activities are the test and evaluation of the integrated system to ensure compliance with functional, performance, and interface requirements. This determination may include analysis and evaluation of products and processes. Testing will determine if the requirements for the DI&C system are complete and correct. Accordingly, V&V will determine if the products of each life cycle development phase implement the requirements to meet the criteria imposed by the previous life cycle phase, and that the final DI&C system complies with specified requirements. The V&V effort is governed by the V&V Plan, and activities are typically documented as follows for each life cycle phase:

V&V Task Activity Reports
V&V Summary Reports

Techniques and methods for V&V include document review, design review, requirements traceability, and testing. A Requirements Traceability Matrix (RTM, or equivalent tracing tool) may be developed. The RTM should depict every distinct DI&C developmental requirement and sub-requirement, and what portion of the software requirement, software design description, detailed design (code), and test will address each distinct requirement.

Sample Size

Inspection of DI&C will typically rely on selection of a sample of attributes for verification. Inspection of Validation and Test Phase activities will be accomplished through verification of attributes for a representative sample of documents (typically 10-20 percent of a combination of test procedures, test plans, and test reports) stemming from testing activities. Sample size should be expanded at the inspector's discretion if inspection issues or findings are identified.
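The RTM described in the general guidance can be pictured as a per-requirement coverage record across the life cycle products (design, code, test). A minimal sketch follows; the requirement and artifact identifiers are hypothetical, not from any real RTM.

```python
# Illustrative Requirements Traceability Matrix (RTM): each requirement
# maps to the design, code, and test artifacts that address it.
# All identifiers are hypothetical examples.

rtm = {
    "SR-1": {"design": "DE-1", "code": "mod_trip.c", "test": "TP-101"},
    "SR-2": {"design": "DE-2", "code": "mod_hmi.c", "test": None},
}

# Flag each requirement that has a gap in any life cycle product.
gaps = {req: [phase for phase, artifact in row.items() if not artifact]
        for req, row in rtm.items() if not all(row.values())}
```

A requirement appearing in the gap report (here, a requirement with no associated test) is the kind of finding the inspector's traceability sample is intended to surface.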

Specific Guidance

In addition to the guidance below, refer to Appendix N, Inspection of Digital Instrumentation and Control (DI&C), for additional specific inspection guidance.
a. Verification and Validation (V&V)

1. Metrics/indicators should be developed to determine overall effectiveness of the V&V effort. Verify the following V&V attributes, as committed to in the licensing basis:

(a) Criteria are used to verify the completion of each V&V task.

(b) Evaluation criteria are provided for test plans, test specifications, test procedures, and test cases.

(c) Evaluation criteria are provided for review plans, review specifications, and review procedures.

(d) The error rate found during software reviews and software testing is measured, recorded, analyzed, and reported. Note: the FMEA should have identified potential errors.

2. Procedures should be developed to govern each V&V task. Verify procedures are in place for the following, as committed to in the licensing basis:

(a) V&V task performance, including assumptions for each task.

(b) Evaluation of design outputs.

(c) Analysis and reassessment of errors detected during the V&V effort.

(d) Risk assessment for DI&C developmental activities.

(e) Assessment and management of changes in DI&C developmental activities.

(f) V&V reporting practices.

(g) V&V testing and test documentation, including testing plans, specifications, procedures, and cases.

b. System/Software Testing

1. Activities include unit testing, integration (subsystem) testing, validation testing, installation (acceptance) testing, and the regression testing of modifications. Test procedures include test documentation requirements, readiness and evaluation criteria, error reporting, and anomaly resolution requirements. Verify that the testing process and documentation include the following attributes, as committed to in the licensing basis:

(a) Documentation includes test item descriptions, test data, and test logs.

(b) Test plans and procedures identify test personnel.

(c) Results documentation includes types of observations, results, acceptability, and actions taken in connection with any deficiencies.

(d) Software testing process includes one or more tests for each requirement in the software requirement specification (SRS), as well as the acceptance criteria for each test.

(e) The result of each test clearly shows that the associated requirement has been met.

(f) Test procedures contain detailed information for the test setup, input data requirements, output data expectations, and completion time.

(g) Provisions are in place for remediation testing, as necessary.

2. Testing for DI&C systems includes software testing, software integration testing, software qualification testing, system integration testing, and system qualification testing. The objective of Test V&V is to ensure that the software requirements and system requirements allocated to software are validated by execution of integration, system, and acceptance tests. Verify that Test V&V includes the following activities, as committed to in the licensing basis:

(a) Traceability analysis using the RTM.

(b) Integration V&V test procedure generation and execution.

(c) System V&V test procedure generation and execution.

(d) Acceptance V&V test procedure generation and execution.

(e) Hazard, Risk, and Security analyses.

c. Validation & Test Phase Safety Analysis

1. Analysis should be performed of software testing to show test coverage for all software safety requirements. Analysis should also be performed on testing results for safety-critical design elements.

2. Analysis should include the following, as committed to in the licensing basis:

(a) Relationship between each test and the safety requirement that the test supports.

(b) Evidence to determine if each software safety requirement has been satisfactorily tested.

(c) Assessment of risk associated with implementation as indicated by test analysis.

(d) Recommendation as to whether or not adequate testing has been performed.

d. Documentation

1. Documents that summarize V&V activities, including analyses and test plan development, should be generated. Verify a sample from the following, as committed to in the licensing basis:

(a) Requirements Phase V&V includes:

(1) Traceability analysis
(2) Software requirements evaluation
(3) Interface analysis
(4) Criticality analysis
(5) System V&V test plan generation
(6) Acceptance V&V test plan generation
(7) Configuration management assessment

(b) Design Phase V&V includes:

(1) Traceability analysis
(2) Software design evaluation
(3) Interface analysis
(4) Criticality analysis
(5) Component V&V test plan generation
(6) Integration V&V test plan generation
(7) Component V&V test design generation
(8) Integration V&V test design generation
(9) System V&V test design generation
(10) Acceptance V&V test design generation

(c) Implementation Phase V&V includes:

(1) Traceability analysis
(2) Source code and source code documentation evaluation
(3) Interface analysis
(4) Criticality analysis
(5) Component V&V test case generation
(6) Integration V&V test case generation
(7) System V&V test case generation
(8) Acceptance V&V test case generation
(9) Component V&V test procedure generation
(10) Integration V&V test procedure generation
(11) System V&V test procedure generation
(12) Component V&V test execution

2. Reports that detail V&V activities should be developed. Verify the following, as committed to in the licensing basis:

(a) V&V Task Reports are developed that provide details of evaluations and analyses (activities) completed during each life cycle phase. Task Reports should also include V&V safety assessments performed as stipulated by the Software Safety Plan.

(b) V&V Summary Reports are developed for the Requirements, Design, and Implementation (Detailed Design) Phases.

(c) V&V Final Report details the validation testing that was completed, problems encountered, and disposition of issues.

69020.M.A5-04 RESOURCE ESTIMATE

Appendix resource estimate listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A5-05 PROCEDURE COMPLETION

Appendix procedure completion listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A5-06 REFERENCES

Appendix references listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

END

Appendix M: Attachment 6 - Inspection Guide for DI&C System/Software Life Cycle - Installation Phase

69020.M.A6-01 INSPECTION OBJECTIVES

01.01 Appendix inspection objectives listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

01.02 Attachment Inspection Objectives: Verify that the licensee's DI&C development Installation Phase process and documentation are consistent with design commitments.

69020.M.A6-02 INSPECTION REQUIREMENTS

Appendix inspection requirements listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A6-03 INSPECTION GUIDANCE

General Guidance

In installation and checkout, the software product is installed and tested in the target environment. Part of the V&V activity supports the software system installation activities. The objective of installation and checkout V&V is to verify and validate the correctness of the software installation in the target environment.

Installation activities include:

Installation Configuration Table development
Installation Configuration audit
Installation checkout
Safety Analysis
Acceptance Testing

Sample Size

Inspection of DI&C will typically rely on selection of a sample of attributes for verification. Inspection of Installation Phase activities will be accomplished through verification of attributes for a representative sample of installation configuration items (typically 10-20 percent of installation configuration items). Traceability (through thread audit techniques) should be verified for the same representative sample of installation configuration items. Sample size should be expanded at the inspector's discretion if inspection issues or findings are identified.

Specific Guidance

In addition to the guidance below, refer to Appendix N, Inspection of Digital Instrumentation and Control (DI&C), for additional specific inspection guidance.


a. Installation Configuration Tables

1. Installation Configuration Tables (ICTs) should be produced. They should include functional and process characteristics to ensure that the software will be correctly configured in the operating DI&C system. Verify the following for the ICTs, as committed to in the licensing basis:

(a) ICTs configure the installed system to have the functionality that is required for the plant.

(b) ICTs are consistent with the software specifications, as described in the SRS, software design description, software code, and SBDs.

(c) ICTs contain all target environment (plant-specific) data.

(d) ICTs are traceable; installed elements can be traced to the integrated software elements that created each installed program element.

(e) ICTs allow for analysis, review, or test of each installed element of the software system.

2. Any software item which is changeable should have the intended configuration recorded in the ICT. Verify a sample of configuration item setpoints agree with values determined from setpoint calculations, as committed to in the licensing basis.
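Verifying that configured setpoints agree with values determined from setpoint calculations, as in item 2 above, is essentially a tolerance comparison between the as-installed ICT and the calculation results. A minimal sketch follows; the setpoint names, values, and tolerance are hypothetical examples, not from any actual plant.

```python
# Illustrative comparison of as-installed ICT setpoints against values
# from setpoint calculations. Names, values, and the tolerance are
# hypothetical examples only.

calculated = {"high_flux_trip_pct": 109.0, "low_flow_trip_pct": 87.0}
installed = {"high_flux_trip_pct": 109.0, "low_flow_trip_pct": 85.0}
tolerance = 0.5  # acceptable absolute deviation (hypothetical)

# Flag any setpoint whose installed value deviates beyond the tolerance.
discrepancies = sorted(
    name for name, calc in calculated.items()
    if abs(installed[name] - calc) > tolerance
)
```

A non-empty discrepancy list would be the inspection finding: a configured item that does not agree with its setpoint calculation.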
b. Installation Activities

1. Installation configuration audit: verify the following, as committed to in the licensing basis:

(a) All software products required to correctly install and operate the software are present in the installation package.

(b) Installation procedures are developed and validated.

(c) Installation test documentation is developed and validated.

(d) Anomaly and error reporting procedures are developed.

2. Installation Checkout: verify the following are performed as part of installation, as committed to in the licensing basis:

(a) Analyses or tests to verify that the installed software corresponds to the software subjected to V&V.

(b) Verification that the software code and databases initialize, execute, and terminate as specified.

(c) Verification that, in the transition from one version of software to the next, the software can be removed from the system without affecting the functionality of the remaining system components.


3. User documentation: verify that Operations, Maintenance, and Training Manuals are developed for the DI&C system software.
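The checkout verification above, confirming that the installed software corresponds to the software subjected to V&V, is commonly performed by comparing file hashes against a baseline manifest from the validated release. The sketch below assumes this approach; the file name, contents, and manifest are all hypothetical.

```python
# Illustrative sketch: confirm installed software matches the V&V'd release
# by comparing SHA-256 hashes against a baseline manifest. The file name
# and contents are hypothetical placeholders.

import hashlib


def sha256_of(data: bytes) -> str:
    """Hash the raw bytes of an installed software product."""
    return hashlib.sha256(data).hexdigest()


# Baseline manifest recorded for the validated release (hypothetical)
VV_BASELINE = {
    "rps_logic.bin": sha256_of(b"validated build"),
}


def verify_installed(installed_files: dict) -> list:
    """Return names of installed files missing or differing from the baseline."""
    mismatches = []
    for name, expected in VV_BASELINE.items():
        actual = installed_files.get(name)
        if actual is None or sha256_of(actual) != expected:
            mismatches.append(name)
    return mismatches


# Simulated read of the installed files
installed = {"rps_logic.bin": b"validated build"}
print("mismatches:", verify_installed(installed))
```

An empty mismatch list indicates the sampled installed products are byte-identical to the validated baseline; any entry would be dispositioned through the anomaly reporting procedures described above.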
c. Safety Analysis
1. Verify that a Hazard Analysis has been performed for the installation activities.
2. Verify that a security analysis has been performed and that its results conclude that the installed software does not introduce new or increased vulnerabilities, or physical and/or secure development and operational environment (SDOE) risks, to the overall DI&C system.
3. Verify that a risk assessment has been performed to provide recommendations to eliminate, reduce, or mitigate risks associated with the installed software.
4. Verify that corrective actions have been initiated for any deficiencies identified as a result of the hazard analysis, security analysis, and risk assessment.
d. Installation Testing
1. The installation (acceptance) test activities should document the test plan, test configuration, the required inputs, expected outputs, the steps necessary to execute the test, and the acceptance criteria for each test. Verify the following, as committed to in the licensing basis:

(a) Acceptance Test procedures are developed.

(b) Issues identified during the test activity, and any action items required to mitigate or eliminate each issue, are documented. Installation problems and their resolution should be documented.

2. Test Reporting: an Acceptance Test Report should be produced describing the execution of the plan and summarizing the results. Verify the following, as committed to in the licensing basis:

(a) The report contains a statement that the plan was successfully executed and the system is ready for operation.

(b) The report documents that the DI&C system operates correctly and is identical to the system that was validated during the validation phase.

(c) The report summarizes the test results after all problems have been satisfactorily resolved.
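The acceptance test documentation elements listed in this section (configuration, inputs, expected outputs, steps, acceptance criteria, and issue tracking) can be sketched as a simple record structure. All field names and example values here are illustrative assumptions, not a prescribed format.

```python
# Illustrative sketch: a minimal record for documenting one acceptance
# test case and evaluating its result. Field names and values are
# hypothetical, not a prescribed NRC format.

from dataclasses import dataclass, field


@dataclass
class AcceptanceTestCase:
    test_id: str
    configuration: str              # test configuration description
    inputs: dict                    # required inputs
    expected_outputs: dict          # expected outputs
    steps: list                     # steps necessary to execute the test
    acceptance_criteria: str
    actual_outputs: dict = field(default_factory=dict)
    issues: list = field(default_factory=list)  # issues and mitigating action items

    def passed(self) -> bool:
        """Pass when actual outputs match expectations and no open issues remain."""
        return self.actual_outputs == self.expected_outputs and not self.issues


case = AcceptanceTestCase(
    test_id="AT-001",
    configuration="Installed DI&C system with plant-specific ICT loaded",
    inputs={"trip_signal": "simulated high flux"},
    expected_outputs={"trip_actuated": True},
    steps=["Inject simulated high-flux signal", "Observe trip actuation"],
    acceptance_criteria="Trip actuates within the specified response time",
)
case.actual_outputs = {"trip_actuated": True}
print(case.test_id, "PASS" if case.passed() else "FAIL")
```

A collection of such records, with every case passing and all issues resolved, would support the Acceptance Test Report statements described in items (a) through (c) above.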

69020.M.A6-04 RESOURCE ESTIMATE

The appendix resource estimate is listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A6-05 PROCEDURE COMPLETION

The appendix procedure completion requirements are listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

69020.M.A6-06 REFERENCES

The appendix references are listed in Appendix M, Inspection of Digital Instrumentation and Control (DI&C) System/Software Design.

END

Revision History for IP 69020 Appendix M

Commitment Tracking Number: N/A
Accession Number: ML24264A205
Issue Date: 03/25/25
Change Notice: CN 25-005
Description of Change: Procedure was rewritten for conformance with changes to IMC 2550 and is now a standalone appendix to IP 69020.
Description of Training Required and Completion Date: N/A
Comment and Feedback Resolution Accession Number (Pre-Decisional, Non-Public): N/A