ML18072A147

NRR E-mail Capture - (External_Sender) Industry Feedback for March 14th RIS Public Meeting - Public Comments Related to Digital I&C RIS 2002-22, Supplement 1
Person / Time
Site: Nuclear Energy Institute
Issue date: 03/12/2018
From: Hanson J
Nuclear Energy Institute
To: Eric Benner, Tekia Govan
Division of Engineering, Division of Licensing Projects
References
Download: ML18072A147 (36)


Text

NRR-DMPSPEm Resource

From: HANSON, Jerud <jeh@nei.org>
Sent: Monday, March 12, 2018 1:54 PM
To: Benner, Eric; Govan, Tekia
Cc: REMER, Jason; HANSON, Jerud
Subject: [External_Sender] Industry Feedback for March 14th RIS Public Meeting
Attachments: NEI Members Comment Worksheet - March 2018 RIS Version 3-12-2018.xlsx; ML18051A084 - RIS 2017-XX - With Line Numbers.pdf; RIS Operating Experience Section Suggested Rewrite.pdf; Table 2 Suggested Revision.pdf

Eric/Tekia,

Attached you will find documents detailing comments provided by NEI members on the latest version of the RIS. There are a total of 34 comments for our discussion on Wednesday, with the following breakdown:

  • Priority 1 (Showstoppers) - 18
  • Priority 2 (Important and should be incorporated) - 11
  • Priority 3 (Editorial) - 5

For each of the comments, we have provided a suggested rewrite that incorporates our recommendation. In some cases, the proposed rewrite is included in a separate document attached to this email. Each comment is identified with a line number corresponding to the attached version of the RIS with line numbers added. I recommend we start with the Priority 1 comments and work our way down the list. The spreadsheet is arranged such that the Priority 1 comments are first, followed by the Priority 2, and then Priority 3. As we work our way through the comments, we will refer to the supplemental attachments provided, which include a suggested rewrite of the operating experience section as well as Table 2. I think it would also be very helpful if you could provide hard copies during the public meeting.

Please contact me with any questions.

Thank you,
Jerud

Jerud E. Hanson | Sr. Project Manager, Life Extension & New Technology
1201 F Street, NW, Suite 1100 | Washington, DC 20004
P: 202.739.8053  M: 202.497.2051
nei.org

This electronic message transmission contains information from the Nuclear Energy Institute, Inc. The information is intended solely for the use of the addressee, and its use by any other person is not authorized. If you are not the intended recipient, you have received this communication in error, and any review, use, disclosure, copying or distribution of the contents of this communication is strictly prohibited. If you have received this electronic transmission in error, please notify the sender immediately by telephone or by electronic mail and permanently delete the original message.

IRS Circular 230 disclosure: To ensure compliance with requirements imposed by the IRS and other taxing authorities, we inform you that any tax advice contained in this communication (including any attachments) is not intended or written to be used, and cannot be used, for the purpose of (i) avoiding penalties that may be imposed on any taxpayer or (ii) promoting, marketing or recommending to another party any transaction or matter addressed herein.

Sent through www.intermedia.com

Hearing Identifier: NRR_DMPS
Email Number: 244

Mail Envelope Properties (1290D5881ECCFA4FBF93A2D70AC229067C7CC765)

Subject: [External_Sender] Industry Feedback for March 14th RIS Public Meeting
Sent Date: 3/12/2018 1:53:37 PM
Received Date: 3/12/2018 1:54:12 PM
From: HANSON, Jerud
Created By: jeh@nei.org

Recipients:
  "REMER, Jason" <sjr@nei.org> (Tracking Status: None)
  "HANSON, Jerud" <jeh@nei.org> (Tracking Status: None)
  "Benner, Eric" <Eric.Benner@nrc.gov> (Tracking Status: None)
  "Govan, Tekia" <Tekia.Govan@nrc.gov> (Tracking Status: None)

Post Office: MBX023-E2-VA-2.EXCH023.DOMAIN.LOCAL

Files (size, date & time):
  MESSAGE - 2614 - 3/12/2018 1:54:12 PM
  NEI Members Comment Worksheet - March 2018 RIS Version 3-12-2018.xlsx - 28896
  ML18051A084 - RIS 2017-XX - With Line Numbers.pdf - 321831
  RIS Operating Experience Section Suggested Rewrite.pdf - 29849
  Table 2 Suggested Revision.pdf - 64825

Options:
  Priority: Standard
  Return Notification: No
  Reply Requested: No
  Sensitivity: Normal
  Expiration Date:

Recipients Received:

INDUSTRY COMMENTS TO MARCH 2018 DRAFT RIS 17-XX SUPPLEMENT-1 TO RIS 2002-22
Columns: Line No. | Page No. | Priority | Industry Comment | Recommended Change | Additional Comments | Action Taken

Line 61 | Page 2 of 5 | Priority 1
Industry Comment: Stating that DI&C upgrades associated with the RPS/ESFAS are out of scope for the RIS Supplement has the potential to communicate that SSCs supporting or actuated by the RPS/ESFAS logic would also be off the table. This significantly limits the SSCs available to be upgraded under this RIS Supplement.
Recommended Change: Suggest replacing the two sentences starting on line 61 with: "This RIS Supplement is not directed toward large-scale analog-to-digital upgrades of the reactor trip and engineered safety features actuation logic, since application of the guidance in this RIS Supplement to such changes would likely involve additional considerations. This RIS Supplement does not provide new design process guidance; however, it does highlight vulnerabilities that could be introduced by a digital modification that should be considered in the design process."

Line 445 | Page 4 of 17 | Priority 1
Industry Comment: Section 3: Suggest revising (or deleting) the wording in Section 3 (lines 446 to 491). The existing wording will be interpreted by licensees such that installation of a non-safety related DCS would require a LAR to implement. Additionally, item 1.c would prevent a licensee from making a simple one-for-one replacement of antiquated analog/pneumatic sequencer timing relays with modern timing relays containing an embedded digital device. Item 3 reintroduces 100% testing, which is not possible to achieve with digital software, and then introduces an input/output state analysis. Licensees will assume this requirement applies to non-safety related equipment as well as safety related equipment.
Recommended Change: Suggested wording: "The following examples of proposed changes are considered within the scope of this RIS. The list is by no means inclusive and is simply provided to illustrate the nature and relative complexity of proposed changes targeted by the RIS.
  • A one-for-one replacement of analog timing relays with digital timing relays on redundant load sequencer trains
  • Replacement of emergency diesel generator (EDG) analog voltage regulators with digital voltage regulators
  • Replacement of EDG auxiliary support system analog controls with digital controls
  • Replacement of turbine driven auxiliary feedwater system controls with digital controls
  • Installation of safety related breakers and relays (including timing relays) containing embedded digital devices
  • Replacement of safety related analog and electromechanical protective relays with digital multifunction relays
  • All digital upgrades to non-safety related systems are within the scope of this RIS
Proposed changes that employ digital equipment with complex application software in redundant safety related trains are considered beyond the scope of this RIS (e.g., a complete analog-to-digital RPS upgrade). Additionally, proposed changes that add new cross-channel communications between redundant safety related trains/equipment would also be considered beyond the scope of this RIS."

Line 482 | Page 5 of 17 | Priority 1
Industry Comment: Section 3, Item 1c) should be deleted from the RIS, as the RIS already excludes RPS and ESFAS. With respect to emergency load sequencers, this is a new requirement that was not previously contained in any of the I&C guidance documents. The emergency load sequencers are clearly not part of the RPS or ESFAS. They are within the scope of the onsite AC power system, with guidance contained in SRP Section 8.3.
Recommended Change: Remove this section from the RIS. Including the emergency load sequencers would appear to be a backfit issue.

Line 485 | Page 5 of 17 | Priority 1
Industry Comment: Section 3, Item 2: "reduces redundancy, diversity, separation or independence" - if these attributes are design features, rather than design or regulatory requirements for that design function, it may not be necessary to maintain that level of UFSAR-described redundancy, diversity, separation, or independence for the design to be acceptable. In other words, if these attributes are not "credited," then maintaining those design features may not be required. This is particularly true for non-safety related systems, where these attributes are not typically required.
Recommended Change: Suggest eliminating this section. If it is kept, then suggest clarifying that the impact of reducing these attributes on the plant safety analysis needs to be assessed.

Line 489 | Page 5 of 17 | Priority 1
Industry Comment: The industry has previously commented on the use of the term "100%" testing and the terms "simple" or "simplicity". These should be noted as examples of, but not the only examples of, design attributes that can be used in conjunction with other things, such as quality and operating experience.
Recommended Change: Suggest removing this from this section and correcting it in the later table in the RIS attachment. Industry and NRC had previously settled on the term "highly testable". For example, a circuit breaker or timing relay with an EDD would be an example of a digital device that is "highly testable": each has one input and one output and can be easily tested to verify full functionality and repeatability.

Line 573 | Page 7 of 17 | Priority 1
Industry Comment: Section 3.1.2, Quality of the Design Process: This comment applies to lines 573 through 590, the two paragraphs discussing industry standards. Licensees will interpret this guidance to conclude that non-safety related equipment must comply with industry standards.
Recommended Change: Consider replacing the last three paragraphs of this section with the following:
"Quality of the design process is a key element in determining the dependability of SSCs affected by proposed modifications. Licensees employing design processes consistent with their NRC-approved quality assurance programs will result in a quality design process."
"The use of applicable industry consensus standards contributes to a quality design process and provides a previously established acceptable approach. In some cases, other nuclear or non-nuclear standards also provide technically justifiable approaches that can be used if confirmed applicable for the specific application."
"For non-safety related SSCs, adherence to generally accepted quality standards is sufficient. The qualitative assessment should list the generally accepted industry standards utilized by the equipment manufacturer. If an equipment manufacturer used NRC-endorsed industry standards during the design and/or manufacturing process for non-safety related equipment, these standards should be documented in the qualitative assessment to provide additional evidence of quality."
"Specific NRC-endorsed industry standards may be required for qualification of safety related equipment depending on the licensee's Appendix B quality assurance program and specific commitments made within their licensing bases. The qualitative assessment should provide evidence that the required industry standards were used in the design as applicable. Any additional industry standards used should also be documented, as this can help bolster the quality argument."

Line 592 | Page 7 of 17 | Priority 1
Industry Comment: Section 3.1.3, Operating Experience: Several issues noted. The language seems focused on applicability with specific cited references or evidence. Examples:
  • Page 8, 1st paragraph: "with comparable performance standards and operating environments."
  • Page 8, 2nd paragraph: "In all cases, the architecture of the referenced equipment."
  • Page 8, 3rd paragraph: "what operating conditions were experienced by the reference design."
At the site inspection level, this type of language would be problematic because it appears to focus on traceability of documented evidence rather than on evaluating and using operating history to inform the design. Vendors will not usually provide names of customers associated with a given problem report, so traceability or specific references to environmental conditions and other design attributes are not generally possible to obtain.
Recommended Change: See the industry-provided write-up for the Operating Experience section. Be less prescriptive on the specific evidence required, focus more on the value operating history can provide, and define what industry should be looking for, such as:
  • Operating history should be viewed as the largest test bed you could ask for. What you want is a large population used across various industries and in a vast range of operating conditions, many different applications, etc. The vendor should have published requirements (e.g., temperature, humidity, voltage limits) for the product. Licensees have an engineering design process that verifies plant environments against these published vendor design specs. This should be left to engineering judgment; the RIS should not require specific environmental evidence as part of the operating history evaluation.
  • Another area to look at is whether the vendor has a corrective action or problem reporting structure, what its threshold for problem reporting is, and whether the data is used to fix and improve the product. This was touched on in the last sentence of page 8, paragraph 1.

Line 592 | Page 7 of 17 | Priority 1
Industry Comment: Section 3.1.3, Operating Experience (continued):
  • It is not clear what we are looking for operating experience to do for us:
    o Provide a failure rate number, or
    o Provide insight into the vendor's product and processes.
    o The insights can be a lot better than the calculated failure rates.
  Failure rate numbers are normally military spec numbers based upon sub-component failure rates and are normally inflated (i.e., a module has a failure rate of 1000 yrs.). We should be looking at real failure rate numbers based upon operating history.
  Applicability in the past has meant documenting a specific software/hardware version. Also, looking at operating history across revision changes is very valuable. It can provide supporting evidence along with identifying weaknesses in the vendor's programs (e.g., SQA, change management and testing).
Recommended Change: Additional suggestions: The OE evaluation should not be specifically focused on a failure rate number. Evaluations should look at the number of failures, but they should also look at the data (the failure report descriptions) for weaknesses.
    o Review the problem reports (filtering on specific areas of interest, such as redundancy, helps to manage large amounts of data).
    o Repetitive failures would point to a weakness in processes.
    o Look for software failures vs. hardware failures (hardware may be more of a concern).
    o Do not limit operating experience to specific revisions (this provides insight into the vendor change process).
    o Consider the problem reporting process (is the data representative, and what are the reporting thresholds) and what is done with the data (did they fix the issue).
  • Table 1, 3rd bullet: User configurable software applications should not be an issue if you have a large population set. Operating experience of user configurable devices will cover a large number of different and representative configurations and provide valuable insights.

Line 628 | Page 9 of 17 | Priority 1
Industry Comment: Table 1, Design Attributes.
Recommended Change:
Second bullet: With respect to watchdog timers, suggest changing to the following to clarify: "Watchdog timers that interface with but operate independently of the software."
Fourth bullet: Suggest changing "simple (i.e., enabling 100 percent testing or comprehensive testing in combination with analysis of likelihood of occurrence of input/output states not tested)" to "(i.e., highly testable)" as stated on line 334 of the RIS.
Fifth bullet: Suggest changing this item to the following: "Assurance that failures in the new equipment either (1) place the affected SSC in a safe state, (2) are equivalent to or bounded by the failure state of the equipment being replaced, or (3) result in a failure state that is inconsequential at the plant level."
The last version of the RIS had the following Design Attribute: "Unlikely series of events - evaluation of a given digital I&C modification requires postulating multiple independent detected and/or undetected random failures in order to arrive at a state in which a CCF is actually a concern." Industry would like NRC to consider maintaining this Design Attribute.

Line 628 | Page 9 of 17 | Priority 1
Industry Comment: Table 1, Quality of the Design Process.
Recommended Change: Industry would like NRC to consider changing this part of Table 1 to the following:
Safety related equipment:
  • Compliance with industry software standards as applicable and required by the plant's licensing basis, and preferably those consensus standards currently endorsed by the NRC.
  • For non-NRC-endorsed codes and standards, the licensee should provide a documented explanation for why use of the particular non-endorsed software or system standard is acceptable.
  • Demonstrated qualification testing to withstand the environmental conditions within which the SSC is credited to perform its design function (e.g., temperature, humidity, seismic, and EMI/RFI susceptibility), as well as not creating unacceptable EMI/RFI emissions.
Non-safety related equipment:
  • Adherence to generally accepted quality standards.
  • Documentation that equipment manufacturer specifications for temperature, humidity, and EMI/RFI emissions and susceptibility meet or exceed the levels of the equipment being replaced.

Line 628 | Page 9 of 17 | Priority 1
Industry Comment: Table 1, Operating Experience.
Recommended Change: Consider updating this portion of Table 1 based on the previous comments on Operating Experience.

Line 730 | Page 12 of 17 | Priority 1
Industry Comment: The statement "Sources of CCF could include the introduction of identical software into redundant channels, the use of shared resources; or the use of common hardware and software among systems performing different design functions." implies that a licensee must evaluate every occurrence within the facility where a digital device is being used. This is a new concept for licensees. A licensee does not currently perform an evaluation of the entire plant to determine if a new piece of equipment to be installed as part of a plant modification happens to be used in some other plant system.
Recommended Change: Consider deleting Section 4.3, Failure Analysis. Licensees already have procedural design guidance on the development of various types of failure analyses, including Failure Modes and Effects Analyses, Single Failure Analyses, and Software Hazard Analyses, to name a few. As an alternate suggestion, consider removing "or the use of common hardware and software among systems performing different design functions".
Additional Comments: Section 4.3, Failure Analyses, is particularly troubling in that it requires a licensee to evaluate every SSC in the plant for each design change to identify whether a given digital component was previously installed. The licensee is then required to perform an evaluation to determine the potential for CCF across SSCs that use a similar digital device. This in and of itself would turn out to be a very difficult research project. For example, assume a licensee desires to install a digital timing relay in a given SSC. Per the guidance, the licensee would have to identify any other place in the plant where that relay was used and then determine if there is a potential for CCF. Or perhaps a licensee desires to replace an analog Rosemount transmitter with a digital Rosemount transmitter in a given system. Per this guidance, the licensee would have to evaluate the thousands of Rosemount digital transmitters installed across the plant to identify if there is a potential for CCF. This is simply not reasonable. In some cases, the research to comply with this portion of the guidance would take longer than the time needed to complete development of the engineering change package needed to implement the change.

Line 807 | Page 14 of 17 | Priority 1
Industry Comment: Consider deleting Figure 1. This figure leads a licensee to believe that only a Dependability Evaluation is needed. Licensees would prefer to have the RIS provide guidance strictly on Qualitative Assessments for determining equipment/SSC reliability, and then use the Qualitative Assessment to support arguments made in the 50.59 Evaluation.

Line 810 | Page 15 of 17 | Priority 1
Industry Comment: Table 2 - See the industry version of Table 2.

Line NA | Page NA | Priority 1
Industry Comment: This document provides a lot of guidance on I&C design, in particular for non-safety related systems. In some cases, the requirements go beyond what is required by the operational quality assurance program used by licensees and by the design and licensing bases requirements for non-safety related systems. The RIS should focus on the licensing aspects, not the design aspects, of digital upgrades.
Recommended Change: Remove design requirements from the RIS. Inclusion of design attributes is useful and should remain.
Additional Comments: Licensees know how to design - they have very detailed and proceduralized guidance along with a quality assurance program. NRC and licensees have not been aligned on adequate documentation of design considerations. The new RIS is supposed to provide licensees with acceptable methods for documenting qualitative assessments that relay their design thought process in a way that an inspector can understand the pertinent design considerations.

Line 264 | Page 1 of 17 | Priority 1
Industry Comment: Introduction of the term "Dependability Evaluation" is not beneficial. Industry would like to limit the RIS scope to development of a Qualitative Assessment. Suggest deleting this entire paragraph.
Recommended Change: Suggest eliminating the use of Dependability Evaluation throughout the document, including Section 4.5, and providing guidance only on development of a Qualitative Assessment, as NEI 01-01 appears to use these two terms interchangeably.
Additional Comments: To simplify the RIS, consider only providing guidance on development of a qualitative assessment. With the addition of dependability assessments, a licensee will ...

Line 280 | Page 1 of 17 | Priority 1
Industry Comment: The last paragraph of the page, second sentence, states: "Section 4 of this attachment provides acceptable approaches for engineering evaluations that may be used in performing and documenting a qualitative assessment." Industry would ask the NRC to consider removing the engineering evaluation sections of the RIS, as licensees have existing guidance on development of engineering evaluations for digital changes.

Line 470 | Page 5 of 17 | Priority 1
Industry Comment: "implicitly" - The uses of "implicitly modeled" or "implicitly described" in this section and in (b) below are much too vague and cannot be accurately assessed.
Recommended Change: Remove "implicitly described", or provide a regulatory basis for including it.

Line 66 | Page 2 of 5 | Priority 2
Industry Comment: The statement "Additional guidance for addressing potential common cause failure of digital I&C equipment is contained in other NRC guidance documents and NRC-endorsed industry guidance documents." should be deleted. This RIS Supplement is essentially providing guidance on how to address DI&C CCF through qualitative assessments. As such, this sentence is somewhat contradictory.
Recommended Change: Suggest deleting the sentence or providing a reference to the additional NRC guidance documents that address DI&C CCF.

Line 132 | Page 3 of 5 | Priority 2
Industry Comment: This sends an unbalanced message to the public and other stakeholders, implying that digital modifications are adverse to safety.
Recommended Change: Replace the first two sentences starting on line 132 with the following:
"In general, utilization of proven digital technology has the potential for significant improvements in the performance and reliability of equipment used in nuclear plants. Care is appropriate in the implementation of digital technology to avoid unintended consequences that could include potential software failure, including common cause failure, or changes introduced in the transition from analog to digital technology, including the scope and effect of a potential failure due to consolidation or reconfiguration of equipment. The potential for these unintended consequences can be appropriately managed in the design process, as documented in engineering evaluations, and addressed in the assessment of the change required by 10 CFR 50.59 to determine whether NRC approval is required prior to implementation."

Line 354 | Page 3 of 17 | Priority 2
Industry Comment: The following statement is problematic: "For digital modifications, particularly those that introduce software, there may be the potential increase in likelihood of failure, including a single failure. For redundant SSCs, this potential increase in the likelihood of failure creates a similar increase in the likelihood of a common cause failure."
Recommended Change: Suggest deleting this statement as it is irrelevant to the discussion and is not an accurate statement. Please provide the basis for declaring that the potential for SCCF is directly proportional to the potential increase in likelihood of failure, as not all failures are common cause. As an alternative, use the wording suggested for line 132 above.
Additional Comments: In practice, the introduction of digital equipment has proven to decrease the likelihood of failure due to such things as elimination of single points of vulnerability and self-diagnostics.

Line 375 | Page 3 of 17 | Priority 2
Industry Comment: "directly related" - Not all equipment in a system that performs a design function has an equal contribution to the likelihood of malfunction of that design function. It may be directly related, but may not directly increase the likelihood. In other words, the contribution of the potential I&C common cause failure to the overall failure probability of the design function is not a "one to one" relationship.
Recommended Change: This should be clarified here and in other sections of the document that use this discussion.

Line 455 | Page 5 of 17 | Priority 2
Industry Comment: "vulnerability" - There is not an issue if a CCF vulnerability is created that is either bounded by existing analyses or has no safety impact. The key is to assess the vulnerability.
Recommended Change: Suggest eliminating this section. If it is kept, then suggest clarifying that the vulnerability needs to be assessed for its impact on the plant safety analysis.

Line 701 | Priority 2
Industry Comment: Combining design functions - This section uses several similar, but very different, terms, such as "design functions", "component functions" and "design functions of SSCs". This is confusing. In particular, for non-safety related systems there is no requirement to keep SSCs separate, nor a restriction on combining them.
Recommended Change: This section should be simplified to state that if design functions are combined as a result of a digital upgrade, the vulnerabilities to common cause software failure need to be identified, understood, and analyzed for impact on the plant safety analysis. The rationale should be explained in the engineering documentation or qualitative assessment.

Line 775 | Page 12 of 17 | Priority 2
Industry Comment: "risk significant" - The use of the term risk significant needs to be clarified here and in other places in the document. The context appears to imply safety significant. If that is the case, then the document should tie this concept together by equating risk significant to important to safety in technical space. This would better align with the use of design functions in 50.59 space.
Recommended Change: Define the use of risk significant, or remove it.

Line 794 | Page 13 of 17 | Priority 2
Industry Comment: Section 4.6, Engineering Documentation.
Recommended Change: Consider deleting this section, as licensees already have procedural guidance for the development and retention of the various engineering products required to support an engineering design change in accordance with their Appendix B quality programs.

Line 810 | Page 15 of 17 | Priority 2
Industry Comment: Table 2, Step 1, last bullet - the use of "mode" needs to be defined. This could be interpreted to mean that an evaluation of design functions in different plant modes (Mode 1, Mode 2, etc.) is required.
Recommended Change: Clarify the use of mode, or delete it.

Line 137 | Page 3 of 5 | Priority 2
Industry Comment: Concern: The RIS expands the use of qualitative assessment beyond its use in NEI 01-01. In NEI 01-01, the term is primarily used to address software. Hardware can be assessed deterministically (i.e., not qualitatively or with engineering judgment). In the RIS, the qualitative assessment is expanded to include deterministic hardware considerations.
Recommended Change: Minor changes throughout the document can resolve the consistency issue.

Line 137 | Page 3 of 5 | Priority 2
Industry Comment: Concern: The sentence starting with "A qualitative assessment" on line 137 is an abrupt change that introduces new terminology and concepts without appropriate transition. Conforming changes will follow in the body of the RIS.
Recommended Change: Propose that the balance of that paragraph and the following paragraph, lines 137 through 153, be replaced with the following:
"Appropriate considerations in the engineering evaluations can avoid potential unintended consequences of a digital upgrade and provide reasonable assurance that the change results in dependable equipment supporting design functions, including that the software will support the intended design functions. Hardware implemented in the change can be designed to ensure functions are not combined in a manner that expands the impact of a potential failure beyond that evaluated in the plant Final Safety Analysis Report, as updated. Utilization of appropriate software development processes, design attributes, and operating experience can provide assurance that the software will be reliable, with a sufficiently low likelihood of failure.
"Prior to implementing a digital upgrade, licensees assess the change using the criteria in 10 CFR 50.59 to determine whether the change can be implemented without prior NRC approval. Assessment of the hardware and design configuration considerations can be addressed deterministically. Assessment of the software can be addressed qualitatively to conclude that the likelihood of a consequential software failure is sufficiently low, such that the effects of a software failure, including potential common cause failure, need not be further evaluated. Both hardware and software considerations need to be assessed to support a conclusion that prior NRC approval is not required. The attachment to this RIS supplement provides a framework for preparing and documenting engineering evaluations and qualitative assessments of software."

Line 137 | Page 3 of 5 | Priority 3 (Editorial)
Industry Comment: The sentence starting with "A qualitative assessment" on line 137 is the start of a new topic and should start a new paragraph.
Recommended Change: Start a new paragraph.

Line 311 | Page 2 of 17 | Priority 3 (Editorial)
Industry Comment: "Adverse" has a distinct meaning in the 50.59 screening process. The use of adverse in the RIS does not line up with the meaning of adverse in 50.59.
Recommended Change: Replace "adverse" with "negative" in both places it appears in the document.

Line 345 | Page 2 of 17 | Priority 3 (Editorial)
Industry Comment: Suggest deleting the following statement: "This sufficiently low threshold is not interchangeable with that for distinguishing between events that are credible or not credible. The threshold for determining whether an event is credible or not is whether it is as likely as (i.e., not much lower than) malfunctions already assumed in the UFSAR."
Recommended Change: Suggest deleting this statement as it is irrelevant to the discussion and may cause confusion. In addition, the term is not used anywhere else in the document.

Line 371 | Priority 3 (Editorial)
Industry Comment: This example uses a steam generator tube rupture event. This does not support use by I&C engineers.
Recommended Change: Suggest using an I&C example.

Line 527 | Page 6 of 17 | Priority 3 (Editorial)
Industry Comment: Consider striking "need to" in the following statement: "However, design features external to the proposed modification (e.g., mechanical stops on valves) may also need to be considered."

1 UNITED STATES 2 NUCLEAR REGULATORY COMMISSION 3 OFFICE OF NUCLEAR REACTOR 4 REGULATION / OFFICE OF NEW REACTORS 5 WASHINGTON, D.C. 20555-0001 6

7 Month XX, 2018 8

9 DRAFT NRC REGULATORY ISSUE SUMMARY 2002-22, SUPPLEMENT 1 10 CLARIFICATION ON ENDORSEMENT OF NUCLEAR ENERGY INSTITUTE GUIDANCE 11 IN DESIGNING DIGITAL UPGRADES IN INSTRUMENTATION AND CONTROL SYSTEMS 12 13 14 ADDRESSEES 15 16 All holders and applicants for power reactor operating licenses or construction permits under 17 Title 10 of the Code of Federal Regulations (10 CFR) Part 50, "Domestic Licensing of 18 Production and Utilization Facilities."

19 20 All holders of and applicants for a combined license, standard design approval, or 21 manufacturing license under 10 CFR Part 52, Licenses, Certifications, and Approvals for 22 Nuclear Power Plants. All applicants for a standard design certification, including such 23 applicants after initial issuance of a design certification rule.

24 25 All holders of, and applicants for, a construction permit or an operating license for non-power 26 production or utilization facilities under 10 CFR Part 50, including all existing non-power reactors 27 and proposed facilities for the production of medical radioisotopes, such as molybdenum-99, 28 except those that have permanently ceased operations and have returned all of their fuel to the 29 U.S. Department of Energy.

30 31 INTENT 32 33 The U.S. Nuclear Regulatory Commission (NRC) is issuing a supplement to Regulatory Issue 34 Summary (RIS) 2002-22, dated November 25, 2002 (Agencywide Documents Access and 35 Management System (ADAMS) Accession No. ML023160044). In RIS 2002-22, the NRC staff 36 endorsed "Guideline on Licensing Digital Upgrades: EPRI TR-102348, Revision 1, NEI 01-01: A 37 Revision of EPRI TR-102348 to Reflect Changes to the 10 CFR 50.59 Rule" (Nuclear Energy 38 Institute (NEI); hereinafter "NEI 01-01") (ADAMS Accession No. ML020860169). NEI 01-01 39 provides guidance for designing, licensing, and implementing digital upgrades and 40 replacements to instrumentation and control (I&C) systems (hereinafter "digital I&C") in a 41 consistent and comprehensive manner.

42 43 The purpose of this RIS Supplement is to clarify RIS 2002-22, which remains in effect. The 44 NRC continues to endorse NEI 01-01 as stated in RIS 2002-22, as clarified by this RIS 45 Supplement. Specifically, the guidance in this RIS Supplement clarifies the NRC staff's 46 endorsement of the guidance pertaining to Sections 4, 5, and Appendices A and B of NEI 01-01.

47 This RIS Supplement clarifies the guidance for preparing and documenting qualitative 48 assessments that can be used to evaluate the likelihood of failure of a proposed digital 49 modification, including the likelihood of failure due to a common cause, i.e., common cause 50 failure (CCF). Licensees can use these qualitative assessments to support a conclusion that a 51 52 53 54 ML18051A084

Draft RIS 2002-22 Supplement 1 Page 2 of 5 55 56 proposed digital I&C modification has a sufficiently low1 likelihood of failure. This conclusion, 57 and the reasons for it, should be documented, per 10 CFR 50.59(d)(1), as part of the 58 evaluations of proposed digital I&C modifications against some of the criteria in 10 CFR 50.59, 59 Changes, tests and experiments.

60 61 This RIS Supplement is not directed toward digital I&C upgrades and replacements of reactor 62 protection systems and engineered safety features actuation systems, since application of the 63 guidance in this RIS Supplement to such changes would likely involve additional considerations.

64 This RIS Supplement does not provide new design process guidance for addressing common 65 cause failure of the reactor protection systems and engineered safety features actuation 66 systems. Additional guidance for addressing potential common cause failure of digital I&C 67 equipment is contained in other NRC guidance documents and NRC-endorsed industry 68 guidance documents.

69 70 This RIS Supplement requires no action or written response on the part of an addressee.

71 72 BACKGROUND INFORMATION 73 74 By letter dated March 15, 2002, NEI submitted EPRI TR-102348, Revision 1 (NEI 01-01) for 75 NRC staff review. NEI 01-01 replaced the original version of EPRI TR-102348, dated 76 December 1993, which the NRC endorsed in Generic Letter 1995-02, Use of NUMARC/EPRI 77 Report TR-102348, Guideline on Licensing Digital Upgrades, in Determining the Acceptability 78 of Performing Analog-to-Digital Replacements Under 10 CFR 50.59, dated April 26, 1995 79 (ADAMS Accession No. ML031070081). In 2002, the NRC staff issued RIS 2002-22 to notify 80 addressees that the NRC staff had reviewed NEI 01-01 and was endorsing the report for use as 81 guidance in designing and implementing digital upgrades to nuclear power plant instrumentation 82 and control systems.

83 84 Following the NRC staff's 2002 endorsement of NEI 01-01, holders of construction permits and 85 operating licenses have used that guidance in support of digital design modifications in 86 conjunction with Regulatory Guide 1.187, "Guidance for Implementation of 10 CFR 50.59, 87 Changes, Tests, and Experiments," dated November 2000 (ADAMS Accession 88 No. ML003759710), which endorsed NEI 96-07, "Guidelines for 10 CFR 50.59 Implementation," 89 Revision 1, dated November 2000 (ADAMS Accession No. ML003771157).

90 91 NRC inspections of documentation for digital I&C plant modifications prepared by some 92 licensees using the guidance in NEI 01-01 identified inconsistencies in the performance and 93 documentation of licensee engineering evaluations. NRC inspections also identified 94 documentation issues with the written evaluations of the 10 CFR 50.59(c)(2) criteria. The term 95 "engineering evaluation" refers to evaluations performed in designing digital I&C modifications 96 other than the 10 CFR 50.59 evaluation, for example, evaluations performed under the 97 licensee's NRC-approved quality assurance program. This RIS Supplement clarifies the 98 guidance for licensees performing and documenting engineering evaluations and the 99 development of qualitative assessments.

100 101 In response to staff requirements memorandum (SRM)-SECY-16-0070, Integrated Strategy to 102 Modernize the Nuclear Regulatory Commission's Digital Instrumentation and Control Regulatory 103 104 1 NEI 01-01, Page 4-20, defines "sufficiently low" to mean much lower than the likelihood of failures that are 105 considered in the UFSAR (e.g., single failures) and comparable to other common cause failures that are 106 not considered in the UFSAR (e.g., design flaws, maintenance errors, calibration errors).

Draft RIS 2002-22 Supplement 1 Page 3 of 5 107 108 Infrastructure (ADAMS Accession No. ML16299A157), NRC staff has engaged the public, 109 including NEI and industry representatives, to improve the guidance for applying 10 CFR 50.59 110 to digital I&C-related design modifications as part of a broader effort to modernize I&C 111 regulatory infrastructure. Making available the guidance in this RIS Supplement is described as 112 a near-term action in the integrated action plan to provide specific guidance for documenting 113 qualitative assessments concluding that a proposed digital I&C modification will exhibit a 114 sufficiently low likelihood of failure.

115 116 Applicability to Non-Power Reactor Licensees 117 118 The examples and specific discussion in this RIS Supplement and other guidance referenced 119 by this RIS Supplement (i.e., NEI 01-01 and original RIS 2002-22) primarily focus on power 120 reactors. Nonetheless, licensees of non-power production or utilization facilities (NPUFs) may 121 also use the guidance in RIS 2002-22 and apply the guidance in this RIS Supplement to 122 develop written evaluations addressing the criteria in 10 CFR 50.59(c)(2). In particular, NPUF 123 licensees may use the guidance to prepare qualitative assessments that consider design 124 attributes, quality measures, and applicable operating experience to evaluate proposed digital 125 I&C changes to their facilities as described in Sections 4, 5, and Appendix A of NEI 01-01.

126 However, certain aspects of the guidance that discuss the relationship of other regulatory 127 requirements to 10 CFR 50.59 may not be fully applicable to NPUFs (e.g., 10 CFR Part 50, 128 Appendix A and B are not applicable to NPUFs).

129 130

SUMMARY OF ISSUE 131 132 In general, digital I&C modifications may include a potential for an increase in the likelihood of 133 equipment failures occurring within modified SSCs, including common cause failures. In 134 particular, digital I&C modifications that introduce or modify identical software within 135 independent trains, divisions, or channels within a system, and those that introduce new shared 136 resources, hardware, or software among multiple control functions, may include such a 137 potential. A qualitative assessment can be used to support a conclusion that there is not more 138 than a minimal increase in the frequency of occurrence of accidents or in the likelihood of 139 occurrence of malfunctions (10 CFR 50.59(c)(2)(i) and (ii)). A qualitative assessment can also 140 be used to support a conclusion that the proposed modification does not create the possibility of 141 an accident of a different type or malfunction with a different result than previously evaluated in 142 the UFSAR (10 CFR 50.59(c)(2)(v) and (vi)).

143 144 For digital I&C modifications, an adequate basis for a determination that a change involves a 145 sufficiently low likelihood of failure may be derived from a qualitative assessment of factors 146 involving system design features, the quality of the design processes employed, and an 147 evaluation of relevant operating experience of the software and hardware used (i.e., product 148 maturity and in-service experience). A licensee may use a qualitative assessment to document 149 the factors and rationale for concluding that there is an adequate basis for determining that a 150 digital I&C modification will exhibit a sufficiently low likelihood of failure. In doing so, a licensee 151 may consider the aggregate of these factors. The attachment to this RIS Supplement provides 152 a framework for preparing and documenting qualitative assessments and engineering 153 evaluations.

154 155 In addition, this RIS Supplement clarifies the applicability of some aspects of the NRC policy 156 described in Item II.Q of SRM/SECY 93-087, Policy, Technical, and Licensing Issues 157 Pertaining to Evolutionary and Advanced Light Water Reactor (ALWR) Designs, (ADAMS

Draft RIS 2002-22 Supplement 1 Page 4 of 5 158 159 No. ML003708056), in regard to the application of 10 CFR 50.59(c)(2) criteria for digital I&C 160 modifications.

161 162 BACKFITTING AND ISSUE FINALITY DISCUSSION 163 164 This RIS Supplement clarifies but does not supersede RIS 2002-22, and includes additional 165 guidance regarding how to perform and document qualitative assessments for digital I&C 166 changes under 10 CFR 50.59.

167 168 The NRC does not intend or approve any imposition of the guidance in this RIS Supplement, 169 and this RIS Supplement does not contain new or changed requirements or staff positions that 170 constitute either backfitting under the definition of backfitting in 10 CFR 50.109(a)(1) or a 171 violation of issue finality under any of the issue finality provisions in 10 CFR Part 52. Therefore, 172 this RIS Supplement does not represent backfitting as defined in 10 CFR 50.109(a)(1), nor is it 173 otherwise inconsistent with any issue finality provision in 10 CFR Part 52. Consequently, the 174 NRC staff did not perform a backfit analysis for this RIS Supplement or further address the issue 175 finality criteria in 10 CFR Part 52.

176 177 FEDERAL REGISTER NOTIFICATION 178 179 The NRC will publish a notice of opportunity for public comment on this draft RIS in the Federal 180 Register.

181 182 CONGRESSIONAL REVIEW ACT 183 184 This RIS is a rule as defined in the Congressional Review Act (5 U.S.C. §§ 801-808). However, 185 the Office of Management and Budget has not found it to be a major rule as defined in the 186 Congressional Review Act.

187 188 PAPERWORK REDUCTION ACT STATEMENT 189 190 This RIS provides guidance for implementing mandatory information collections covered by 191 10 CFR Part 50 that are subject to the Paperwork Reduction Act of 1995 (44 U.S.C. 3501 et.

192 seq.). This information collection was approved by the Office of Management and Budget 193 (OMB) under control number 3150-0011. Send comments regarding this information collection 194 to the Information Services Branch, U.S. Nuclear Regulatory Commission, Washington, DC 195 20555-0001, or by e-mail to Infocollects.Resource@nrc.gov, and to the Desk Officer, Office of 196 Information and Regulatory Affairs, NEOB-10202, (3150-0011) Office of Management and 197 Budget, Washington, DC 20503.

198 199 Public Protection Notification 200 201 The NRC may not conduct or sponsor, and a person is not required to respond to, a request 202 for information or an information collection requirement unless the requesting document 203 displays a currently valid OMB control number.

Draft RIS 2002-22 Supplement 1 Page 5 of 5 204 205 CONTACT 206 207 Please direct any questions about this matter to the technical contact(s) or the Lead Project 208 Manager listed below.

Timothy J. McGinty, Director
Division of Construction Inspection and Operation Programs
Office of New Reactors

Christopher G. Miller, Director
Division of Inspection and Regional Support
Office of Nuclear Reactor Regulation

Technical Contacts:
David Rahn, NRR, 301-415-1315, e-mail: David.Rahn@nrc.gov
Wendell Morton, NRR, 301-415-1658, e-mail: Wendell.Morton@nrc.gov
Norbert Carte, NRR, 301-415-5890, e-mail: Norbert.Carte@nrc.gov
David Beaulieu, NRR, 301-415-3243, e-mail: David.Beaulieu@nrc.gov
Duane Hardesty, NRR, 301-415-3724, e-mail: Duane.Hardesty@nrc.gov (specifically for non-power reactors)

Project Manager Contact:
Tekia Govan, NRR, 301-415-6197, e-mail: Tekia.Govan@nrc.gov

Note: NRC generic communications may be found on the NRC public Web site, http://www.nrc.gov, under NRC Library/Document Collections.

238 239 240

Attachment:

Qualitative Assessment and Engineering Evaluation Framework

241 Qualitative Assessment and Engineering Evaluation Framework 242 243 244 1. Purpose 245 246 Regulatory Issue Summary (RIS) 2002-22 provided the U.S. Nuclear Regulatory Commission 247 (NRC) staff's endorsement of Nuclear Energy Institute (NEI) guidance document NEI 01-01, 248 "Guideline on Licensing Digital Upgrades: EPRI TR-102348, Revision 1, NEI 01-01: A Revision 249 of EPRI TR-102348 To Reflect Changes to the 10 CFR 50.59 Rule." NEI 01-01 provides 250 guidance for implementing and licensing digital upgrades, in a consistent, comprehensive, and 251 predictable manner, as well as guidance in performing qualitative assessments of the 252 dependability of digital instrumentation and control (I&C) systems.

253 254 The purpose of this attachment is to provide supplemental clarifying guidance to licensees to 255 ensure that, if qualitative assessments are used, they are described and documented 256 consistently, through an evaluation of applicable qualitative evidence. Following the guidance in 257 RIS 2002-22 and NEI 01-01, as clarified by the guidance in this RIS Supplement, will help 258 licensees document qualitative assessments in sufficient detail that "an independent third 259 party can verify the judgements," as stated in NEI 01-01. While this qualitative assessment is 260 used to support the Title 10 of the Code of Federal Regulations (10 CFR) 50.59, "Changes, tests 261 and experiments," evaluation, it does not provide guidance for screening and it does not 262 presume that all digital modifications screen in.

263 264 NEI 01-01 uses the terms "qualitative assessment" and "dependability evaluation" 265 interchangeably. Within this document, only the terms "qualitative assessment" and "sufficiently 266 low"2 are used in conjunction with performance of 10 CFR 50.59 evaluations. The term 267 "dependability evaluation" is used in the context of engineering evaluations, which are not 268 performed or documented as part of a 10 CFR 50.59 evaluation, but engineering evaluations 269 are performed in accordance with the licensee's NRC quality assurance program in developing 270 digital I&C modifications.

271 272 If a qualitative assessment determines that a potential failure (e.g., software common cause 273 failure (CCF)) has a sufficiently low likelihood, then the effects of the failure do not need to be 274 considered in the 10 CFR 50.59 evaluation. Thus, the qualitative assessment provides a 275 means of addressing software CCF. In some cases, the effects of a software CCF may not 276 create a different result than any previously evaluated in the updated final safety analysis report 277 (UFSAR).

278 279 Sections 2 and 3 of this attachment provide acceptable approaches for describing the scope, 280 form, and content of the type of a qualitative assessment described above. Section 4 of this 281 attachment provides acceptable approaches for engineering evaluations that may be used in 282 performing and documenting a qualitative assessment.

283 284 285 286 287 288 289 290 2 NEI 01-01, Page 4-20, defines sufficiently low to mean much lower than the likelihood of failures that are 291 considered in the UFSAR (e.g., single failures) and comparable to other common cause failures that are 292 not considered in the UFSAR (e.g., design flaws, maintenance errors, calibration errors).

293 294 295 Attachment

Draft RIS 2002-22 Supplement 1, Attachment Page 2 of 17 296 297 2. Regulatory Clarification - Application of Qualitative Assessments to Title 10 of the 298 Code of Federal Regulations, Section 50.59 299 300 When a licensee decides to undertake an activity that changes its facility as described in the 301 updated final safety evaluation report, the licensee first performs the engineering and technical 302 evaluations in accordance with plant procedures. If the licensee determines that an activity is 303 acceptable through appropriate engineering and technical evaluations, the licensee enters the 304 10 CFR 50.59 process. The regulations in 10 CFR 50.59 provide a threshold for regulatory 305 review, not a determination of safety, for the proposed activities. In addition, 10 CFR 50.59 306 establishes the conditions under which licensees may make changes to the facility or 307 procedures and conduct tests or experiments without prior NRC approval.

308 309 Evaluations must address all elements of proposed changes. Some elements of a change may 310 have positive effects on SSC failure likelihood while other elements of a change may have 311 adverse effects. As derived from the guidance in NEI 96-07, positive and negative elements 312 can be considered together if they are interdependent. This means that if elements are not 313 interdependent, they must be evaluated separately.

314 315 2.1 Likelihood 316 317 Properly documented qualitative assessments may be used to support a conclusion that a 318 proposed digital I&C modification has a sufficiently low likelihood of failure, consistent with the 319 UFSAR analysis assumptions. This conclusion is used in the 10 CFR 50.59 written evaluation 320 to determine whether prior NRC approval is required.

321 322 Qualitative Assessment 323 324 The determination that a digital I&C modification will exhibit a sufficiently low likelihood of failure 325 can be derived from a qualitative assessment of factors involving system design attributes, the 326 quality of the design processes employed, and the operating experience with the software and 327 hardware used (i.e., product maturity and in-service experience). Documenting the qualitative 328 assessment includes describing the factors, rationale, and reasoning (including engineering 329 judgement) for determining that the digital I&C modification exhibits a sufficiently low likelihood 330 of failure.

331 332 The determination of likelihood of failure may consider the aggregate of all the factors described 333 above. Some of these factors may compensate for weaknesses in other areas. For example, 334 for a digital device that is simple and highly testable, thorough testing may provide additional 335 assurance of a sufficiently low likelihood of failure that helps compensate for a lack of operating 336 experience.

337 338 Qualitative Assessment Outcome 339 340 There are two possible outcomes of the qualitative assessment: (1) failure likelihood is 341 sufficiently low, and (2) failure likelihood is not sufficiently low. Guidance in NEI 01-01, 342 Section 4.3.6, states, sufficiently low means much lower than the likelihood of failures that are 343 considered in the UFSAR (e.g., single failures) and comparable to other common cause failures 344 that are not considered in the UFSAR (e.g., design flaws, maintenance error, calibration errors).

345 This sufficiently low threshold is not interchangeable with that for distinguishing between 346 events that are credible or not credible. The threshold for determining whether an event is

Draft RIS 2002-22 Supplement 1, Attachment Page 3 of 17 347 348 credible or not is whether it is as likely as (i.e., not much lower than ) malfunctions already 349 assumed in the UFSAR.

350 351 Likelihood Thresholds for 10 CFR 50.59(c)(2)(i), (ii), (v), and (vi) 352 353 A key element of 10 CFR 50.59 evaluations is demonstrating whether the modification 354 considered will exhibit a sufficiently low likelihood of failure. For digital modifications, 355 particularly those that introduce software, there may be a potential increase in likelihood of 356 failure. For redundant SSCs, this potential increase in the likelihood of failure creates a similar 357 increase in the likelihood of a common cause failure.

358 359 The sufficiently low threshold discussions have been developed using criteria from NEI 96-07, 360 Revision 1, and NEI 01-01. They are intended to clarify the existing 10 CFR 50.59 guidance 361 and should not be interpreted as a new or modified NRC position.

362 363 Criteria 364 365 Although it may be required by other criteria, prior NRC approval is not required by 10 CFR 366 50.59(c)(2)(i), (ii), (v), and (vi) if there is a qualitative assessment outcome of sufficiently low, as 367 described below:

368 369 10 CFR 50.59(c)(2)(i) 370 371 Does the activity result in more than a minimal increase in the frequency of occurrence of an 372 accident previously evaluated in the UFSAR?

373 374 Sufficiently low threshold The frequency of occurrence of an accident is directly 375 related to the likelihood of failure of equipment that initiates the accident (e.g., an 376 increase in the likelihood of a steam generator tube failure has a corresponding increase 377 in the frequency of a steam generator tube rupture accident). Thus, an increase in 378 likelihood of failure of the modified equipment results in an increase in the frequency of 379 the accident. Therefore, if the qualitative assessment outcome is sufficiently low, then 380 there is a no more than a minimal increase in the frequency of occurrence of an accident 381 previously evaluated in the UFSAR.

382 383 10 CFR 50.59(c)(2)(ii) 384 385 Does the activity result in more than a minimal increase in the likelihood of occurrence of a 386 malfunction of a structure, system, or component (SSC) important to safety3 previously 387 evaluated in the UFSAR?

388 389 Sufficiently low threshold - The likelihood of occurrence of a malfunction of an SSC 390 important to safety is directly related to the likelihood of failure of equipment that causes 391 a failure of SSCs to perform their intended design functions4 (e.g., an increase in the 392 393 3 NEI 96-07, Revision 1, Section 3.9, states, "Malfunction of SSCs important to safety means the failure of SSCs to 394 perform their intended design functions described in the UFSAR (whether or not classified as safety-related in 395 accordance with 10 CFR [Part] 50, Appendix B)."

396 4 The term design functions, as used in this RIS Supplement, conforms to the definition of design functions in NEI 397 96-07, Revision 1.

Draft RIS 2002-22 Supplement 1, Attachment Page 4 of 17 398 399 likelihood of failure of an auxiliary feedwater (AFW) pump has a corresponding increase 400 in the likelihood of occurrence of a malfunction of SSCs the AFW pump and AFW 401 system). Thus, the likelihood of failure of modified equipment that causes the failure of 402 SSCs to perform their intended design functions is directly related to the likelihood of 403 occurrence of a malfunction of an SSC important to safety. Therefore, if the qualitative 404 assessment outcome is sufficiently low, then the activity does not result in more than a 405 minimal increase in the likelihood of occurrence of a malfunction of an SSC important to 406 safety previously evaluated in the UFSAR.

10 CFR 50.59(c)(2)(v)

Does the activity create a possibility for an accident of a different type than any previously evaluated in the UFSAR?

Sufficiently low threshold: NEI 96-07, Revision 1, Section 4.3.5, states, "Accidents of a different type are limited to those that are as likely to happen as those previously evaluated in the UFSAR." Accidents of a different type are caused by failures of equipment that initiate an accident of a different type. If the outcome of the qualitative assessment of the proposed change is that the likelihood of failure associated with the proposed activity is sufficiently low, then there are no failures introduced by the activity that are as likely to happen as those in the UFSAR that can initiate an accident of a different type. Therefore, the activity does not create a possibility for an accident of a different type than any previously evaluated in the UFSAR. If the qualitative assessment determines that a potential failure (e.g., software CCF) does not have a sufficiently low likelihood, then the effects of this failure need to be considered in the 10 CFR 50.59 evaluation.

10 CFR 50.59(c)(2)(vi)

Does the activity create a possibility for a malfunction of an SSC important to safety with a different result than any previously evaluated in the UFSAR?

Sufficiently low threshold: NEI 96-07, Section 4.3.6, states that "malfunctions with a different result are limited to those that are as likely to happen as those in the UFSAR." A malfunction of an SSC important to safety is an equipment failure that causes the failure of SSCs to perform their intended design functions. If the outcome of the qualitative assessment of the proposed change is that the likelihood of failure associated with the proposed activity is sufficiently low, then there are no failures introduced by the activity that are as likely to happen as those in the UFSAR. Therefore, the activity does not create a possibility for a malfunction of an SSC important to safety with a different result than any previously evaluated in the UFSAR. If the qualitative assessment determines that a potential failure (e.g., software CCF) does not have a sufficiently low likelihood, then the effects of this failure need to be considered in the 10 CFR 50.59 evaluation using methods consistent with the plant's UFSAR.

3. Qualitative Assessments

The NRC staff has determined that proposed digital I&C modifications having the characteristics listed below are likely to result in qualitative assessment outcomes that support a "sufficiently low" likelihood determination:

1. Digital I&C modifications that:

a) Do not create a CCF vulnerability due to the integration of subsystems or components from different systems that combine design functions that were not previously combined within the same system, subsystem, or component being replaced.

Note: Integration, as used in this RIS supplement, refers to the process of combining software components, hardware components, or both into an overall system, or the merger of the design function of two or more systems or components into a functioning, unified system or component. Integration also refers to the coupling of design functions (software/hardware) via bi-directional digital communications. Modifications can result in design functions of different systems being integrated or combined either directly in the same digital device or indirectly via shared resources, such as bi-directional digital communications or networks, common controllers, power supplies, or visual display units. Such integration could be problematic because the safety analysis may have explicitly or implicitly modeled the equipment performing the design functions that would be integrated on the basis that it is not subject to any potential source of common cause failure.

b) Do not create a CCF vulnerability due to new shared resources (such as power supplies, controllers, and human-machine interfaces) with other design functions that are (i) explicitly or implicitly described in the UFSAR as functioning independently from other plant design functions, or (ii) modeled in the current design basis to be functioning independently from other plant design functions.

c) Do not affect reactor trip or engineered safety feature initiation/control logic or emergency power bus load sequencers.

2. Digital I&C modifications that maintain the level of diversity, separation, and independence of design functions described in the UFSAR. A change that reduces redundancy, diversity, separation, or independence of UFSAR-described design functions is considered a more than minimal increase in the likelihood of malfunction.

3. Digital I&C modifications that are sufficiently simple (as demonstrated through 100 percent testing or a combination of testing and input/output state analysis) or that demonstrate adequate internal diversity. A simplified illustration of the 100 percent testing approach follows this list.
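
The following minimal sketch (not part of the RIS or the NEI guidance) illustrates what the "100 percent testing" option in item 3 can look like for a trivially simple function: a hypothetical two-out-of-three trip-voting routine whose entire discrete input space is exercised and checked against an independent statement of the requirement. The function name and voting logic are illustrative assumptions only.

    from itertools import product

    def two_out_of_three_trip(channel_a: bool, channel_b: bool, channel_c: bool) -> bool:
        # Hypothetical voting logic: trip when at least two channel bistables demand a trip.
        return (channel_a + channel_b + channel_c) >= 2

    def exhaustive_test() -> None:
        # The input space is 2**3 = 8 combinations, so every input/output state can be tested.
        for a, b, c in product([False, True], repeat=3):
            expected = sum((a, b, c)) >= 2   # independent restatement of the requirement
            assert two_out_of_three_trip(a, b, c) == expected, f"Mismatch for inputs {(a, b, c)}"
        print("All 8 input combinations verified against the requirement.")

    if __name__ == "__main__":
        exhaustive_test()

For logic with a larger, but still enumerable, input space, the same pattern extends to the combination of testing and input/output state analysis described above.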

3.1 Qualitative Assessment Categories

Consistent with the guidance provided in NEI 01-01, this attachment specifies three general categories of characteristics: design attributes, quality of the design process, and operating experience. Qualitatively assessing and then documenting these characteristics separately, by category, and in the aggregate provides a common framework that will better enable licensees to document qualitative assessments in sufficient detail that an independent third party can verify the judgments.

Table 1 provides acceptable examples of design attributes, quality of the design processes, and documentation of operating experience. This listing is not all-inclusive, nor does the qualitative assessment need to address each specific item.

3.1.1 Design Attributes

NEI 01-01, Section 5.3.1, states:

"To determine whether a digital system is sufficiently dependable, and therefore that the likelihood of failure is sufficiently low, there are some important characteristics that should be evaluated. These characteristics, discussed in more detail in the following sections include: Hardware and software design features that contribute to high dependability (See Section 5.3.4). Such [hardware and software design] features include built-in fault detection and failure management schemes, internal redundancy and diagnostics, and use of software and hardware architectures designed to minimize failure consequences and facilitate problem diagnosis."

Consistent with the above-quoted text, design attributes of a proposed modification can prevent or limit failures from occurring. A qualitative assessment describes and documents hardware and software design features that contribute to high dependability. Design attributes focus primarily on built-in features such as fault detection and failure management schemes, internal redundancy and diagnostics, and the use of software and hardware architectures designed to minimize failure consequences and facilitate problem diagnosis. However, design features external to the proposed modification (e.g., mechanical stops on valves) may also need to be considered.

Many system design attributes, procedures, and practices can contribute to significantly reducing the likelihood of failure (e.g., CCF). A licensee can account for this by deterministically assessing the specific vulnerabilities through postulated failure modes (e.g., software CCF) within a proposed modification and applying specific design attributes to address those vulnerabilities (see Table 1). An adequate qualitative assessment regarding the likelihood of failure of a proposed modification would consist of a description of: (a) the potential failures introduced by the proposed modification, (b) the design attributes used to resolve identified potential failures, and (c) how the chosen design attributes and features resolve identified potential failures.
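
As a minimal sketch only (not a documentation requirement of the RIS), the (a)/(b)/(c) structure described above can be captured in a simple record that pairs each postulated failure with the design attributes credited against it and the basis for resolution. All component names and entries below are hypothetical placeholders.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PostulatedFailure:
        description: str              # (a) potential failure introduced by the modification
        design_attributes: List[str]  # (b) design attributes credited against the failure
        resolution_basis: str         # (c) how the chosen attributes resolve the failure

    @dataclass
    class QualitativeAssessmentRecord:
        modification: str
        failures: List[PostulatedFailure] = field(default_factory=list)

        def unresolved(self) -> List[PostulatedFailure]:
            # Flag entries that still lack a credited attribute or a resolution basis.
            return [f for f in self.failures if not f.design_attributes or not f.resolution_basis]

    record = QualitativeAssessmentRecord(
        modification="Hypothetical digital feedwater controller replacement",
        failures=[
            PostulatedFailure(
                description="Software CCF of redundant controller channels",
                design_attributes=["Independent hardware watchdog", "Channel segmentation"],
                resolution_basis="Watchdog forces outputs to the analyzed safe state; "
                                 "segmentation removes the shared trigger path.",
            )
        ],
    )
    assert not record.unresolved()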

Diversity is one example of a design attribute that can be used to demonstrate that an SSC modified with digital technology is protected from a loss of design function due to a potential common cause failure. In some cases, a plant's design basis may specify diversity as part of the design. In all other cases, licensees need not consider the use of diversity (e.g., as described in the staff requirements memorandum on SECY-93-087) in evaluating a proposed modification. However, diversity within the proposed design and any affected SSCs is a powerful means for significantly reducing the occurrence of failures affecting the accomplishment of design functions.

3.1.2 Quality of the Design Process

Section 5.3.3 of NEI 01-01 states:

"For digital equipment incorporating software, it is well recognized that prerequisites for quality and dependability are experienced software engineering professionals combined with well-defined processes for project management, software design, development, implementation, verification, validation, software safety analysis, change control, and configuration control."

Consistent with the guidance provided in NEI 01-01, "quality design processes" means those processes employed in the development of the proposed modification. Such processes include software development, hardware and software integration processes, hardware design, and validation and testing processes that have been incorporated into the development process. For safety-related equipment, this development process would be documented and available for referencing in the qualitative assessment for proposed modifications. However, for commercial-grade-dedicated or non-safety-related equipment, documentation of the development process may not be readily available. In such cases, the qualitative assessment may place greater emphasis on the design attributes included and the extent of successful operating experience for the equipment proposed.

Quality of the design process is a key element in determining the dependability of proposed modifications. Employing design processes consistent with the licensee's NRC-approved quality assurance program results in a quality design process.

When possible, the use of applicable industry consensus standards contributes to a quality design process and provides a previously established acceptable approach (e.g., Institute of Electrical and Electronics Engineers (IEEE) Standard 1074-2006, "IEEE Standard for Developing a Software Project Life Cycle Process," endorsed in Regulatory Guide 1.173, "Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of Nuclear Power Plants"). In some cases, other nuclear or non-nuclear standards also provide technically justifiable approaches that can be used if confirmed applicable for the specific application.

Quality standards should not be confused with quality assurance programs or procedures. Quality standards are those standards that describe the benchmarks that are specified to be achieved in a design. Quality standards should be documents that are established by consensus and approved by an accredited standards development organization. For example, IEEE publishes consensus-based quality standards relevant to digital I&C modifications and is a recognized standards development organization. Quality standards used to ensure the proposed change has been developed using a quality design process do not need to be solely those endorsed by the NRC staff. The qualitative assessment document should demonstrate that the standard being applied is valid for the circumstances for which it is being used.

3.1.3 Operating Experience

Section 5.3.1 of NEI 01-01 states, "Substantial applicable operating history reduces uncertainty in demonstrating adequate dependability."

Consistent with the above-quoted text, relevant operating experience can be used to help demonstrate that software and hardware employed in a proposed modification have adequate dependability. The licensee may document information showing that the proposed system or component modification employs equipment with significant operating experience in nuclear power plant applications, or in non-nuclear applications with comparable performance standards and operating environment. The licensee may also consider whether the suppliers of such equipment incorporate quality processes such as continual process improvement, incorporation of lessons learned, etc., and document how that information demonstrates adequate equipment dependability.

Operating experience relevant to a proposed digital I&C change may be credited as part of an adequate basis for a determination that the proposed change does not result in more than a minimal increase in the frequency of occurrence of initiating events that can lead to accidents or in more than a minimal increase in the likelihood of occurrence of a malfunction of an SSC important to safety previously evaluated in the UFSAR. Differences may exist in the specific digital I&C application between the proposed digital I&C modification and that of the equipment and software whose operating experience is being credited. In all cases, however, the architecture of the referenced equipment and software should be substantially similar to that of the system being proposed.

Further, the design conditions and modes of operation of the equipment whose operating experience is being referenced also need to be substantially similar to those being proposed as a digital I&C modification. For example, one needs to understand what operating conditions (e.g., ambient environment, continuous duty, etc.) were experienced by the referenced design. In addition, it is important to recognize that when crediting operating experience from other facilities, one needs to understand what design features were present in the design whose operating experience is being credited. Design features that serve to prevent or limit possible common cause failures in a design referenced as relevant operating experience should be noted and considered for inclusion in the proposed design. Doing so would provide additional support for a determination that the dependability of the proposed design will be similar to the referenced application.

Table 1 - Qualitative Assessment Category Examples

Category: Design Attributes
Examples for this category:
  • Design criteria: Diversity (if applicable), Independence, and Redundancy.
  • Inherent design features for software, hardware, or architecture/network: Watchdog timers that operate independent of software, isolation devices, segmentation of distributed networks, self-testing, and self-diagnostic features.
  • Basis for identifying that possible triggers are non-concurrent.
  • Sufficiently simple (i.e., enabling 100 percent testing or comprehensive testing in combination with analysis of likelihood of occurrence of input/output states not tested).
  • Failure state always known to be safe, or at least the same state as allowed by the previously installed equipment safety analysis.

Category: Quality of the Design Process
Examples for this category:
  • Justification for use of industry consensus standards for codes and standards not endorsed by the NRC.
  • Justification for use of other standards.
  • Use of Appendix B vendors. If not an Appendix B vendor, the analysis can state which generally accepted industrial quality program was applied.
  • Demonstrated capability (e.g., through qualification testing) to withstand environmental conditions within which the SSC is credited to perform its design function (e.g., EMI/RFI, seismic).
  • Development process rigor (adherence to generally accepted commercial or nuclear standards).
  • Demonstrated dependability of custom software code for application software through extensive evaluation or testing.

Category: Operating Experience
Examples for this category:
  • Wide range of operating experience in similar applications, operating environments, duty cycles, loading, comparable configurations, etc., to that of the proposed modification.
  • History of lessons learned from field experience addressed in the design.
  • Relevant operating experience: Architecture of the referenced equipment and software (operating system and application), along with the design conditions and modes of operation of the equipment, should be substantially similar to those of the system being proposed as a digital I&C modification.
  • High volume production usage in different applications: Note that for software, the concern is centered on lower volume, custom, or user-configurable software applications. High volume, high quality commercial products with relevant operating experience used in other applications have the potential to avoid design errors.
  • Experience working with software development tools used to create configuration files.
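
To make two of the Table 1 design attribute examples concrete, the minimal simulation below sketches a watchdog that acts independently of the application software and a failure state that is forced to a known safe state on timeout. In an actual modification the watchdog would be an independent hardware timer; here it is emulated in software purely so the behavior can be exercised, and the signal name, timeout, and safe state are hypothetical.

    import time

    SAFE_STATE = {"valve_demand": 0.0}   # hypothetical analyzed safe state

    class SimulatedWatchdog:
        def __init__(self, timeout_s: float):
            self.timeout_s = timeout_s
            self.last_pet = time.monotonic()

        def pet(self) -> None:
            self.last_pet = time.monotonic()

        def expired(self) -> bool:
            return (time.monotonic() - self.last_pet) > self.timeout_s

    def control_cycle(watchdog: SimulatedWatchdog, outputs: dict, stall: bool = False) -> dict:
        # One pass of a hypothetical control loop; a stalled pass never services the watchdog.
        if not stall:
            outputs["valve_demand"] = 42.0   # placeholder calculation
            watchdog.pet()
        if watchdog.expired():
            outputs.update(SAFE_STATE)       # drive outputs to the known safe state
        return outputs

    wd = SimulatedWatchdog(timeout_s=0.05)
    out = control_cycle(wd, {"valve_demand": 10.0})
    time.sleep(0.1)                          # simulate the application hanging
    out = control_cycle(wd, out, stall=True)
    assert out == SAFE_STATE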

3.2 Qualitative Assessment Documentation

Guidance endorsed by the U.S. Nuclear Regulatory Commission for documenting 10 CFR 50.59 evaluations to meet the requirements of 10 CFR 50.59(d) is provided in both NEI 96-07, Revision 1, Section 5.0, "Documentation and Reporting," and NEI 01-01, Appendix B. Both of these documents reiterate the principle that documentation should include an explanation providing an adequate basis for the conclusion so that a knowledgeable reviewer could draw the same conclusion.

Considerations and conclusions reached while performing qualitative assessments supporting the evaluation criteria of 10 CFR 50.59 are subject to the aforementioned principles. In order for a knowledgeable reviewer to draw the same conclusion regarding qualitative assessments, details of the considerations made, and their separate and aggregate effect on any qualitative assessments, need to be included or clearly referenced in the 10 CFR 50.59 evaluation documentation. References to other documents should include the document name and the location of the information within any referenced document.

If qualitative assessment categories are used, each category would be discussed in the documentation, including positive and negative aspects considered, consistent with the examples provided in Table 1. In addition, a discussion of the degree to which each of the categories was relied on to reach the qualitative assessment conclusion would be documented.

4. Engineering Evaluations

4.1 Overview

This section describes approaches that could be used for conducting and documenting engineering evaluations. The term "engineering evaluation" refers to evaluations performed in designing digital I&C modifications. These evaluations are performed under the licensee's NRC-approved quality assurance program. These engineering evaluations may include, but are not limited to, discussion of compliance with regulatory requirements and conformity to the UFSAR, regulatory guidance, and design standards.

In addition, these engineering evaluations may include discussions of: a) the performance of deterministic failure analyses, including analysis of the effects of digital I&C failures at the component level, system level, and plant level; b) the evaluation of defense-in-depth; and c) the evaluation of the proposed modification for its overall dependability. The qualitative assessment framework discussed in the previous sections of this attachment may rely, in part, on the technical bases and conclusions documented within these engineering evaluations. Thus, improved performance and documentation of engineering evaluations can enable better qualitative assessments.

One result of performing these evaluations is to provide insights as to whether a proposed digital I&C design modification may need to be enhanced with the inclusion of different or additional design attributes. Such different or additional design attributes would serve to prevent the occurrence of a possible CCF or reduce the potential for a software CCF to cause a loss of design function.

These approaches are provided for consideration only. They do not represent NRC requirements and may be used at the discretion of licensees.

4.2 Selected Design Considerations

During the design process, it is important to weigh the positive effects of installing the digital equipment (e.g., elimination of single-point vulnerabilities (SPVs), ability to perform signal validation, diagnostic capabilities) against the potential negative effects (e.g., software CCF).

Digital I&C modifications can reduce SSC independence. A reduction in the independence of design functions from that described in the UFSAR would require prior NRC approval.

4.2.1 Digital Communications

Careful consideration of digital communications is needed to preclude adverse effects on SSC independence. DI&C-ISG-04, Revision 1, "Highly-Integrated Control Rooms - Communications Issues" (Agencywide Documents Access and Management System Accession Number ML083310185), provides guidance for NRC staff reviewing digital communications. This ISG describes considerations for the design of communications between redundant SSCs, echelons of defense-in-depth [5], or SSCs with different safety classifications. The principles of this ISG, or other technically justifiable considerations, may be used to assess non-safety-related SSCs.

[5] As stated in NEI 01-01, Section 5.2, "A fundamental concept in the regulatory requirements and expectations for instrumentation and control systems in nuclear power plants is the use of four echelons of defense-in-depth: 1) Control Systems; 2) Reactor Trip System (RTS) and Anticipated Transient without SCRAM (ATWS); 3) Engineered Safety Features Actuation System (ESFAS); and 4) Monitoring and indications."

4.2.2 Combining Design Functions

Combining design functions of different safety-related or non-safety-related SSCs in a manner not previously evaluated or described in the UFSAR could introduce new interdependencies and interactions that make it more difficult to account for new potential failure modes. Failures of combined design functions that (1) can affect malfunctions of SSCs or accidents evaluated in the UFSAR, or (2) involve different defense-in-depth echelons, are of significant concern.

Combining previously separate component functions can result in more dependable system performance due to the tightly coupled nature of the components and a reduction in complexity. If a licensee proposes to combine previously separate design functions in a safety-related and/or non-safety-related digital I&C modification, possible new failures need to be carefully weighed against the benefits of combining the previously separate functions. Failure analyses and control system segmentation analyses can help identify potential issues. Segmentation analyses are particularly helpful for the evaluation of the design of non-safety-related distributed networks.

4.3 Failure Analyses

Failure analysis can be used to identify possible CCFs in order to assess the need to further modify the design. In some cases, potential failures may be excluded from consideration if the failure has been determined to be implausible as a result of factors such as design features/attributes and procedures. Modifications that employ design attributes and features, such as internal diversity, help to minimize the potential for CCFs. Sources of CCF could include the introduction of identical software into redundant channels, the use of shared resources, or the use of common hardware and software among systems performing different design functions. Therefore, it is essential that such sources of CCF be identified, to the extent practicable, and addressed during the design stage as one acceptable method to support the technical basis for the proposed modification.
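
The following sketch is illustrative only and is not an NRC- or NEI-endorsed method; it shows one way an engineer might screen a hypothetical equipment list for the kinds of CCF sources named above (identical software in redundant channels and shared resources). All tag names, channels, and software image identifiers are invented for the example.

    from collections import defaultdict
    from typing import Dict, List, Tuple

    # component -> (channel, software image, shared resources)
    equipment: Dict[str, Tuple[str, str, List[str]]] = {
        "AFW-CTRL-A": ("A", "afw_app_v2.1", ["HMI-1"]),
        "AFW-CTRL-B": ("B", "afw_app_v2.1", ["HMI-1"]),        # same image and HMI as channel A
        "AFW-CTRL-C": ("C", "afw_app_diverse_v1.0", ["HMI-2"]),
    }

    def potential_ccf_sources(inventory: Dict[str, Tuple[str, str, List[str]]]) -> List[str]:
        findings: List[str] = []
        by_software = defaultdict(list)
        by_resource = defaultdict(list)
        for name, (channel, software, resources) in inventory.items():
            by_software[software].append((name, channel))
            for resource in resources:
                by_resource[resource].append((name, channel))
        for software, users in by_software.items():
            if len({ch for _, ch in users}) > 1:
                findings.append(f"Identical software '{software}' in multiple channels: {users}")
        for resource, users in by_resource.items():
            if len({ch for _, ch in users}) > 1:
                findings.append(f"Shared resource '{resource}' across channels: {users}")
        return findings

    for finding in potential_ccf_sources(equipment):
        print(finding)   # each finding is a candidate CCF source to disposition in the design

Each flagged item would then be examined for common triggers and dispositioned through design attributes or further analysis, as discussed in this section.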

Digital designs having sources of CCF that could affect more than one SSC need to be closely reviewed to ensure that an accident of a different type from those previously evaluated in the UFSAR has not been created. This is particularly the case when such common sources of CCF are also subject to common triggers. For example, the interfaces of the modified SSCs with other SSCs using identical hardware and software, power supplies, or human-machine interfaces need to be closely reviewed to ensure that possible common triggers have been addressed.

A software CCF may be assessed using best-estimate methods and realistic assumptions. Unless already incorporated into the licensee's UFSAR, best-estimate methods cannot be used for evaluating different results than those previously evaluated in the UFSAR.

4.4 Defense-in-Depth Analyses

NEI 01-01 describes the need for defense-in-depth analysis as limited to substantial digital replacements of the reactor protection system and ESFAS. A defense-in-depth analysis for complex digital modifications of systems other than protection systems may also reveal the impact of any new potential CCFs due to the introduction of shared resources, common hardware and software, or the combination of design functions of systems that were previously considered to be independent of one another. Additionally, a defense-in-depth analysis may reveal direct or indirect impacts on interfaces with existing plant SSCs. This type of analysis may show that existing SSCs and/or procedures could serve to mitigate the effects of possible CCFs introduced through the proposed modification.

4.5 Dependability Evaluation

Section 5.3.1 of NEI 01-01 states that a digital system that is sufficiently dependable will have a likelihood of failure that is sufficiently low. This section describes considerations that can be used to determine whether a digital system is sufficiently dependable.

The dependability evaluation relies on some degree of engineering judgment to support a conclusion that the digital modification is considered to be sufficiently dependable. When performing a dependability evaluation, one acceptable method is to consider: (1) the inclusion of any deterministically applied defensive design features and attributes; (2) conformance with applicable standards regarding quality of the design process for software and hardware; and (3) relevant operating experience. Although not stated in NEI 01-01, judgments regarding the quality of the design process and operating experience may supplement, but not replace, the inclusion of design features and attributes.
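
A minimal sketch of the relationship described above, under the simplifying assumption that each consideration can be reduced to a yes/no judgment: credited design features and attributes are necessary, and design-process quality or operating experience can only supplement them. This is an illustration of the stated logic, not a substitute for the engineering judgment involved.

    from dataclasses import dataclass

    @dataclass
    class DependabilityInputs:
        credited_design_features: bool       # e.g., watchdogs, segmentation, internal diversity
        quality_design_process: bool         # conformance with applicable standards
        relevant_operating_experience: bool

    def sufficiently_dependable(inputs: DependabilityInputs) -> bool:
        # Without credited design features, the other two categories cannot carry the conclusion.
        if not inputs.credited_design_features:
            return False
        # With design features credited, the other categories supplement the determination.
        return inputs.quality_design_process or inputs.relevant_operating_experience

    assert not sufficiently_dependable(DependabilityInputs(False, True, True))
    assert sufficiently_dependable(DependabilityInputs(True, True, False))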

For proposed designs that are more complex or more risk significant, the inclusion of design features and attributes that serve to prevent CCF, significantly reduce the possible occurrence of software CCF, or significantly limit the consequences of such a software CCF should be a key consideration for supporting a "sufficiently dependable" determination. Design features maximizing reliable system performance, to the extent practicable, can also be critical in establishing a basis for the dependability of complex or risk-significant designs.

Section 5.1.3 of NEI 01-01 states that "Judgments regarding dependability, likelihood of failures, and significance of identified potential failures should be documented." Depending on the SSCs being modified and the complexity of the proposed modification, it may be challenging to demonstrate sufficient dependability based solely upon the quality of the design process and/or operating history. Engineering judgments regarding the quality of the design process and operating experience may supplement, but not replace, the inclusion of design features and attributes when considering complex modifications.

Figure 1 of this attachment provides a simplified illustration of the engineering evaluations process described in Section 4 of this attachment.

4.6 Engineering Documentation

Documentation for a proposed digital I&C modification is developed and retained in accordance with the licensee's design engineering procedures and the NRC-approved QA program. The documentation of an engineering evaluation identifies the possible failures introduced in the design and the effects of these failures. It also identifies the design features and/or procedures that document resolutions to identified failures, as described in NEI 01-01, Section 5.1.4. The level of detail used may be commensurate with the safety significance and complexity of the modification, in accordance with the licensee's procedures.

Although not required, licensees may use Table 2 of this attachment to develop and document engineering evaluations. Documentation should include an explanation providing adequate bases for conclusions so that a knowledgeable reviewer could draw the same conclusion.

[Figure 1: Simplified illustration of the engineering evaluations process described in Section 4 (graphic not reproduced in this text capture).]

Table 2 - Example Engineering Evaluation Documentation Outline to Support a Qualitative Assessment

Step 1 - Identification
Describe the full extent of the SSCs to be modified: boundaries of the design change, interconnections with other SSCs, and potential commonality to vulnerabilities with existing equipment.
  • What are all of the UFSAR-described design functions of the upgraded/modified components within the context of the plant system, subsystem, etc.?
  • What design function(s) provided by the previously installed equipment are affected and how will those design functions be accomplished by the modified design? Also describe any new design functions that were not part of the original design.
  • What assumptions and conditions are expected for each associated design function? For example, the evaluation should consider both active and inactive states, as well as transitions from one mode of operation to another.

Step 2 - Identify Potential Failure Modes and Undesirable Behavior
Consider the possibility that the proposed modification may have introduced potential failures.
  • Are there potential failure modes or undesirable behaviors as a result of the modification? A key consideration is that undesirable behaviors may not necessarily constitute an SSC failure, but a misoperation (e.g., spurious actuation).
  • Are failures including, but not limited to, hardware, software, combining of functions, shared resources, or common hardware/software considered?
  • Are there interconnections or interdependencies among the modified SSC and other SSCs?
  • Are there sources of CCF being introduced that are also subject to common triggering mechanisms with those of other SSCs not being modified?
  • Are potential failure modes introduced by software tools or programmable logic devices?

Step 3 - Assess the Effects of Identified Failures
  • Could the possible failure mode or undesired behavior lead to a plant trip or transient?
  • Can the possible failure mode or undesired behavior affect the ability of other SSCs to perform their design function?
  • Could the possible failure mode of the SSC, concurrent with a similar failure of another SSC not being modified but sharing a common failure and triggering mechanism, affect the ability of the SSC or other SSCs to perform their design functions?
  • What are the results of the postulated new failure(s) of the modified SSC(s) compared to previous evaluation results described in the UFSAR?

Step 4 - Identify Appropriate Resolutions for Each Identified Failure
What actions are being taken (or were taken) to address significant identified failures?
  • Are further actions warranted?
  • Is re-design warranted to add additional design features or attributes?
  • Is the occurrence of failure self-revealing, or are there means to annunciate the failure or misbehavior to the operator?

Step 5 - Documentation
  • Describe the resolutions identified in Step 4 of this table that address the identified failures.
  • Describe the conformance to regulatory requirements, the plant's UFSAR, regulatory guidance, and industry consensus standards (e.g., seismic, EMI/RFI, ambient temperature, heat contribution).
  • Describe the quality of the design processes used within the software life cycle development (e.g., verification and validation process, traceability matrix, quality assurance documentation, unit test and system test results).
  • Describe relevant operating history (e.g., platform used in numerous applications worldwide with minimal failure history).
  • Describe the design features/attributes that support the dependability conclusion (e.g., internal design features within the digital I&C architectures such as self-diagnostic and self-testing features, or physical restrictions external to the digital I&C portions of the modified SSC) and defense-in-depth (e.g., internal diversity, redundancy, segmentation of distributed networks, or alternate means to accomplish the design function).
  • Summarize the results of the engineering evaluation, including the dependability determination.

DRAFT NRC REGULATORY ISSUE SUMMARY 2002-22, SUPPLEMENT 1, CLARIFICATION ON ENDORSEMENT OF NUCLEAR ENERGY INSTITUTE GUIDANCE IN DESIGNING DIGITAL UPGRADES IN INSTRUMENTATION AND CONTROL SYSTEMS

DATE: Month XX, 2018

OFFICE: NRR/DIRS/IRGB/PM   NRR/PMDA     OE/EB        OCIO
NAME:   TGovan             LHill        JPeralta     DCullison
DATE:   02/16/2018         01/18/2018   01/22/2018   01/17/2018

OFFICE: NRR/DE/EICB/BC     NRR/DIRS/IRGB/LA   NRR/DIRS/IRGB/PM   NRR/DIRS/IRGB/BC
NAME:   MWaters            ELee               TGovan             HChernoff
DATE:   03/01/2018         02/22/2018         03/01/2018         03/01/2018

Industry Suggested Revision to RIS 2017-XX Section 3.1.3 - Operating Experience

3.1.3 Operating Experience

Relevant operating experience can be used to help evaluate and demonstrate that software and hardware employed in a proposed modification have adequate dependability. The licensee may document information showing that the proposed system or component modification employs equipment with significant operating experience in nuclear power plant applications, or in non-nuclear applications with comparable performance standards and operating environment. With a large population of in-service components/systems, operating experience provides a large installed test bed with many different environments and applications.

Operating experience is normally used in two basic ways: 1) calculation of an actual failure rate number, or 2) a qualitative method that mines the failure data for both positives and negatives. Failure rate numbers based upon real performance data are the best quality indicator, and calculated failure rate numbers can provide a comparison to the plant's existing components/systems. Qualitative methods involve review of the actually reported failure descriptions. These reviews can provide supporting evidence and trends of strengths or weaknesses in the manufacturer's quality processes, such as design control processes, product testing effectiveness, and hardware robustness. The licensee may also consider whether the manufacturer of such equipment incorporates quality processes such as continual process improvement, incorporation of lessons learned, etc., and document how that information demonstrates adequate equipment dependability.

Some key areas in evaluating operating experience are:
  • Strength of the manufacturer's problem reporting system.
  • Identification of the manufacturer's threshold for problem reporting and how this information is utilized (i.e., how problems are tracked, are critical problems fixed in a timely manner, is data used for continuous improvement).
  • Evaluating for repetitive failure trends which could be indicative of a process weakness that requires further evaluation.
  • Evaluating different revisions and versions of hardware/software for issues that occur during design or manufacturing changes.
  • How "no problem found" results are addressed and how large this population is.

Operating experience relevant to a proposed digital I&C change may be credited as part of an adequate basis for a determination that the proposed change does not result in more than a minimal increase in the frequency of occurrence of initiating events that can lead to accidents or in more than a minimal increase in the likelihood of occurrence of a malfunction of an SSC important to safety previously evaluated in the UFSAR. Differences may exist in the specific digital I&C application between the proposed digital I&C modification and that of the equipment and software whose operating experience is being credited. The architecture of the referenced equipment and software should be similar to that of the system being proposed. Note that an evaluation of multiple versions can provide evidence of good and stable design and manufacturing processes.

Further, the design conditions and modes of operation of the equipment whose operating experience is being referenced also need to be similar to those being proposed as a digital I&C modification. It is important to recognize that when crediting operating experience, one needs to understand what design features are being credited. Highly-configurable components/systems may require a larger population of in-service data to address the potential different applications. Customized or rarely used functions should be avoided in operating experience evaluations. Design features that serve to prevent or limit possible common cause failures identified in the relevant operating experience should be noted and considered for inclusion in the proposed design. Doing so would provide additional support for a determination that the dependability of the proposed design will be similar to the referenced application.
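
To illustrate the first, quantitative use of operating experience mentioned above, the sketch below computes a simple point-estimate failure rate from fleet data and compares it with the rate of the equipment being replaced. The fleet size, service period, failure count, and comparison rate are hypothetical numbers chosen only for the example; they do not come from the RIS or the NEI comments.

    def point_estimate_failure_rate(failures: int, unit_hours: float) -> float:
        # Failures per operating hour across the in-service population.
        if unit_hours <= 0:
            raise ValueError("Operating hours must be positive")
        return failures / unit_hours

    # Hypothetical fleet data: 4 reported failures over 1,200 units x 5 years of service.
    fleet_hours = 1_200 * 5 * 8_760.0
    candidate_rate = point_estimate_failure_rate(failures=4, unit_hours=fleet_hours)

    existing_rate = 2.0e-6   # hypothetical rate for the previously installed equipment (per hour)

    print(f"Candidate digital equipment: {candidate_rate:.2e} failures/hour")
    print(f"Previously installed equipment: {existing_rate:.2e} failures/hour")
    print("Comparable or better" if candidate_rate <= existing_rate else "Less reliable on this estimate")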

Page 1: [1] Formatted Author Bulleted + Level: 1 + Aligned at: 0.63" + Indent at: 0.88" Page 1: [2] Formatted Author Bulleted + Level: 1 + Aligned at: 0.63" + Indent at: 0.88" Page 1: [3] Formatted Author Bulleted + Level: 1 + Aligned at: 0.63" + Indent at: 0.88" Page 1: [4] Formatted Author Bulleted + Level: 1 + Aligned at: 0.63" + Indent at: 0.88"

IndustrySuggestedRevisiontoTable2ofRIS2017XX

Table 2Example Engineering Evaluation Documentation Outline to Support a Qualitative Assessment Topical Area Description Step 1 Activity Identification x Describe the scope and boundaries of the proposed activity including interconnections with other SSCs.

x List the UFSAR-described design function(s) affected by the proposed change.

x Describe any new design functions performed by the modified design that were not part of the original design.

x Describe any design functions eliminated from the modified design that were part of the original design.

x Describe any previously separate design functions that were combined as part of the activity.

x Describe any automatic actions to be transferred to manual control.

x Describe any manual actions that are to be transferred to automatic control.

Step 2Failure Mode Comparison x Provide a comparison between the failure modes of the new digital equipment and the failure modes of the equipment being replaced.

x If the failure modes are different, describe the resulting effect of equipment failure on the affected UFSAR-described design function(s).

Step 3Determination of Equipment Using the qualitative assessment categories provided in Table 1, address the Reliability and CCF Likelihood following questions:

x Is the new digital equipment as reliable as the equipment being replaced?

x Is the new digital equipment CCF susceptibility much lower than the likelihood of failures considered in the UFSAR (e.g., single failures) and comparable to CCFs that are not considered in the UFSAR (e.g., design flaws, maintenance errors, calibration errors)?

Step 4Assessment of Equipment IF the results of Step 3 indicate the new digital equipment is as reliable as the Reliability and CCF Likelihood equipment being replaced AND CCF likelihood is determined to be sufficiently Results low, perform the following:

x Document that no new plant-level vulnerabilities were identified (i.e.,

the proposed activity will not increase accident frequency/malfunction likelihood or create an accident of a different type/malfunction with a different result) x Continue to Step 5 IF the results of Step 3 indicate the new digital equipment is not as reliable as the equipment being replaced, perform the following:

x Using the guidance provided in NEI 96-07, Rev. 1, Section 4.3.1 and Section 4.3.2, determine if the decrease in reliability will result in more than a minimal increase in accident frequency or malfunction likelihood and document the justification.

IF the results of Step 3 indicate the CCF likelihood is not sufficiently low, perform the following:

x Using the guidance provided in NEI 96-07, Rev. 1, Section 4.3.5 and Section 4.3.6, determine if a postulated CCF will result in a different type of accident or a malfunction with a different result and document the justification.

Table 2Example Engineering Evaluation Documentation Outline to Support a Qualitative Assessment Topical Area Description Step 5Conclusion Provide a summarization of the qualitative assessment results and overall conclusions reached. Include a discussion of the pros and cons associated with implementation of the proposed activity (e.g., elimination of single points of vulnerability, self-diagnostics and testing). Discuss the effect of the proposed activity, if any, on applicable UFSAR-described design functions. Provide discussion on any differences in equipment failure modes and the associated impact of different failure modes on applicable UFSAR-described design functions.

Step 6Supporting Documentation Provide a list of references used to support arguments made and conclusions reached in the qualitative assessment. References should be retrievable for future review and inspection activities. If not retrievable, consider attaching references to the qualitative assessment. The level of supporting documentation should be commensurate to the safety significance of the SSCs being modified.

Examples of references include:

x Applicable codes and standards applied in the design x Equipment environmental qualifications (e.g., ambient temperature, EMI/RFI, seismic) x Quality design processes employed (e.g., NQA-1, Part II, Subpart 2.7; Commercial Grade Dedication documentation per EPRI TR-106439) x Failure Modes and Effects Analysis (if applicable) x Software Hazard Analysis (if applicable) x Critical Digital Reviews (if applicable) x Documentation of equipment operating experience Step 7Application of Qualitative x Apply the qualitative assessment results when responding to the Assessment Results 10 CFR 50.59 Evaluation questions.

x List the qualitative assessment as a reference (or include as an attachment) to the 10 CFR 50.59 Evaluation.
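
The decision flow in Steps 3 and 4 of the suggested outline above can be read as a small branching rule. The sketch below restates it under the assumption that the two Step 3 judgments have already been reduced to yes/no answers; the returned strings simply indicate which NEI 96-07 guidance the outline points to next.

    def step4_disposition(as_reliable_as_replaced: bool, ccf_likelihood_sufficiently_low: bool) -> str:
        if as_reliable_as_replaced and ccf_likelihood_sufficiently_low:
            return ("Document that no new plant-level vulnerabilities were identified; "
                    "continue to Step 5.")
        actions = []
        if not as_reliable_as_replaced:
            actions.append("Apply NEI 96-07, Rev. 1, Sections 4.3.1 and 4.3.2 "
                           "to the decrease in reliability.")
        if not ccf_likelihood_sufficiently_low:
            actions.append("Apply NEI 96-07, Rev. 1, Sections 4.3.5 and 4.3.6 "
                           "to the postulated CCF.")
        return " ".join(actions)

    print(step4_disposition(True, True))
    print(step4_disposition(True, False))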