ML25226A198

| Person / Time | |
|---|---|
| Issue date: | 08/27/2025 |
| From: | Ismael Garcia, Derek Halverson (NRC/RES/DE) |
| To: | Derek Halverson |
Assessment of the State of Practice for Using Operating Experience for Digital Systems for Nuclear Power Plants

Dr. Derek Halverson, Digital Instrumentation and Controls Engineer, US Nuclear Regulatory Commission

Mr. Ismael Garcia, Senior Technical Advisor for Digital Instrumentation and Controls and Cybersecurity, US Nuclear Regulatory Commission

This presentation describes work sponsored by the U.S. Nuclear Regulatory Commission (NRC), Office of Nuclear Regulatory Research, under contract. It does not represent a final agency position or final agency guidance.
Outline
- 1. Operating Experience (OpE) at the NRC
- 2. OpE Data Set - Acceptance Principles and Red Flags
- 3. OpE Usage Examples
- 4. OpE Usage at Different Criticality Levels
- 5. Future Opportunities in OpE
- 6. Conclusions

Based on the work described in TLR-RES/DE-2024-06, Assessment of the Domestic and International Use of Operating Experience (OpE) in Digital Systems in Nuclear Power Plants and Safety-Critical Industries (ADAMS Accession Number ML24358A096)
Operating Experience (OpE) at the NRC

- The focus of this research work is on reliability claims in support of regulatory reviews and inspection activities for specific digital devices
  - Includes quantitative values
  - Not for other uses (e.g., PRA and lessons learned)
- OpE is already mentioned in NRC guidance documents
  - Supplement 1 to Regulatory Issue Summary 2002-22
  - Branch Technical Position 7-19, Revision 9
  - Design Review Guide
- OpE has not historically been relied on for regulatory reviews of Digital Instrumentation and Controls (DI&C) systems used in nuclear applications
OpE Data Set Acceptance Principles
- Bottom Line Up Front
  - Acceptance principles are often only partially met; meeting them is very challenging in practice
  - OpE data is usable in low safety-criticality applications
  - We're interested in your feedback or collaboration on areas where improved OpE could become available in the future

Image from TLR-RES/DE-2024-06 (ADAMS Accession Number ML24358A096)
Red Flags

- Lack of adequate end-user reporting mechanisms
- Non-safety past applications
- Over-optimistic operating time or number of failures
- Missing version information
- Unknown/differing operating profile or environment
- Short time on the market
- New manufacturer/supplier
- Manufacturer with a poor track record
Example 1: Real-World Anonymized OpE

- Best quality that is likely to be available in practice
- Primarily warranty returns
- Includes firmware return data
- Data from an additional related device
- Data has issues, but is sufficient for a 10⁻² claim
Example 2: Hardware Reliability Modelling

- Representative numbers, not actual numbers from a device
- Mean time between failures calculated using the MIL-HDBK-217F handbook and the device's bill of materials
- Hardware design stable, only minor firmware changes
- Manufacturer claims 8000 devices sold over a 10-year period
- Claiming an experience factor of 2 gives 32,700 device-years
- Actual returns under warranty were lower than expected returns
- There would be issues with software claims, but the data is suitable for a claim of 10⁻¹ failures per year (see the sketch below)
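To make the slide's arithmetic concrete, below is a minimal sketch of the comparison being described: warranty returns checked against the failure count a handbook MTBF would predict over the claimed device-years. The `mtbf_years` and `observed_returns` values are hypothetical placeholders, not figures from the assessed device, and the chi-squared bound is one common way to do this check, not necessarily the contractor's method.

```python
from scipy.stats import chi2

# Figure from the slide: 8,000 devices over 10 years with an
# experience factor of 2, giving the claimed device-years
device_years = 32_700

# Hypothetical placeholders for illustration only
mtbf_years = 250        # assumed MIL-HDBK-217F prediction (~2.2e6 hours)
observed_returns = 95   # assumed warranty return count

# Expected failures if the handbook prediction were exact
expected = device_years / mtbf_years
print(f"expected failures: {expected:.0f}, observed: {observed_returns}")

# One-sided 95% upper confidence bound on the failure rate per
# device-year, treating returns as a Poisson count over device_years
rate_upper = chi2.ppf(0.95, 2 * (observed_returns + 1)) / (2 * device_years)
print(f"95% upper bound: {rate_upper:.1e} failures per device-year")

# The 10^-1 failures-per-year claim holds if the bound stays below it
assert rate_upper < 1e-1
```

With these placeholder values the bound lands well below 10⁻¹ per device-year, which is consistent with the slide treating the hardware claim as supported while leaving the software question open.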
Example 3: Addressing Weakness in the Development Process

- Addressing a lack of quality assurance process information for a nuclear application
- OpE from devices being used in other nuclear applications and at other sites
- 18 devices across three sites over 3-4 years, with no failures
- Devices estimated to be used once every six months
- Too many red flags and too small a sample size (see the demand-count sketch below)
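The small-sample concern can be quantified with a quick bound. A minimal sketch follows, assuming that "used once every six months" means roughly two demands per device per year (my reading of the slide, not a stated figure):

```python
# Figures from the slide
devices = 18
years = 3.5             # midpoint of the stated 3-4 years
demands_per_year = 2    # assumed from "used once every six months"

demands = round(devices * years * demands_per_year)  # ~126 demands, 0 failures

# Exact one-sided 95% upper bound on the per-demand failure
# probability for zero failures in n demands: solve (1 - p)^n = alpha
alpha = 0.05
p_upper = 1 - alpha ** (1 / demands)
print(f"{demands} demands, 0 failures -> 95% upper bound {p_upper:.3f} per demand")

# The quick "rule of three" approximation gives nearly the same answer
print(f"rule of three: {3 / demands:.3f} per demand")
```

Roughly 2×10⁻² per demand is the strongest claim this data could support even before the red flags are weighed, which backs the slide's judgment.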
Example 4: In-Service History to Support Proven-in-Use

- A licensee is required by its processes to log all maintenance, calibration, and proof testing (surveillance) activities
- The OpE data collected for a digital sensor is used to support the safety justification in a new application
- 32 devices over 12 years
- Investigated 375 times, with six device failure events
- This is higher than the Failure Modes, Effects and Diagnostics Analysis (FMEDA) results anticipated, but still suitable for claims such as 10⁻¹ dangerous failures per year (see the sketch below)
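The slide's figures allow a direct check of that claim. Below is a minimal sketch treating the six failures as a Poisson count over the accumulated device-years; this is one standard way to bound the rate, not necessarily the method used in the original justification.

```python
from scipy.stats import chi2

# Figures from the slide
devices, years, failures = 32, 12, 6
device_years = devices * years  # 384 device-years

# Observed point estimate of the dangerous failure rate
rate = failures / device_years
print(f"point estimate: {rate:.3f} failures per device-year")

# One-sided 95% upper confidence bound (chi-squared method for Poisson counts)
rate_upper = chi2.ppf(0.95, 2 * (failures + 1)) / (2 * device_years)
print(f"95% upper bound: {rate_upper:.3f} failures per device-year")

# Both sit comfortably below the 10^-1 per year claim on the slide
assert rate_upper < 1e-1
```

Even the conservative upper bound (about 3×10⁻² per device-year) stays a factor of three below 10⁻¹, so the claim survives the discrepancy with the FMEDA prediction.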
Example 5: Publicly Available Change Records

- Arm's Common Microcontroller Software Interface Standard (CMSIS) platform
- https://github.com/ARM-software/CMSIS_5
- Claim limited to: there are no known operating system defects that affect the behavior of a specific device
- Development managed using the Git version control tool
- Still only suitable as primary evidence at low safety-criticality, or as supporting evidence at medium or high (a query sketch follows below)
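Because the CMSIS_5 history is public, the kind of defect search behind such a claim can be reproduced by anyone. A minimal sketch follows, assuming a local clone of the repository; the search term and the `CMSIS/RTOS2` path filter are illustrative choices, not the reviewers' actual query:

```python
import subprocess

# Assumes https://github.com/ARM-software/CMSIS_5 is cloned to ./CMSIS_5
REPO = "CMSIS_5"

# List commits whose messages mention a fix and that touch the RTOS
# component; both the --grep pattern and the path are hypothetical
result = subprocess.run(
    ["git", "-C", REPO, "log", "--oneline", "-i",
     "--grep", "fix", "--", "CMSIS/RTOS2"],
    capture_output=True, text=True, check=True,
)

commits = result.stdout.splitlines()
print(f"{len(commits)} fix-related commits touching CMSIS/RTOS2")
for line in commits[:10]:  # show a sample for manual triage
    print(line)
```

Each hit would still need manual triage to decide whether it reflects a defect that could affect the specific device, which is one reason the slide caps this evidence at low criticality as primary support.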
General OpE Usage at Different Criticality Levels
- Examples of Relevant Types of Claims
  - Hardware reliability
  - Reaching a systemic fault target for a whole device
  - Reaching a systemic fault target in the operating system and software tools
  - Bounding the systemic failure rate
- Degree of Utility
  - Primary source of evidence for a claim
  - Supporting evidence for a claim
  - Defeater only (usable to undercut or refute other evidence, or to demonstrate that it does not support refutation)
- Criticality Levels
  - High, medium, and low
Future Opportunities

- There may be other, better existing forms of OpE:
  - Digital Twins
  - Data Lakes
  - Various applications of AI/ML
  - Physics-informed, AI-driven anomaly detection
  - Better data recording and reporting
  - Supervisory Control and Data Acquisition (SCADA) in other industries
Conclusions

- OpE is practical for some applications, particularly low safety-criticality applications.
- Based on the contractor's experience with actual applications, OpE data alone is generally not seen as sufficient evidence for the safety justification of a digital device, especially in high safety-criticality applications.
- It is anticipated that the quality of operating experience will improve in the future, at which point the acceptance principles of relevance, data integrity, and sufficiency can be met without unaddressed red flags, which should allow for an expansion of the use of OpE in DI&C.
Questions?
Know of promising sources of better OpE?
Please contact:
Derek Halverson (Derek.Halverson@nrc.gov)
Ismael Garcia (Ismael.Garcia@nrc.gov)
David Rahn (David.rahn@nrc.gov)