ML24260A251
| Person / Time | |
|---|---|
| Issue date: | 02/07/2025 |
| From: | NRC/NMSS/DFM/IOB |
| To: | Office of Nuclear Material Safety and Safeguards |
| Shared package: | ML24260A250 |
Oversight Effectiveness Assessment for the NMSS Inspection Programs TEAM REPORT
I. Executive Summary

The Office of Nuclear Material Safety and Safeguards (NMSS) embarked on an effort to assess oversight activities and requirements to ensure consistency across the NMSS inspection programs. This report contains the results of the NMSS Oversight Effectiveness effort. Except where specified, the NMSS Oversight Effectiveness Team is referred to as "the team." This effort consisted of an evaluation of the current inspection programs and associated guidance for each of the business lines (BLs) in NMSS. The purpose of this effort was to understand the tools currently in place to monitor the implementation of the inspection programs across NMSS and to improve those tools as needed. The team found that the NMSS BL inspection programs have been established, implemented, and matured throughout their respective histories in alignment with the U.S. Nuclear Regulatory Commission (NRC) mission, supporting the safe and secure use of nuclear materials and protection of the environment within the licensed community.
The following recommendations aim to increase awareness of the successes and challenges in implementing the inspection programs and are intended to help staff and management highlight or address any issues promptly. The existing guidance continues to ensure effective implementation of the inspection programs in NMSS; these recommendations are designed to enhance awareness and decision-making. The recommendations are summarized below, with further details provided in the respective sections of the report.
Recommendation 1: Establishing periodic self-assessments for each BL to help evaluate the health of the program. Establishing specific reporting requirements could further support these self-assessments by acting as inputs and leading indicators. This is further expanded upon in Section V. Program Metrics. Additionally, the following sections also add context: VI. Program Evaluation, XIII. Resource Estimates, and XV. Dashboards.
Recommendation 2: Establishing and implementing reporting criteria, as proposed in Section VII, with respective requirements for reporting and documentation. This recommendation is explained in Section VII. Reporting Criteria.
Recommendation 3: For each Inspection Manual Chapter (IMC) without a distinct feedback section, add this as a separate section and include a reference to IMC 0801, Inspection Program Feedback Process. More information on this recommendation can be found under Section VI. Program Evaluation.
Recommendation 4: Establishing a working group to revise IMC 0610, Nuclear Material Safety and Safeguards Inspection Reports, in a timely manner and to align it with the risk-informed inspection changes made to the respective IMCs and Inspection Procedures (IPs). This is explained in Section X. Inspection Report Documentation.
Recommendation 5: Defining the completion of an inspection and revising the IMCs as appropriate to establish uniformity in terminology and practice across the BLs. This recommendation is expanded upon in Sections IX. Inspection Program Completion and XII. Program Changes Reporting - Decision-Making Authority.
Recommendation 6: Establishing a new term, IMC Deviation, defined as any action regarding the inspection program taken by Regional management that is not consistent with the guidance contained in the IMC. Section XI. Program Changes explains the basis for this recommendation.
Recommendation 7: Revising IMCs to provide clarity and uniformity on how to document and report program changes and deviations. This is explained in Section XII. Program Changes Reporting - Decision-Making Authority.
Recommendation 8: Including a resource estimate (range) for budgeting hours for IPs with an identified need. This recommendation is expanded upon in Section XIII. Resource Estimates.
Recommendation 9: Updating IMCs for each BL to include a reference to Management Directive (MD) 8.3, NRC Incident Investigation Program, and documentation guidance for reactive inspection decision-making for each program in each BL. The basis of this recommendation is in Section XIV. MD 8.3 Decisions.
II. Table of Contents

I. Executive Summary
II. Table of Contents
III. Background
IV. Contributors
V. Program Metrics
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
VI. Program Evaluation
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
VII. Reporting Criteria
VIII. Inspection Program Timeliness
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
IX. Inspection Program Completion
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
X. Inspection Report Documentation
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
XI. Program Changes
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
XII. Program Changes Reporting - Decision-Making Authority
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
XIII. Resource Estimates
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
XIV. MD 8.3 Decisions
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
XV. Dashboards
    A. DLLW
    B. FFBL
    C. NMU
    D. SFST
XVI. Summary of Findings
XVII. Conclusions
III. Background

The Office of NMSS currently has oversight responsibilities, which include programmatic oversight and inspection activities for decommissioning, uranium recovery, low-level waste facilities, fuel cycle facilities, spent fuel storage, transportation, and academic, industrial, and medical uses of nuclear materials. On October 7, 2023, the NMSS Oversight Effectiveness Team was established to assess the inspection programs in NMSS as documented in this report. From this assessment, the team's suggested recommendations would enhance consistency in decision-making and documentation related to implementation of the NMSS inspection programs. Where consistency might not be practical, there needs to be a common understanding of how and why the inspection programs differ. This effort considered all phases of the inspection programs of each BL (i.e., scheduling, preparation, inspection, documentation, enforcement).
While most inspection activities are conducted by NRC Regional offices, the spent fuel storage and transportation cask vendor inspections are conducted by a group of inspectors in headquarters (HQ). The Division of Fuel Management (DFM) is responsible for the programmatic oversight for activities under the Fuel Facilities BL (FFBL) and the Spent Fuel and Transportation BL (SFST). The Division of Materials Safety, Security, State, and Tribal Programs (MSST) is responsible for the programmatic oversight for activities in the Nuclear Materials Users BL (NMU), including the Agreement States. The Division of Decommissioning, Uranium Recovery, and Waste Programs (DUWP) is responsible for the programmatic oversight for activities under the Decommissioning and Low-Level Waste BL (DLLW). The FFBL implements the inspection guidance in IMC 2600, Fuel Cycle Facility Operational Safety and Safeguards Inspection Program.
SFST implements the inspection guidance in IMC 2690, Inspection Program for Storage of Spent Reactor Fuel and Reactor-Related Greater-Than-Class C Waste at Independent Spent Fuel Storage Installations and for Title 10 Code of Federal Regulations (10 CFR) Part 71 Transportation Packaging. NMU implements the inspection guidance in IMC 2800, Materials Inspection Program. DLLW implements the inspection guidance in IMC 2561, Decommissioning Power Reactor Inspection Program, IMC 2602, Decommissioning Fuel Cycle, Uranium Recovery, and Materials Inspection Program, and IMC 2801, Uranium Recovery and 11e.(2) Byproduct Material Facility Inspection Program.
The following memorandums, referenced throughout this report, established the NMSS expectations for consistency across the BLs:
- 1) February 19, 2021, memorandum, Actions Required to Maintain the Effectiveness of the Office of Nuclear Material Safety Inspection Programs (ML21048A030);
- 2) April 19, 2021, memorandum, Actions to Ensure Effective and Consistent Implementation of NMSS Oversight Programs (ML21083A198); and
- 3) August 7, 2023, memorandum, Status on Actions Required to Maintain the Effectiveness of Office of Nuclear Material Safety Inspection Programs (ML23208A310).
Utilizing the above memorandums along with the October 7, 2023, alignment meeting, the team focused the assessment on consistency, efficiency, and effectiveness of the NMSS inspection programs and documented the results in this report.
IV. Contributors

Briana DeBoer, Team Lead, Project Manager, NMSS/DFM/Inspection and Oversight Branch (IOB)
Eucherius Rosario, Team Lead, Project Manager, NMSS/DFM/IOB
Leira Cuadrado, Senior Project Manager, NMSS/MSST/Materials Safety and Licensing Branch (MSLB)
Jennifer Dalzell, Senior Project Manager, NMSS/MSST/MSLB
Linda Gersey, Health Physicist, NMSS/DUWP/Reactor Decommissioning Branch
Martha Poston-Brown, Senior Health Physicist, NMSS/DUWP/Uranium Recovery and Materials Decommissioning Branch
V. Program Metrics

This section includes an assessment of the metrics used to evaluate the effectiveness and efficiency of the inspection programs in NMSS. Each BL has established metrics tracking the number of inspections completed, and this information is reported at the Congressional Budget Justification (CBJ) level. There are also metrics tracked at the Regional level, such as timeliness of report issuance and the quality of inspection findings. The team found that the CBJ and Regional level metrics are sufficient and have served to monitor inspections at the established frequencies. However, there is a desire to have metrics or other performance indicators to better measure the effectiveness of the NMSS inspection programs and to be able to use those leading indicators to assist in mitigating challenges.
The team studied the self-assessment process in the Operating Reactors BL under IMC 0307, Reactor Oversight Process Self-Assessment Program. IMC 0307 provides guidance for implementing performance metrics that monitor certain aspects of the implementation of the Reactor Oversight Process (ROP) and provide trending data. As an example, the ROP self-assessment monitors inspection hours to allow the establishment of an average resource estimate for specific work activities.
After evaluating the current metrics, the team believes routine program assessments are a better tool to measure and monitor effectiveness of the inspection programs as proposed in Recommendation 1. Therefore, the team recommends a periodic NMSS inspection program self-assessment where each BL establishes leading indicators to evaluate the health of their inspection programs. If this recommendation is accepted, each BL would need to determine the appropriate periodicity for their self-assessments during the implementation phase.
Additional information is provided in Section VI, Program Evaluation.
A. DLLW

Currently, DLLW tracks metric CBJ-DL-12, Percentage of Required Inspections Completed in Accordance with the Applicable IMC, in accordance with IMC 2561, IMC 2602, and IMC 2801. Additionally, each region tracks the percent of inspection reports issued in accordance with timeliness requirements.
Metric data are found in the inspection details documented in the Reactor Program System/Inspection Scheduling Tracking and Reporting (RPS/ISTAR) database on the Office of Nuclear Reactor Regulation (NRR) intranet website, with a few exceptions. The inspection information captured in the RPS/ISTAR database includes the inspection report, report number, IPs used, lead inspector and inspection staff, inspection report due date, inspection report issue date, and Agencywide Documents Access and Management System (ADAMS) Accession Number. Because some non-fee-billable decommissioning sites are not in RPS/ISTAR, metric data for those sites are tracked through the Regional master inspection plan (MIP) for IP completion. Establishing a periodic self-assessment, as summarized in Recommendation 1 and Section VI, allows for uniformity in tracking and trending of these established metrics.
B. FFBL

Currently, FFBL has a method of tracking Fuel Facilities (FF) inspection completion through CBJ-FF-15, Percentage of Core Inspection Procedures Completed for Fuel Facilities as Required by IMC 2600. Additionally, Region II tracks Licensee Performance Review (LPR) Meetings, Fuel Facility Inspection Report Issuance, Licensee Performance Review Letters, and Temporary Instruction (TI) Inspection Completion. As with DLLW, inspection-specific information for FFBL is captured in the RPS/ISTAR database. While there are effective metrics and a self-assessment, Recommendation 1 would enhance these established methods, as described in more detail in Section VI.
C. NMU

Currently, NMU has a method of tracking through CBJ-NM-05, Percentage of Safety Inspections of Materials Licensees Completed on Time. Each region also tracks the percent of inspection reports issued in accordance with timeliness requirements.

Since the NMU inspection program is evaluated under the Integrated Materials Performance Evaluation Program (IMPEP), the Regions have established a metric looking at the quality of inspection findings, which aligns with the IMPEP indicator, Technical Quality of Inspections. This Regional metric reviews inspection findings to ensure they are well-founded and well-documented, identify root causes of poor performance, reflect appropriate follow-up on previous findings, and result in appropriate regulatory action. The NMU inspection quality metric differs from those of the rest of the BLs. Inspection information for NMU is tracked in the Web-Based Licensing database. Therefore, the periodic self-assessment would not only provide enhanced oversight of the inspection program, but Recommendation 1 could also help the Regions with IMPEP completion, as stated in Section VI.
D. SFST

Currently, SFST has a method of tracking through CBJ-SF-15, Percentage of Inspections Completed in Accordance with Manual Chapter 2690. Each region also tracks the percent of inspection reports issued in accordance with timeliness requirements. Independent spent fuel storage installation (ISFSI), spent fuel storage and transportation, and vendor inspections are tracked via the RPS/ISTAR database.
With the current metrics and self-assessment, implementing Recommendation 1 would enhance these established methods as further explained in Section VI.
VI. Program Evaluation

This section evaluates the methods each of the BLs uses to assess the health of its respective inspection programs. The team found there are areas to improve consistency in program evaluation across the BLs. Each BL conducts differing evaluations at various frequencies and with different scopes. There are established programs that touch upon the inspection programs, such as the Operating Experience program, the Agency Action Review Meeting, and other NMSS reports. However, the team understands there is a desire to have better tools for measuring or evaluating the effectiveness of the NMSS inspection programs and to be able to use leading indicators to mitigate challenges. Therefore, in Recommendation 1, the team recommends a periodic NMSS inspection program self-assessment to help each BL use leading indicators to evaluate the health of its programs.
NMSS Policy and Procedure (P&P) 6-11, Revision 1, Conducting Self-Assessments, dated August 18, 2018, provides a framework for planning, conducting, and documenting internal program assessments. This policy document states, in part, that NMSS conducts self-assessments to examine current performance of internal programs and processes in selected areas and to identify gaps between current performance and desired performance. However, this procedure does not specify a frequency at which self-assessments are conducted; rather, it allows Division Directors to determine the frequency at which self-assessments are conducted and the areas that will be evaluated.
As part of Recommendation 1, the team recommends revising NMSS P&P 6-11 to include a self-assessment process for evaluating the overall effectiveness of each inspection program in meeting its established goals and intended outcomes. The objectives of the recommended self-assessment would be to:
- 1) determine the adequacy of inspection program requirements and guidance, including IPs and IMCs;
- 2) evaluate the inspection program framework to identify improvements; and
- 3) provide timely, objective information to inform program planning and to develop the recommended improvements to the inspection program.
This self-assessment will result in a consolidated, centralized location for information to assist with tracking and trending.

As summarized in Recommendation 1, the team recommends that a self-assessment report be completed for each BL (DLLW, FFBL, NMU, SFST). The scope of the assessment report should address all areas reviewed, any findings, and recommendations.
The self-assessment report should include:
- 1) Objective Performance Metrics;
- 2) Data Trending Focus Areas;
- 3) Regional Surveys on program needs and satisfaction;
- 4) Monitoring of Resources; and
- 5) Other BL-specific needs.
The program evaluation should include a set of performance metrics that will be monitored and assessed as an integral part of each self-assessment. These performance metrics may include items such as completion of inspection program, utilization of program resources, inspection objectivity and performance reviews, qualification status of inspectors, issuance of inspection reports, completion timeliness for potentially greater than Severity Level IV violations, maintenance of inspection guidance, supportability of inspection findings and outcome of contested violations.
The data trending focus areas should be complementary to the formal objective performance metrics and leverage program data to monitor program health. The data trending focus areas should include items such as inspection violations per IP and per region, open unresolved items, feedback form timeliness, and timeliness of technical assistance requests (TARs) in support of inspections. The NMSS staff should analyze program data in these focus areas, with the objective of identifying any significant trends (positive, negative, stable) or other insights into NMSS program performance in these areas. The data trending areas should be monitored on a quarterly basis by NMSS staff.
Should any significant trends or insights be identified, the self-assessment lead will perform further analysis, including input to the self-assessment report.
Additionally, IMC 0801, Inspection Program Feedback Process, describes the feedback process used by the Office of NMSS, the Office of NRR, and the Office of Nuclear Security and Incident Response to identify and communicate enhancements and to resolve problems, concerns, or difficulties encountered in implementing the inspection programs of the NRC. The team found that not all NMSS IMCs reference this process and that implementation of the guidance is BL-specific. As stated in Recommendation 3, the team recommends adding a separate section on the inspection feedback process to each NMSS IMC, with a reference to IMC 0801, as discussed in the following BL subsections.
A. DLLW

The DLLW inspection programs are evaluated by management for consistency and effectiveness during the annual decommissioning counterpart meetings. These counterpart meetings review all the DLLW program areas to ensure licensees are being inspected at the appropriate frequencies and with the appropriate level of oversight. DLLW programs do not have an IMC that describes self-assessments, although the decommissioning counterpart meetings fulfill the intent of NMSS P&P 6-11. Additionally, IMCs 2801 and 2561 contain sections describing management reviews of that program. Although the program area is evaluated during the annual counterpart meetings, IMC 2602 does not have any reference to management reviews. None of the three IMCs in DLLW reference NMSS P&P 6-11.
The team recommends implementing Recommendation 1 to develop a self-assessment for these inspection programs. A self-assessment can be used to inform management of topics that should be evaluated during the annual decommissioning counterpart meeting. Additionally, IMCs 2561, 2602, and 2801 do not include a section referencing IMC 0801; this should be added as proposed in Recommendation 3.
B. FFBL

The first self-assessment for FFBL under IMC 2650, Fuel Cycle Inspection Assessment Program, is currently being completed. IMC 2650 describes the process to annually assess and evaluate the overall effectiveness of the inspection program by evaluating the following three elements: efficacy, consistency, and completeness reviews. Since the results of the IMC 2650 assessment are under development, the recommendations proposed in this report do not incorporate its findings.
However, the results of the FF self-assessment need to be considered prior to implementation. The team's recommendation is to revise NMSS P&P 6-11 to include the guidance for self-assessments and the reporting criteria as proposed in this report. IMC 2650 must also be referenced in this revision. With the implementation of these recommendations, IMC 2650 would be superseded by NMSS P&P 6-11, and retiring the IMC should be considered. Additionally, IMC 2600 should be updated to have an inspection program feedback section as well as a reference to IMC 0801.
C. NMU

The NMU inspection program is evaluated under the IMPEP. The IMPEP is an established, independent, and routine performance evaluation program used to evaluate the radiation control programs of Agreement States and the NRC. It is conducted every 4 years and evaluates three aspects relevant to this evaluation: the adequacy and qualification of technical staff, verification that inspections are being conducted at the established frequencies, and the sufficiency of the technical quality of inspections. Any deficiencies in these areas are identified, presented to the Executive Director for Operations level of management, and tracked until improvements are made and the program is re-evaluated and considered satisfactory.
In addition to the IMPEP, each region has implemented its own self-assessment of the inspection program via a Regional level performance metric (NMU-RI/RIII/RIV-07) for which inspection findings are assessed to determine that they are well-founded and well-documented, identify root causes of poor performance, reflect appropriate follow-up on previous findings, and result in appropriate regulatory action. IMC 2800 purposely does not reference IMC 0801. This is because IMC 2800 is used by the Agreement States to model their materials inspection programs, and IMC 0801 is geared toward internal NRC feedback. Thus, the team does not recommend updating IMC 2800 to include a reference to IMC 0801.
D. SFST

The first self-assessment for SFST was completed in 2023 (ML23101A086). The self-assessment of the ISFSI Inspection Program covered calendar years 2021 and 2022. The review also included an assessment of the implementation and outcomes of each major change to the program made through the 2020 Enhancement Initiative.
Future self-assessments of the ISFSI Inspection Program were planned to be conducted at the end of each triennial period. If Recommendation 1, presented in this report by the team to conduct self-assessments of each BL, is accepted and implemented, then it would replace the recommendation from the ISFSI Inspection Program Triennial Self-Assessment. Along with the implementation of Recommendation 1, the self-assessment would also have to be aligned with NMSS P&P 6-11, including recommended updates. Accepting the recommendation for program feedback would require updating IMC 2690 to include a section for this as well as a reference to IMC 0801.
VII. Reporting Criteria

The reporting criteria proposed in this section provide a starting point for developing a framework for data reporting. These criteria are designed to produce clear and consistent data across the NMSS BLs. Understanding the resource implications associated with recurring self-assessments and responding to the desire for more frequent awareness of the overall effectiveness of the inspection programs, the team is proposing common reporting criteria to be incorporated into the revision of NMSS P&P 6-11. Furthermore, the team is proposing that each IMC (IMC 2561, 2600, 2602, 2690, 2800, and 2801) be revised to include the periodic self-assessment by referencing NMSS P&P 6-11. The criteria in Table 1, Reporting Criteria Recommendations, would serve as inputs to the self-assessments and can be used to trend the health of the program between assessments. While there are reporting criteria listed in Table 1, these reporting criteria recommendations (Recommendation 2) were not designed to be an all-inclusive list.
As this oversight effectiveness effort matures, the reporting criteria are expected to vary from those proposed in this report, because there may be decisions to eliminate, change, and/or add reporting criteria for oversight effectiveness. While there may be changes, these reporting criteria would be complementary to the implementation of the periodic self-assessment, and the data collected would be used to trend program health.
As mentioned, the team recommends implementing the periodic self-assessment in conjunction with the reporting criteria. Although some of the items identified as reporting criteria are already tracked via other methods, the implementation of a standardized format across the BLs will allow for consistency across the NMSS inspection programs. Therefore, the team recommends that NMSS P&P 6-11 be revised to include guidance on how to perform the self-assessment and the reporting criteria. If this recommendation is accepted, a working group comprised of HQ and Regional staff would need to be established to assess the resources required and develop an implementation plan.
Table 1 provides the team's proposed reporting criteria along with the recommended frequency and associated reference information. Periodic reporting of selected data should be captured by Regional staff and saved in a consolidated location to support the completion of the self-assessment. The team recommends that either a SharePoint site or a dashboard be developed for tracking the reporting criteria. This will also reduce the burden of gathering the information for periodic assessments, as the data reported by the Regional staff will be documented at a specified frequency. The frequencies recommended in Table 1 are either quarterly (Q) or semi-annually (SA). The recommendation for quarterly reporting mimics the inspection reporting guidance set forth in the previously referenced February 19, 2021, memorandum (ML21048A030). As this memorandum established quarterly reporting requirements, the criteria with similar attributes were assigned the same quarterly reporting requirement.
Therefore, the team recommends implementing common reporting criteria requirements to develop a clear and consistent framework for data reporting, as referenced in Recommendation 2 in Section I, Executive Summary. Once these common reporting criteria are developed, the team feels that the criteria and self-assessments should be incorporated into NMSS P&P 6-11.

Table 1, Reporting Criteria Recommendations, is provided below.
Table 1 - Reporting Criteria Recommendations

| # | Criteria | Description | Frequency | Proposed Leads | Reference Documents |
|---|---|---|---|---|---|
| 1 | Missed Inspections | Failure to conduct inspection within specified period (beyond established grace periods). Inspection(s) missed due to poor planning, scheduling, or other controllable reasons. | Q | Lead inspector or team leader for inspection staff branch. | IMCs for each BL: IMC 2600, 2690, 2800, 2561, 2602, 2801; February 19, 2021, memorandum (ML21048A030) |
| 2 | Decision to Launch Reactive Inspections | MD 8.3 decisions completed resulting in reactive inspections. | Q | Lead inspector or team leader for inspection staff branch. | IMC 2601, Reactive Inspection Decision Making Process for Fuel Facilities; IMC 1301, Response to Radioactive Material Incidents That Do Not Require Activation of the NRC Incident Response Plan; MD 8.3 |
| 3 | Rescheduled Inspections / Incomplete Inspections | Attempted inspections, employee illness, unanticipated leave, severe weather events, travel disruptions, or other reasons outside the control of the inspector. | Q | Lead inspector responsible or team leader for inspection staff branch. | February 19, 2021, memorandum (ML21048A030) |
| 4 | Escalated Enforcement Issues | Safety-significant escalated actions, including those with civil penalties. | Q | Lead inspector responsible for inspection or Regional Office of Enforcement staff. | Enforcement Manual and Enforcement Policy |
| 5 | Unique / First-time Violations | Unique violations identified during inspections that could have a programmatic impact. | SA | Technical Assistant (TA) for BL | N/A |
| 6 | Contested Violations | Contested and potentially contested violations. | Q | TA for BL | Enforcement Manual and Enforcement Policy |
| 7 | Non-concurrences (NCPs) / Differing Professional Opinions (DPOs) of Inspection Findings (Reports), Inspection Guidance, TIs, or other program documents | NCPs/DPOs generated for each BL. | SA | TA for BL | MD 10.159, NRC Differing Professional Opinion Program; MD 10.158, NRC Non-concurrence Process |
| 8 | Inspection Frequency Changes | Approved IMC Deviations for each BL. | SA | Inspector team lead for responsible branch or Project Manager for the site. | IMCs for each BL: IMC 2600, 2690, 2800, 2561, 2602, 2801 |
| 9 | Findings with Generic Communication Potential | Inspection findings that have a potential generic impact. | Q | TA for BL | N/A |
| 10 | TARs associated with Inspection Activities | TARs generated for each BL. | Q | Inspector team lead for responsible branch or Project Manager for the site. | NMSS Procedure 70-09-00, Processing of Technical Assistant Reports (ML17080A508) |
| 11 | Use of Very Low Safety Significance Issue Resolution (VLSSIR) | Instances where VLSSIR has been used by each BL. | SA | Lead inspector or team leader for inspection staff branch. | NMSS VLSSIR Working Group Report and Implementation Guidance; IMCs for each BL: IMC 2600, 2690, 2800, 2602, 2801 |
VIII. Inspection Program Timeliness

Timely execution of the inspection programs is essential to ensure compliance with established standards and to uphold public trust. The team reviewed the guidance in each of the IMCs for scheduling and inspection frequencies. Following the review, the team determined that each inspection program effectively defines the appropriate oversight and inspection frequency for each BL, guided by risk insights. The respective inspection programs allow for flexibility in scheduling the inspection within the required frequency for efficiency of resources. The following subsections describe the inspection program timeliness requirements by BL. The team found the established timeframes are appropriate throughout each of the inspection programs except for inspection report documentation. However, the recommendation for documentation timeliness is explained further and captured under Section X, Inspection Report Documentation.

Therefore, the team is not recommending any changes to inspection program timeliness.
A. DLLW

Annual inspections are required at each site in accordance with IMC 2602 and IMC 2801. A deviation from this can be made based on licensee performance so that the inspections occur more frequently or less frequently. The decision to change the frequency from an annual inspection is required to be documented via a memorandum to the docket file and placed in ADAMS. The Regional Branch Chief and the Project Manager for the site are responsible for making this decision and reporting it to management. IMC 2561 establishes eight phases of decommissioning for a reactor and identifies core and discretionary IPs for each phase. The core IPs for the applicable reactor decommissioning phase are required to be completed annually, and the discretionary IPs are completed on an as-needed basis. IMC 2602 and IMC 2801 also identify core and discretionary procedures associated with those inspection programs, but rather than having multiple phases of decommissioning, these two IMCs focus inspection efforts on risk modules (RMs). RMs are defined as program areas, identified in each IP, that present higher risk or are expected to effectively reduce risk to health, safety, and security, allowing inspection efforts to be focused on these particular program areas. The ranking of the RMs varies based on the stage of work at the licensed site. After reviewing the DLLW IMCs, the team is not recommending any changes with respect to inspection program timeliness.
B. FFBL

The annual inspection cycle requires completion of the Core Inspection Program, as specified in IMC 2600, Appendix B, NRC Core Inspection Requirements. Except where deviations are approved, these inspection activities are completed annually.
IMC 2600 also recommends inspections be completed on an as-needed basis when an activity or event occurs at the facility. This is specified in the guidance section of specific IPs. IMC 2600, Appendix B provides tables for IPs and associated frequencies for fuel fabrication facilities, other fuel facilities, and as-needed IPs.
C. NMU

The inspection frequency in NMU is determined by the primary program code assigned to the license. While a license may have various program codes, the program code with the highest inspection priority is assigned as the primary code to ensure the licensee is inspected at the appropriate inspection frequency commensurate with the higher-risk activities authorized in the license.
Routine inspections are to be conducted at the designated inspection frequency as described in IMC 2800. It also provides flexibility for scheduling those inspections to support resource efficiency. For licensees that are to be inspected every year (inspection priority 1), the region may schedule that inspection within 6 months before or after the inspection due date and the inspection will still be considered timely. The rest of the inspection frequencies may be scheduled within a year before or after the inspection is due.
NMU tracks the timeliness of inspections with a CBJ performance indicator (see Section V, Program Metrics), and it continuously monitors the timeliness of inspections via dashboards (see Section XV, Dashboards).
The inspection priorities assigned to materials inspections are risk-informed with those higher risk activities being inspected more frequently than lesser risk activities.
Based on this overarching evaluation of the inspection programs, the team is not recommending any changes to inspection timeliness in the NMU.
D. SFST

Inspection frequencies are specified in IMC 2690, Appendix A, Inspection Program Guidance for ISFSIs. While certain activities or events at a facility require quarterly inspections, other inspections are required either annually or even on a triennial cycle. It should be noted that the triennial cycle no longer aligns with the ROP cycle, as the ROP is now on a quadrennial cycle. IMC 2690, Appendix A also provides detailed guidance on milestone-based inspections, which have no-later-than dates for associated IPs. Based on this evaluation of this inspection program, the team is not recommending any changes.
IX. Inspection Program Completion

The team reviewed each inspection program and found that each BL considers the inspection program complete when all applicable inspection requirements are met. As referenced in Section V, Program Metrics, the BLs track their CBJ metrics to determine if they are meeting the requirements of their inspection program.
The team found that there is a difference among the BLs between IP completion and inspection completion, and this difference should be clearly defined. Currently, many of the IPs define completion as when the associated objectives and requirements are satisfied. If an inspection requires multiple IPs, the inspection is complete when each applicable IP's objectives are met. A routine inspection without escalated enforcement issues would normally be closed with the issuance of the inspection report within 30 days (a non-team inspection) or within 45 days (team inspection) of the final exit meeting, in accordance with IMC 0610, Section 4.05, Report Timeliness. Reports involving escalated enforcement are required to be issued within 160 days after the final exit (without investigation) or within 330 days after the final exit (with investigation).
The team found there is no terminology consistency across the IMCs related to what constitutes:

- 1) completion of the inspection program;
- 2) completion of an inspection (inspection activities); and
- 3) completion of an IP.
None of the IMCs define all three of these endpoints, and the definitions of those that are defined are not consistent across the BLs. Since there are metrics whose timelines start at certain points of an inspection, such as the escalated enforcement clock starting after the final exit for an inspection, it is important that these endpoints be clearly defined in a consistent way across the BLs.
Therefore, the team recommends (Recommendation 5) the development of standard terminology among all IMCs to define the completion of an inspection program, the completion of an inspection, and the completion of an IP to ensure alignment with the Enforcement Manual and the existing metrics framework.
A. DLLW

The IMCs (2801, 2602, and 2561) each state that an IP is complete when the NRC staff determines that the objectives of the core or discretionary IPs have been met.
IMCs 2801 and 2602 also state that an inspection is complete when all the IPs selected for the inspection are complete. To ensure consistency, the team recommends that IMCs 2801, 2602, and 2561 be updated to include the updated definitions in Recommendation 5.
B. FFBL

Except where deviations are approved, IMC 2600 defines core inspection program completion for an annual inspection cycle to include each core IP listed in IMC 2600, Appendix B. Within IMC 2600, deviations are defined as changes to the inspection program that result in not meeting the inspection completion as defined in the individual IP, including not completing the inspection at the frequency specified in Appendix B.
While IMC 2600 does not state specifically what constitutes inspection completion, IMC 0616, Fuel Cycle Safety and Safeguards Inspection Report, defines inspection completion as the day of the onsite exit meeting or the day of the last re-exit meeting, whichever is later. For FFBL specifically, IMC 0616 already has alignment with the enforcement guidance for inspection completion. However, IMC 2600 should be updated along with the other BLs in the development of standard terminology and definitions.
C. NMU

Since materials inspections are conducted on a rolling basis, overall inspection completion is captured by the inspection completion program metric. The inspection is not considered complete upon holding a preliminary onsite exit meeting; the inspection end date is when the final exit meeting is held. For recently updated procedures with RMs, the inspector assesses the applicable RMs identified in the IP for inspection completion. The RMs carrying the highest risk components should always be completed. There are also IPs with focus elements (FEs). For these, the emphasis of the inspection is on the FEs listed in the procedure; the IP is complete when all applicable areas addressed in the FEs have been assessed. Highlighting some of these differences, NMU would also benefit from updating IMC 2800, where applicable, for alignment with the standard terminology and definitions recommended by the team.
D. SFST

Inspection Manual Chapter 2690 defines inspection program completion as the completion of all IPs, including both initial and recurring frequency inspections. Within IMC 2690, guidance directs that planned deviations from the inspection frequency requirements due to an inspection deferral be communicated quarterly in a memorandum to the DFM/IOB Branch Chief. Deviations are defined as not completing all IP requirements as defined in the individual IP or not completing the IP within the required frequency as defined in IMC 2690, Appendix A or in Appendix B, Inspection Program Guidance for Transportation Packaging.
Inspection guidance requires inspectors to conduct an exit meeting at the conclusion of an inspection. Therefore, while not explicitly defined, inspection completion is the day of the onsite exit meeting or the day of the last re-exit meeting, whichever is later.
This aligns with the enforcement guidance. Therefore, IMC 2690 should be considered for revision to align with standard terminology and definitions as recommended.
X. Inspection Report Documentation

The team evaluated the guidance for inspection report documentation requirements applicable to each BL. While most of the programs require a narrative report, NMU may issue inspection reports in the field, documenting results using NRC forms. This difference exemplifies a challenge faced by the team throughout this assessment effort.
The team reviewed the inspection report documentation guidance captured under IMC 0610. The purpose of this IMC is to provide guidance on inspection report content and format for NMSS inspection reports. The team noted the last revision was issued in 2004.
The team is recommending that a separate working group be established to revise IMC 0610 (Recommendation 4). There are several sections of IMC 0610 requiring an update, and the update should be consistent with the recommendations in this report. In addition, the team recommends that inspection report timeliness be reviewed and updated to ensure it is consistent across all the BLs. The current guidance leaves flexibility for Offices or Regions to implement different report timeliness goals that may not align with complementary processes in the agency, such as enforcement and IMPEP evaluation criteria. In addition, the IMC 0610 revision should clarify that the final exit meeting starts the timeline for inspection report issuance.
A. DLLW

The DLLW IMCs (2801, 2602, and 2561) all reference use of IMC 0610 for inspection report generation. The team recommends the timely revision of IMC 0610 to ensure other programmatic topics, such as clarifying inspection report timeliness, are reflected in this guidance in accordance with Recommendation 4.
B. FFBL

Inspectors document inspections in accordance with the requirements of IMC 0616. Inspectors document inspection results either as inputs to integrated inspection reports or in stand-alone inspection reports. All inspection documentation shall be filed on the docket following supervisory review. Stand-alone inspection reports are typically issued no later than 30 calendar days after inspection completion. IMC 0616 defines inspection completion as the day of the onsite exit meeting or the day of the last re-exit meeting, whichever is later. Along with the IMC 0610 updates, the team recommends that IMC 0616 be updated in accordance with Recommendation 4.
C. NMU

When documenting an NMU inspection, the inspector has the option either to complete a narrative inspection report to document inspection results or to use NRC Forms 591M, Materials Inspection Report, and 592M, Materials Inspection Record. Inspection results may be reported to the licensee either by issuing a letter or by using an NRC Form 591M. The inspector uses an NRC Form 592M to document the scope of the inspection, identify points of contact, and share inspection notes any time a narrative report is not required, including for non-escalated violations. Detailed information on when it is appropriate to use NRC Forms 591M and 592M in lieu of a narrative inspection report or a letter (with or without a formal notice of violation) is provided in IMC 2800.
Findings documented in a narrative inspection report must include a cover letter to inform the licensee of the results of the inspection. NRC Form 591M shall not be used to transmit non-cited or cited security-related violations pursuant to 10 CFR Part 37. These inspection results must be transmitted via a letter, which must include proper markings, as appropriate. NRC Form 591M is not used to transmit a cited violation where the corrective actions taken by the licensee include an amendment of the license. The inspector may present the completed and signed NRC Form 591M to the licensee either prior to leaving the site or from the office.
Therefore, Recommendation 4 for IMC 0610 revision should also address the use of NRC Forms 591M and 592M.
D. SFST

Inspection reports are completed by either the Regional inspectors or the HQ inspectors per IMC 2690. Regional inspection reports are documented in accordance with the guidance in IMC 0611, Power Reactor Inspection Reports, and are normally inputs to the resident inspectors' quarterly integrated reports. For those not integrated into the resident inspectors' quarterly reports, inspectors document inspection reports in accordance with IMC 0610. Both the Regional and HQ inspection results shall contain the relevant 10 CFR Part 72 docket number and, for specific licensees, the license number. These reports shall be transmitted to licensees in accordance with Regional and HQ requirements, respectively. Therefore, the updates to IMC 0610 would affect SFST and should be considered for alignment with the updates as proposed in Recommendation 4.
XI. Program Changes

Across each of the BLs, there are variations in terminology and definitions for what is more broadly referred to as a programmatic change in this section. Along with these differences, there is also differing guidance on expected communication and required communication. Despite this, the team found that decisions regarding programmatic changes are documented in accordance with each manual chapter.
In the February 19, 2021, memorandum, Actions Required to Maintain the Effectiveness of Office of Nuclear Material Safety Inspection Programs (ML21048A030), the term program adjustment was defined, and the expectation was established that this term be included in each IMC with set roles and responsibilities. The use of program adjustment was controversial among the different BLs because some of the IMCs already included program adjustment with an alternate definition. Therefore, the team is recommending a new term that is not defined in any IMC to meet the intent of the February 19, 2021, memorandum expectation (Recommendation 6).
The newly proposed term, IMC Deviation, is defined as any change to the inspection program made by Regional management that is not consistent with the guidance contained in the IMC. An IMC Deviation should be documented through a memorandum from the appropriate management to the appropriate BL Division Director and placed in ADAMS. IMC Deviations may include, but are not limited to, program adjustments, deviations, and variances. Additionally, IMC Deviations may include changes such as: intervals between inspections (beyond established grace periods), level of effort for inspections, methods for inspection, or the level at which decisions concerning adjustments are made.
A. DLLW

Currently, IMCs 2602 and 2801 use the term program adjustment in accordance with the definition detailed in the February 19, 2021, memorandum. IMC 2561 does not use program adjustment or IMC Deviation; rather, it discusses the modification of MIPs to decrease or increase the level of inspection effort for a licensee, if warranted. If the new definition of IMC Deviation is adopted, IMCs 2602, 2801, and 2561 would need to be updated (Recommendation 6).
B. FFBL

In IMC 2600, a deviation is defined as a change to the inspection program that results in not meeting the inspection completion as defined in the individual IP, including not completing the inspection at the frequency specified in Appendix B. Also, within IMC 2600, a program adjustment is defined as a change to the MIP based on periodic adjustments from LPR results or in response to other events or activities as determined by management.
Because deviation and program adjustment are already defined terms in the FFBL program document, the FFBL was unable to adopt the term program adjustment as defined in the February 19, 2021, memorandum. The team's recommendation to define a new term, IMC Deviation, allows the FFBL to be consistent with the other BLs and should be included in the update to IMC 2600.
C. NMU

Any actions or decisions associated with the implementation of the inspection program taken by Regional management that are not consistent with IMC 2800 are considered alterations to the implementation of the inspection program. Such alterations must be approved by Regional management, documented, coordinated, and communicated to the MSST Division Director. This guidance can be found in Section 2800-13, Coordination and Reporting for Alterations in the Implementation of the Inspection Program. With this recommendation, IMC 2800 should also be updated to include the new terminology, IMC Deviation.
D. SFST

In IMC 2690, a deviation is defined as either not completing all IP requirements as defined in the individual IP or not completing the IP within the required frequency as defined in IMC 2690, Appendix A or B. Therefore, IMC 2690 should be updated to include the new terminology and definition for IMC Deviation.
XII. Program Changes Reporting - Decision-Making Authority

The team found that most program change decisions are made at the Regional level, but it is not always clear in each IMC how these decisions are communicated to the BL owners and documented.
The team recommends enhancing each IMC to make clear how to document reportable items already found in the IMC, such as deviations, changes to inspection plans, and the newly termed IMC Deviations. In addition to Recommendation 6, the standard terminology alignment in Recommendation 5 would also contribute to uniform terminology and practices that clearly establish communication expectations and requirements. Furthermore, the team proposes Recommendation 7, such that any IMC Deviations be reported to BL owners through memoranda.
A. DLLW During recent revisions, IMCs 2602 and 2801 were modified to change who makes decisions regarding program changes. Program change decisions are now made by the Regional branch chiefs in consultation with the HQ Project Managers. IMC 2561 states that Regional management makes the decision to modify the MIP to increase or decrease inspection effort. The team recommends that IMCs 2602, 2801, and 2561 be updated to incorporate Recommendations 5 and 6. IMCs 2801 and 2602 already require program changes to the inspection program to be documented via a memorandum to the docket file explaining why the change was made. IMC 2561 would require a revision to include documentation of changes to the inspection program.
B. FFBL Program deviations are expected to be communicated to the DFM Director via a memorandum from the Division of Fuels, Radiological Safety, and Security (DFRSS) Director. IMC 2600 was modified to provide the Regional office some flexibility in adjusting the frequency, focus, and intensiveness of inspections for different functional areas at facilities. Aside from periodic adjustments associated with LPR results, other adjustments may occur in response to other events or activities and are expected to be coordinated between the DFRSS and DFM branch chiefs. Any such adjustment should then be documented in a memorandum and, if the MIP is impacted, approved by the Director, DFM, and Director, DFRSS, or designees.
C. NMU As indicated in IMC 2800, decisions about the implementation of the inspection program are made at the Regional Division of Radiological Safety and Safeguards level. Even alterations to IMC 2800 are made at the Regional level in coordination with the BL manager (MSST Director). IMC 2800 does not provide specifics for how these decisions should be reported or communicated; it only directs that the docket be updated with such information and that reporting and communication to the BL manager occur as soon as practicable, but not later than the next quarterly performance management. The team recommends that IMC 2800 be updated to incorporate Recommendations 5, 6, and 7.
D. SFST Deviations and partial remote inspections are expected to be communicated to the IOB Branch Chief. Full remote inspections and significant increases in the level of effort or inspection frequency should be coordinated with division management. The team found that deviations for each of the other BLs are communicated at the division level. Further, the method of communication for these situations is not clear in the IMC. Therefore, the team recommends that deviations, full remote inspections, and significant increases in level of effort or inspection frequency be communicated from Regional division management to HQ division management through a memorandum.
XIII. Resource Estimates The team found that none of the NMSS or NRR programs track inspection hours for the purpose of determining the completion of an inspection. Rather, the resource estimate stipulated in inspection guidance is used for budgeting and resource planning purposes.
As stated above in Section IX, Inspection Program Completion, each inspection program considers the inspection program complete when all applicable inspection requirements are met.
Specifically, IMC 2561 states that the resource estimate is provided for planning purposes only, and deviations from the estimate should be made based on licensee performance, multi-unit site considerations, resident inspection activities, the type and schedule of decommissioning activities being conducted by the licensee, and the radiological source term present at the site. Further, IMC 2515, Light Water Reactor Inspection Program - Operations Phase, states that inspection requirements are the controlling factor in determining the amount of inspection effort necessary to complete the baseline inspections. Appendix A provides an estimate of the hours associated with each IP for overall resource planning only.
The NMSS inspection programs differ in how resource estimates are indicated in IPs; for example, some programs provide a set number of hours, while others use a range of hours that may be small or large. Some procedures do not include a resource estimate at all. However, the team found that each program is purposely designed to give inspectors the guidance needed for the program-specific inspections being conducted. For those programs that have identified a need to include a resource estimate, the team recommends that the resource estimate be a range of hours (Recommendation 8). This aligns with a recommendation from the most recent ISFSI Inspection Program Self-Assessment (ML23101A086), in which the team recommended that IP 60855, Operation of an Independent Spent Fuel Storage Installation, be revised to include a 15 percent range in the resource estimate.
As part of the continuous monitoring of the ROP, NRR uses dashboards as a tool for data trending in several areas, including Inspection Hours Charged by Site, Baseline Inspection Hours Charged, and Samples Completed and Findings. The ROP self-assessment lead monitors these dashboards monthly, and each baseline IP lead reviews the inspection hour and finding data for their specific IP on a quarterly basis.
Should any significant trends or insights be identified, the ROP self-assessment lead provides the data to the appropriate program area lead for further analysis and action.
Any identified significant trends or insights from the ROP data trending efforts are discussed as part of the ROP yearly self-assessment briefing.
Based on the information collected, the team recommends adding resource monitoring to the proposed periodic self-assessment (Recommendation 1). Monitored resources would include items such as budget execution, full-time equivalent utilization, staffing levels, travel funds, and the time it takes to refill vacant positions. Monitoring budget execution, along with program feedback from the inspectors, will allow the BLs to make any necessary changes to the resource estimates specified in the inspection guidance.
Currently, SFST and FFBL have inspection hour dashboards in place to allow staff to monitor the resources being used. NMU has numerous dashboards that allow staff to monitor inspection status, eligible inspections, and timeliness of inspections. DLLW has plans to start developing similar dashboards soon.
A. DLLW Regional staff tracks the conduct of inspections for decommissioning and uranium sites using a MIP specific to the sites in that Region. Each IP applicable to the licensee is identified and, if reviewed during the inspection, is marked either complete or partially complete. MIP details are shared with the HQ Project Manager.
Inspection hours are listed in the IPs as a range of hours and are not currently tracked. Dashboards associated with inspection program implementation need to be developed and implemented for DLLW.
B. FFBL Currently, FFBL has the Fuel Facilities Hours Tracking dashboard in place, an operational-level dashboard used by inspectors and branch chiefs to track hours charged for each IP, whether those hours are within or outside the range of the resource estimates, hours charged for each site, and similar data. Because a dashboard is already in use, the team recommends no changes.
C. NMU IPs that were updated as part of the IMC 2800 Phase 3 effort include a resource estimate section that serves as a guideline for inspectors. This section includes a range of hours to implement that IP due to the variance in complexity among materials licensees covered by the same IP. For materials, tracking inspection hours is not a good indicator of inspection completion because there is no cost activity code (CAC) generated per inspection. Inspectors charge preparation, travel, inspection, and documentation time to the corresponding CAC, which, for fee purposes, is set up at the fee category level.
Because it is not possible to track inspection hours, the team is not recommending a dashboard be created.
D. SFST Currently, SFST has the SFST Inspection Hour dashboard in place, an operational-level dashboard used by inspectors and branch chiefs to monitor the number of inspection hours charged to each individual inspection. Because a dashboard is already in use, the team recommends no changes.
XIV. MD 8.3 Decisions The team found that not all the IMCs include a reference to MD 8.3. As summarized in Recommendation 9, the team recommends updating the IMCs to include a reference to MD 8.3 and to include guidance for making decision documentation publicly available, whenever possible.
The Office of the Inspector General (OIG) issued OIG-23-A-06, Audit of the US NRC's Processes for Deploying Reactive Inspection Teams (ML23157A268), on May 10, 2023.
From the audit, Recommendation 1.1 stated: Update agency policies to require that staff provide complete information on screening evaluation forms, correctly profile evaluation forms in ADAMS, and publicly share non-sensitive reactive inspection screening decision-making, whenever possible. To be consistent across the agency, NMSS needs to ensure MD 8.3 decisions are publicly available except for those containing sensitive information.
A. DLLW For incident investigations, IMCs 2602, 2801, and 2561 do not specifically reference MD 8.3. However, IMCs 2602 and 2801 mention that reactive or special inspections can be performed based on incidents or accidents onsite. The team recommends that the DLLW IMCs reference MD 8.3 and direct that investigation decisions be made publicly available, whenever possible.
B. FFBL Plant-specific reactive inspections are described in IMC 2600. IMC 2601 provides guidance for implementing the requirements prescribed in MD 8.3.
Specifically, it includes detailed guidance consisting of both deterministic criteria and risk-informed insights that can be used as a decision basis for implementing Incident Investigation Teams, Augmented Inspection Teams, and Special Inspection Teams.
Additionally, Regional Office Instruction Number 0.704, Documenting Management Directive 8.3, NRC Incident Investigation Program, Reactive Inspection Decisions in the Division of Fuel Facility Inspection, supplements the guidance contained in IMC 2601 and MD 8.3 for reviewing plant events and documenting the decision regarding event follow-up inspections. Currently, the records documenting reactive inspection decisions are non-publicly available in ADAMS. Since these documents typically contain sensitive information, maintaining them as non-publicly available would be consistent with the recommendation. However, a high-level, sanitized checklist could be generated to provide a publicly available record of decisions. IMC 2601 does not currently require MD 8.3 decisions to be made publicly available. Despite these limitations, the team recommends updating IMC 2601 to make MD 8.3 decisions publicly available, whenever possible.
C. NMU IMC 2800 references MD 8.3, along with the IMCs and IPs applicable to the implementation of the materials inspection program. However, IMC 2800 is silent on whether MD 8.3 decisions need to be made publicly available. Therefore, the team recommends updating IMC 2800 to include guidance for MD 8.3 decisions to be made publicly available, whenever possible.
D. SFST The guidance in IMC 2690 includes the conduct of supplemental or reactive inspections. The guidance states that events should be evaluated for significance to determine if any additional inspection effort is warranted. Further, it references MD 8.3 as guidance to document the decision and approval for conducting a reactive inspection. IMC 2690 does not require MD 8.3 decisions to be made publicly available, whenever possible, and should be updated to include this guidance.
XV. Dashboards The team found that the NMSS BLs vary in their use of dashboards. The team recommends the use of dashboards to assist in the implementation of the inspection programs and in data collection for the proposed periodic self-assessment. The team recognizes that dashboards are under development for the inspection programs and recommends that a review of dashboard effectiveness be included in the BL self-assessments. As this is captured in Recommendation 1 and Section VI, there are no additional recommendations for this section.
A. DLLW Currently, DLLW has dashboards on the DUWP Operations and Management Hub SharePoint site to gather data, such as the CBJ metrics, quarterly reports, and workload planning. HQ personnel collect this data from Regional staff and manually input it into the dashboards; there is no direct input of data from Regional staff to these dashboards. Given the absence of NRC Low Level Waste licensees, there is no need for dashboards in that area. Dashboards related to the implementation of the inspection programs need to be developed and implemented to better monitor the health of the inspection program.
B. FFBL Currently, FFBL has the Fuel Facilities Hours Tracking dashboard in place, an operational-level dashboard used by inspectors and branch chiefs to track hours charged for each IP, whether those hours are within or outside the range of the resource estimates, hours charged for each site, and similar data.
C. NMU The NMU has been using dashboards that show licensees eligible for inspection to assist in inspection planning, the status of materials inspections, and the timeliness of materials inspections. Feedback from HQ and the Regions on the use of these dashboards has been positive.
D. SFST Currently, SFST has two dashboards in place, SFST Inspection Hours and ISFSI Inspection Schedule. The SFST Inspection Hour dashboard is an operational-level dashboard used by inspectors and branch chiefs to monitor the number of inspection hours charged to each individual inspection. The ISFSI Inspection Schedule dashboard is an operational-level dashboard used by inspectors and branch chiefs to view all ISFSI-related inspections, whether completed or upcoming, in one location.
XVI. Summary of Findings This section includes a comprehensive overview of the key findings and recommendations derived from the team's evaluation. The findings highlight the suggested areas for improvement and identify challenges that should be addressed. The recommendations are designed to enhance the effectiveness and efficiency of the inspection programs in NMSS. The team found the following:
Finding 1: Each BL has metrics for tracking the number of inspections and reports these at the CBJ level. The team found the CBJ level and Regional level metrics to be sufficient. However, the team recommends a periodic NMSS inspection program self-assessment per NMSS P&P 6-11 to help each BL use leading indicators to evaluate the health of its program.
Finding 2: Common reporting criteria do not exist to assess the health of the inspection programs across the BLs. While there are differences between the inspection programs, the team identified criteria that may be used to assist in maintaining awareness of the inspection programs and potential programmatic impacts. The team utilized IMCs, MDs, and other agency guidance documents to suggest a set of common reporting criteria and requirements.
Finding 3: The implementation of inspection program feedback in accordance with IMC 0801 varied across the BLs. To ensure common implementation, adding a section for Inspection Program Feedback referencing IMC 0801 would allow for further improvements and adjustments to the applicable inspection programs; NMU is excluded from this recommendation.
Finding 4: There are differences among the BLs in inspection report documentation. While some of the BLs require narrative reports for every inspection, others do not. IMC 0610 is outdated and does not capture the agency's current practices.
Although previous efforts identified gaps in IMC 0610, there has not been a coordinated effort to implement these changes.
Finding 5: The team found there is inconsistent terminology across the IMCs related to what constitutes: 1) completion of the inspection program, 2) completion of an inspection (inspection activities), and 3) completion of an IP.
Finding 6: The February 19, 2021, NMSS memorandum (ML21048A030) mandated the use of the term program adjustment, but this term is already in use with a different meaning in some of the BLs. Establishing the new term, IMC Deviation, would provide consistency and uniformity across the BLs for the collection of the inspection program information detailed in the NMSS memoranda.
Finding 7: Most of the decision-making authority associated with the inspection program resides at the Regional level. However, there is variation across the BLs in how these decisions are to be documented.
Finding 8: The NMSS inspection programs do not track inspection hours for the purposes of determining the completion of an inspection. The resource estimates in the IMCs are used for budgeting and resource planning purposes.
Finding 9: The team found inconsistencies across the BLs in reference to incident investigations. Some IMCs do not include a reference to MD 8.3. In addition, the BLs could improve uniformity in making the decision records for incident investigations publicly available, whenever possible.
XVII. Conclusions The team concluded that the NMSS inspection programs are very satisfactory.
Throughout the years, each of the inspection programs has been established, implemented, and matured in alignment with the NRC mission, supporting the safe use of nuclear materials within the licensed community. These inspection programs have not always been under the same organizational structure, and because of this, the BLs have not previously been evaluated holistically all at once. In addition, the available technology tools (e.g., software, data systems, data visualization) have greatly improved, making oversight of licensed activities more efficient to implement. Utilizing these tools allows staff to more easily monitor, assess, and gather information related to implementation of the inspection programs. The recommendations in this report are intended to leverage technology tools that provide better awareness of the successes and challenges of inspection program implementation.
Although the team found that the guidance in the various IMCs differs in terminology, structure, and detail, each program provides the necessary guidance and structure to support appropriate oversight of its respective licensed activities. It is important to emphasize that the team did not find any major deficiencies or flaws in the current guidance that provides the framework for the inspection programs. The existing guidance still supports the effective implementation of inspection programs in NMSS, and the recommendations provided are intended to improve awareness and decision-making. The team believes the nine recommendations presented in this report will bring uniformity across the BLs and move us closer to the One NMSS vision without imposing an undue burden on the inspection staff.