| ML23305A290 | |
|---|---|
| Issue date: | 12/31/2023 |
| From: | Adelaide Giantelli, NRC/NMSS/DMSST |
| To: | Kevin Williams, NRC/NMSS/DMSST |
| Shared Package: | ML23305A231 |
ASSESSING THE ABILITY TO MONITOR NATIONAL MATERIALS PROGRAM PERFORMANCE

U.S. Nuclear Regulatory Commission
Office of Nuclear Material Safety and Safeguards
Division of Materials Safety, Security, State, and Tribal Programs

December 2023
EXECUTIVE SUMMARY
Based on the charter approved by the Director of the Division of Materials Safety, Security, State, and Tribal Programs and the Chair of the Organization of Agreement States, a 15-person working group composed of staff from across the U.S. Nuclear Regulatory Commission (NRC) and four Agreement States was assembled to assess whether the current Integrated Materials Performance Evaluation Program (IMPEP) process provides for a proactive assessment of the performance of the National Materials Program (NMP) radiation control programs (RCPs).
The working group was organized into two subgroups: a data group and an interview group.
Information collected by the subgroups was reviewed, evaluated, and shared with the entire working group to develop overall recommendations. As the working group performed its charter activities of reviewing self-assessments, analyzing IMPEP data, and interviewing IMPEP stakeholders, it focused on recommendations that fell into two distinct categories:

Enhancing awareness of RCPs' performance, including metrics used to identify programs with declining performance or performance challenges; and
Improving the IMPEP assessment of an RCP's performance.
Through its review, the working group identified five recommendations for consideration by management: three associated with enhancing awareness of RCPs' performance and two with improving the IMPEP assessment. Each recommendation offers potential action items for leadership consideration. The recommendations and action items are discussed in detail in Section 4.0 of this report.
Table of Contents

EXECUTIVE SUMMARY
1.0 INTRODUCTION
2.0 PURPOSE AND OBJECTIVES
3.0 SCOPE AND METHODOLOGY
 3.1 Review of NMP Performance from CY2018 to CY2022
 3.2 Consistency of RCP Reviews through IMPEP
 3.3 Evaluation of NMP Reporting
  3.3.1 CBJ Metrics
  3.3.2 Annual Report to the Commission
 3.4 Improving Awareness of Performance Issues
  3.4.1 Promote Strong NMP Relationship
  3.4.2 Self-Audit Tool
  3.4.3 Periodic Meetings
  3.4.4 RSAO Role
  3.4.5 Timely Assistance when Issues are Self-Identified
  3.4.6 Proactive Assessment of NMP RCPs' Performance
  3.4.7 Consistency of RCP Reviews
  3.4.8 IMPEP Training
4.0 RESULTS AND RECOMMENDATIONS
 4.1 Response to Questions Posed in the Working Group Charter
 4.2 Recommendations
  4.2.1 Enhancing Awareness of Radiation Control Programs' Performance
  4.2.2 Improving the IMPEP Assessment of a Radiation Control Program's Performance
APPENDIX A: WORKING GROUP MEMBERS
APPENDIX B: INDIVIDUALS INTERVIEWED
APPENDIX C: OTHER OPTIONS CONSIDERED BY THE WORKING GROUP
1.0 INTRODUCTION
The National Materials Program (NMP) radiation control programs (RCPs) continue to provide adequate protection of public health and safety and to maintain compatibility with the NRC's regulatory program.
Over the last 5 calendar years (CYs) (2018 to 2022), Agreement State and U.S. Nuclear Regulatory Commission (NRC) staff have conducted 51 Integrated Materials Performance Evaluation Program (IMPEP) reviews and evaluated 338 individual performance indicators. Of the 338 performance indicators reviewed during that time frame, 299 were found satisfactory (SAT), 26 satisfactory but needs improvement (SBNI), and 13 were found unsatisfactory (UNSAT). This equates to approximately 88 percent satisfactory, 8 percent satisfactory but needs improvement, and 4 percent unsatisfactory, respectively.
In CY 2022, there was a significant increase in the number of unsatisfactory performance indicator ratings, with 8 of the 13 unsatisfactory performance indicators noted above occurring in that year (see Table 1 below). As a result, the NRC did not meet the 2022 Congressional Budget Justification (CBJ)
NM-23 performance metric of zero (0) percent of RCPs having more than one unsatisfactory performance indicator rating. Table 1 summarizes the unsatisfactory ratings by RCP, review year, and performance indicator. Consequently, it appears that there may be an emergent trend in the number of unsatisfactory ratings.
Table 1 - Performance Indicators

| RCP Program | IMPEP Review | SMIP¹ | TQI² | TQLA³ | TQIAA⁴ | LROPE⁵ |
|---|---|---|---|---|---|---|
| Arkansas | 2018 | SAT⁶ | SAT | UNSAT⁷ | SAT | SAT |
| Kansas | 2018 | SAT | SAT | SAT | UNSAT | SAT |
| New York | 2018 | SAT | SAT | SAT | SAT | UNSAT |
| Florida | 2019 | SAT | SBNI⁸ | SAT | SAT | UNSAT |
| Rhode Island | 2021 | SAT | SAT | UNSAT | SAT | SAT |
| Mississippi | 2022 | UNSAT | UNSAT | UNSAT | UNSAT | SBNI |
| New York | 2022 | SAT | SAT | SAT | SAT | UNSAT |
| North Carolina | 2022 | SAT | SAT | SAT | SAT | UNSAT |
| Washington | 2022 | SAT | SBNI | UNSAT | UNSAT | SBNI |

¹ SMIP - Status of the Materials Inspection Program
² TQI - Technical Quality of Inspections
³ TQLA - Technical Quality of Licensing Actions
⁴ TQIAA - Technical Quality of Incident and Allegation Activities
⁵ LROPE - Legislation, Regulations, and Other Program Elements
⁶ SAT - Satisfactory
⁷ UNSAT - Unsatisfactory
⁸ SBNI - Satisfactory, but Needs Improvement
2.0 PURPOSE AND OBJECTIVES

The purpose of the working group was to assess the current IMPEP process with respect to providing for a proactive assessment of the NMP RCPs. In addition, the NRC wanted to evaluate the effectiveness of the IMPEP program to predict, identify, and reverse declines in performance indicators before an RCP's performance would result in an unsatisfactory finding by the NRC. Because of the recent declines in performance, the NRC wanted to identify potential root causes common to declines identified in recent reviews and identify leading indicators to help recognize RCPs experiencing challenges in achieving satisfactory findings for each indicator.
The NRC established a 15-person working group composed of staff from the NRC and Agreement States. A copy of the charter is available in the NRC's Agencywide Documents Access and Management System (ADAMS) under Accession Number ML22305A688. The working group was instructed to:
As appropriate, build on the recommendations from the Self-Assessment of Integrated Materials Performance Evaluation Program (IMPEP) report dated June 2010 (ADAMS Accession Number ML102030228) and the Focused Self-Assessment of Integrated Materials Performance Evaluation Program (IMPEP) dated June 2018 (ADAMS Accession Number ML17187A100).
Evaluate the results of recent IMPEP reviews and assess whether there are connections between unsatisfactory performance indicators.
Recommend changes to enhance the effectiveness of IMPEP processes (periodic meetings, legislation and regulation compatibility reviews, and IMPEP reviews, including the consistency of review team findings and recommendations).
Evaluate the need for changes, if any, in how Regional State Agreement Officers (RSAOs) participate in the IMPEP process.
Assess if the 2019 revision of Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP), led to lower performance ratings.
Determine if the pandemic impacted the RCPs performance.
As the working group performed its activities of reviewing self-assessments, analyzing IMPEP data, and interviewing IMPEP stakeholders, it focused on recommendations that fell into two distinct categories:

Enhancing awareness of RCPs' performance, including metrics used to identify programs with declining performance or performance challenges; and
Improving the IMPEP assessment of an RCP's performance.
Section 4.0 of this report provides conclusions and recommendations for NRC management to consider. In addition, Section 4.0 addresses the following questions directed to the working group in the Charter:
Considering NMP performance over approximately the last 5 years, have recent IMPEP reviews identified a trend toward more unsatisfactory performance?
Are RCPs reviewed consistently through IMPEP?
Are procedures and processes, including roles and responsibilities, sufficient to ensure effective and consistent IMPEP reviews?
Does the Annual Report to the Commission provide sufficient insights on performance trends?
Considering the IMPEP process, can it support the identification of leading indicators for performance trends?
What process changes can improve awareness of performance challenges within the NMP?
What tools are available (or needed) to detect a downward performance trend?
Can we develop leading indicators to identify declining program performance early?
Can we develop performance metrics to measure NMP performance, as allowed in SA-100?
3.0 SCOPE AND METHODOLOGY

The working group was organized into two subgroups: a data group and an interview group.
Information collected by the subgroups was reviewed, evaluated, and shared with the entire working group to develop overall recommendations.
The data subgroup examined results of recent IMPEP reviews to evaluate the effectiveness of the program and identify potential trends or other performance issues. The data subgroup considered current methods of data collection to determine whether the NRC has information sufficient to assess NMP performance and whether additional or different data might identify declining NMP performance sooner. The data subgroup also reviewed the recommendations from the 2010 and 2018 IMPEP self-assessments.
The interview subgroup developed a standard set of questions and interviewed stakeholders consisting of Management Review Board (MRB) members (including an Organization of Agreement States (OAS) Representative), NRC managers, RSAOs, NRC IMPEP Team Leaders, and NRC and Agreement State IMPEP Team Members. Additionally, the working group hosted public meetings on February 14 and 23, 2023, to collect opinions from Agreement State and NRC staff participants. No other members of the public participated in either of these two meetings.

The interview subgroup met regularly to discuss the information collected and the trends identified based on information shared by the interviewed stakeholders. The list of individuals interviewed by the subgroup can be found in Appendix B.
Following the data and information collection by the subgroups, the entire working group joined together to discuss findings and develop recommendations. These findings and recommendations are provided in Section 4.0 of this report.
3.1 Review of NMP Performance from CY2018 to CY2022

The working group evaluated NMP performance from CY2018 to CY2022 to identify potential IMPEP performance trends. The working group reviewed the overall NMP performance, as measured by IMPEP adequacy and compatibility results, and determined that adequacy and compatibility rates were typically above 90 percent (see Table 2). The working group noted that the overall compatibility percentage for 2022 was 87.2 percent (see Table 3). A summary of adequacy and compatibility rates over the last 5 years is provided in Tables 2 and 3, respectively, below.
Table 2 - NMP Performance by Adequacy

| | CY2018¹ | CY2019² | CY2020 | CY2021³ | CY2022 |
|---|---|---|---|---|---|
| Adequate | 38 | 39 | 40 | 37 | 36 |
| Adequate, But Needs Improvement | 4 | 4 | 3 | 3 | 4 |
| CY Adequacy percentage | 90.5% | 90.7% | 92.5% | 92.5% | 90.0% |

¹ In CY2018, 42 RCPs were subject to IMPEP: 38 Agreement States plus the 3 NRC regional programs and the Headquarters program, which were evaluated individually.
² On September 30, 2019, Vermont became an Agreement State. Therefore, 43 RCPs were subject to IMPEP.
³ In CY2021, the NRC started evaluating the NRC regional and Headquarters programs as one RCP. The first consolidated NRC IMPEP review was conducted in June 2021. Therefore, 40 RCPs were subject to IMPEP.
Table 3 - NMP Performance by Compatibility¹

| | CY2018 | CY2019² | CY2020 | CY2021 | CY2022 |
|---|---|---|---|---|---|
| Compatible | 36 | 36 | 37 | 37 | 34 |
| Not Compatible | 2 | 3 | 2 | 3 | 5 |
| CY Compatibility percentage | 94.7% | 92.3% | 94.9% | 94.9% | 87.2% |

¹ Compatibility only applies to the Agreement States.
² On September 30, 2019, Vermont became an Agreement State.

The working group also evaluated NMP performance, as measured by the individual performance indicator results. The NRC and Agreement State staff have conducted 51 IMPEP reviews and evaluated 338 performance indicators over this 5-year period (CY2018 to CY2022).
Each performance indicator is assigned one of three findings:
Satisfactory (SAT)
Satisfactory, but needs improvement (SBNI)
Unsatisfactory (UNSAT)
Figure 1 below provides a graphical representation of the performance indicator findings.

[Figure 1: Performance Indicator Findings, CY2018 to CY2022 (bar chart of SAT, SBNI, and UNSAT counts by calendar year)]

Based on the performance indicator review, there was an increase in SBNI and UNSAT findings in CY2022. The working group noted that NMP performance in 2022 was significantly affected by the results of two IMPEP reviews. These two IMPEP reviews accounted for 19 percent (5 of 26) of the SBNI performance indicator results and 46 percent (6 of 13) of the UNSAT performance indicator results between CY2018 and CY2022. The working group concluded that the decline in performance was the direct result of these two IMPEP reviews, which involved unique circumstances, and was not indicative of a broader trend across the NMP.
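For readers who want to trace the arithmetic, the short Python sketch below reproduces the percentage calculations behind Tables 2 and 3 and the CY2018 to CY2022 indicator totals cited in Section 1.0. It is illustrative only and is not part of the working group's analysis; all figures are taken directly from this report.

```python
# Illustrative only: reproduces the percentage arithmetic behind Tables 2 and 3
# and the CY2018-CY2022 indicator totals cited in Section 1.0 of this report.

# CY2022 adequacy (Table 2): 36 adequate, 4 adequate but needs improvement
adequate, abni = 36, 4
adequacy_pct = 100 * adequate / (adequate + abni)                      # 90.0

# CY2022 compatibility (Table 3): 34 compatible, 5 not compatible
compatible, not_compatible = 34, 5
compatibility_pct = 100 * compatible / (compatible + not_compatible)   # about 87.2

# CY2018-CY2022 performance indicator findings (Section 1.0)
findings = {"SAT": 299, "SBNI": 26, "UNSAT": 13}
total = sum(findings.values())                                         # 338
shares = {k: round(100 * v / total) for k, v in findings.items()}      # 88, 8, 4

print(f"Adequacy: {adequacy_pct:.1f}%  Compatibility: {compatibility_pct:.1f}%  Shares: {shares}")
```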
3.2 Consistency of RCP Reviews through IMPEP

The working group also reviewed the past 2010 and 2018 IMPEP self-assessments and determined that all the recommendations from those self-assessments were successfully implemented. Some of the recommendations, while previously completed, remain good practices and are relevant to some of the findings and themes identified by this working group, including the importance of:
team member, team leader, and new MRB member training;
a more performance-based approach to assessing compatibility results;
conducting materials, low-level radioactive waste, and uranium recovery inspector accompaniments; and
continuing to expand the pool of trained Team Leaders.
The working group also evaluated recent IMPEP review results to determine if the 2019 update to MD 5.6, which was first implemented in October 2020, introduced any potential or unintended impacts on the IMPEP or IMPEP review results. Based on a review of the data, the working group found that the 2019 update to MD 5.6 did not result in any observable impacts on the IMPEP or IMPEP review results. The working group also assessed IMPEP review results to determine if the pandemic had any adverse impacts on the IMPEP or IMPEP review results. Based on a review of the data, the working group did not find discernible evidence of any adverse impacts from the pandemic on the IMPEP or IMPEP review results.
3.3 Evaluation of NMP Reporting

The working group first reviewed and evaluated the ability of the current metrics to track and assess NMP performance. Specifically, the working group assessed the NMP metrics in (1) the CBJ and (2) the annual report to the Commission.
It should be noted that stakeholders and the working group unanimously considered metrics to be an important tool for tracking performance and overall programmatic health.
3.3.1 CBJ Metrics

The working group reviewed the current CBJ performance indicator (NM-23) to determine if there is room for improvement. NM-23 measures the Percentage of Materials Programs with More Than One Unsatisfactory Performance Indicator, with a zero (0) percent threshold for success. Staff determined that NM-23 is unnecessarily narrow, as it does not consider other important program performance measures such as the number of programs on monitoring (MON), heightened oversight (HO), or probation (PROB). Programs on MON, HO, and PROB all warrant additional attention from the NRC and Agreement State partners. Because IMPEP review results measure past performance, there is no way to recover from a missed metric (NM-23). Once a program's IMPEP review is completed and the program is found to have more than one unsatisfactory indicator, the metric trips red (failure). Additionally, NM-23 only measures negative performance in the year in which an IMPEP review was conducted, with the metric reverting to green at the beginning of the next year regardless of the degree of recovery within the challenged program.
The working group considered how to update the CBJ metric to measure more than just the number of unsatisfactory performance indicator results. The working group looked at replacing NM-23 with a new CBJ performance metric that measures the Percentage of National Materials Programs on Enhanced Oversight or Probation. The MRB Chair makes the decision to place an Agreement State program on Enhanced Oversight (either MON or HO), whereas the Commission makes the decision to place an Agreement State on Probation. Enhanced Oversight and Probation include a graduated approach of increased communication, with a focus on improving programmatic weaknesses, and enhanced regulatory oversight. Programs on HO or PROB are also required to develop a Program Improvement Plan (PIP) and participate in periodic conference calls between the NRC and the Agreement State program.
The working group considered metric targets of less than 10, 15, and 20 percent of Agreement State programs on Enhanced Oversight. Based on historical data, the working group determined that a target of 10 percent would not be realistic. The working group also determined that 20 percent was not restrictive enough. The working group concluded that a less than 15 percent threshold would provide a clear indicator of NMP health and would help alert management to negative trends. The new performance indicator metric of less than 15 percent would also allow the NRC to work with the NMP proactively to take necessary actions prior to tripping the metric.

Therefore, the working group recommends that a new performance indicator for NMP performance be established. Management should consider establishing a metric of less than 15 percent of Agreement State programs on Enhanced Oversight.
A summary of the number of RCPs on enhanced oversight from 2012 to 2022 is provided in Table 4, below.

Table 4 - Number of RCPs on Enhanced Oversight

| | 2012 | 2013 | 2014 | 2015 | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| RCPs on MON | 5 | 6 | 7 | 6 | 6 | 6 | 6 | 4 | 2 | | |
| RCPs on HO | 3 | 1 | 1 | 1 | | | 1 | | | 1 | 2 |
| RCPs on PROB | | 1 | | | | | | | | | 1 |
| RCPs on MON, HO, and PROB | 8 | 8 | 8 | 7 | 6 | 6 | 7 | 4 | 2 | 1 | 3 |
| Number of RCPs | 37 | 37 | 37 | 37 | 37 | 37 | 38 | 39 | 39 | 39 | 39 |
| % RCPs on MON, HO, and PROB | 22% | 22% | 22% | 19% | 16% | 16% | 18% | 10% | 5% | 3% | 8% |
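To make the proposed metric concrete, the following Python sketch (a hypothetical helper, not an existing NRC tool) shows how the less than 15 percent threshold discussed above could be evaluated against the Table 4 counts; the function name and interface are illustrative assumptions.

```python
# Hypothetical sketch, not an existing NRC tool: evaluates the proposed CBJ
# metric of "less than 15 percent of Agreement State programs on enhanced
# oversight or probation" using Table 4-style counts.

THRESHOLD_PCT = 15.0  # proposed target discussed in Section 3.3.1

def enhanced_oversight_metric(rcps_on_mon_ho_prob: int, total_rcps: int):
    """Return the percentage of RCPs on MON/HO/PROB and whether the metric is met."""
    pct = 100 * rcps_on_mon_ho_prob / total_rcps
    return pct, pct < THRESHOLD_PCT

# CY2022 values from Table 4: 3 RCPs on MON, HO, or PROB out of 39 RCPs
pct, met = enhanced_oversight_metric(3, 39)
print(f"{pct:.1f}% of RCPs on enhanced oversight or probation; metric met: {met}")
# Prints roughly: 7.7% of RCPs on enhanced oversight or probation; metric met: True
```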
3.3.2 Annual Report to the Commission

The working group reviewed the Annual Report to the Commission on the health of the NMP and found that the most recent reports include the following performance information and statistics:
A 5-year summary of performance indicator results (e.g., number of PIs reviewed; number and percentage of SATs, SBNIs, and UNSATs);
A 1-year summary of performance indicator results for the most recent year (e.g., number of PIs reviewed; number and percentage of SATs, SBNIs, and UNSATs);
A comparison of the 1-year to 5-year results and a discussion of the impacts on the CBJ metric (e.g., two programs contributed to missing the metric);
A summary of the most recent IMPEP findings and associated adequacy and compatibility results for each of the RCPs that constitute the NMP, and a discussion/observation of the most challenging PIs; and
A summary and comparison of overall adequacy and compatibility.
The working group determined that the Annual Report could be improved by including additional tracking, trending, and statistical insights regarding the overall health of the NMP. A recommendation to address this is outlined in Section 4.0 of this report; it proposes that management consider establishing a joint NRC/OAS working group to create and define these metrics.
3.4 Improving Awareness of Performance Issues

3.4.1 Promote Strong NMP Relationship

The working group and stakeholders continue to describe IMPEP as a strong program within the NMP. Under this program, individuals from the NRC and Agreement States work together to evaluate the effectiveness of each RCP's performance while ensuring that public health, safety, and security surrounding the use of radioactive materials are maintained.
The working group identified that RCPs that experienced declines in performance often had turnover in personnel. This turnover often affected the program's ability to reach out to other RCPs for information or formal advice and made it less likely that the program would approach the NRC for help early in the process. Stakeholders noted that increasing the opportunities for NMP program personnel to build relationships and discuss common performance challenges would likely result in more engaged staff and may mitigate potential future declines in performance.
Stakeholders also discussed the possibility of leveraging NMP relationships for resource sharing on a case-by-case basis.
3.4.2 Self-Audit Tool

RCPs can improve if managers are aware of areas that could benefit from more attention. Early identification of challenges facilitates timely corrective actions. The working group recognizes the benefits of self-reflection and self-identification and of taking a proactive approach to solving performance issues. The working group heard from stakeholders that performing periodic self-audits could aid in early identification of potential challenges in an RCP. Additionally, during the public meetings conducted by the working group, Agreement State representatives strongly encouraged the use of a self-audit tool by RCPs. Self-audit results could be shared during the periodic meeting and the next IMPEP review. The working group believes that this self-audit tool could be voluntarily completed on an annual basis or after a significant change that impacts an indicator such as Technical Staffing and Training.
The self-audit would be beneficial to the RCPs, resulting in early identification of issues and RCPs taking immediate corrective actions. This could increase the efficiency and effectiveness of the overall IMPEP and periodic meeting process.
In developing the self-audit tool, a future working group may consider that Web-Based Licensing (WBL) allows users to identify the number of licensing actions completed, inspections completed, and other items of interest that could be included in the self-audit. The working group acknowledges that while WBL does not address all potential leading indicators for a program in decline, such as staffing levels, WBL does capture several other valuable indicators on licensing and inspection that can be of value to a self-audit tool. In 2023, 9 RCPs use WBL, and 3 more RCPs are expected to onboard. In 2024, the NRC projects to have 12 RCPs using WBL and another 5 RCPs onboarding. In 2025, the NRC projects to have 17 RCPs using WBL and possibly one more RCP onboarding. Therefore, by 2026, the NRC expects to have 18 RCPs using WBL.
3.4.3 Periodic Meetings

The working group and stakeholders emphasized the importance of the periodic meeting between the NRC and the RCP. Periodic meetings are informal and allow for early identification of challenges within an RCP. This long-standing open forum continues to provide opportunities to candidly discuss program performance and items of concern. Typically, there is no evaluation during a periodic meeting as it is solely an informal discussion. Success in the periodic meeting relies on open communication and trust between the NRC and the RCP.
Review of data for IMPEP performance indicators suggests that most programs showed a decline in one or more performance indicators at the IMPEP review before being placed on some form of heightened oversight. During periodic meetings, areas of concern from the previous IMPEP review are discussed as part of the agenda. The working group heard from numerous stakeholders that the periodic meeting should be strengthened to drive discussions into greater levels of detail and to assist in clearly identifying program challenges. A recommendation to establish a working group to make improvements to the periodic meetings is outlined in Section 4.0.
3.4.4 RSAO Role

The working group and stakeholders acknowledge the importance of the RSAO role in identifying program performance issues and assisting with corrective actions. The relationship between the RSAO and Agreement State is vital and cannot be overlooked when discussing the IMPEP process and program challenges. This relationship is a vital building block in the foundation for trust and open communication.
After one of the most challenging IMPEP reviews and subsequent MRB meetings in 2022, there were questions regarding the need for early identification of declining performance. The IMPEP is a past-performance analysis process. Enhancing open dialogue regarding self-assessments and metrics, and providing tools that the RSAO can point the Agreement State to for support, would go a long way toward early identification of a potential decline or improvement in program performance.
The number of Agreement States has increased and continues to increase, while the number of RSAOs has remained constant. The working group noted that in 1990, there were 28 Agreement States and five RSAOs. Today there are 39 Agreement States (soon to be 41) and still only five RSAOs. With the increase in the number of States assigned to each RSAO comes an increase in responsibilities. The working group and stakeholders expressed concerns about the ability of the current five RSAOs to continue to provide a high level of exceptional service to each Agreement State as the number of States each RSAO supports increases. Increasing the number of RSAOs would allow for more time and attention to the Agreement States (especially those that need extra attention) and would showcase the NRC's commitment to establishing and maintaining excellent working relationships with the States. The working group recommends management consider assessing the current and future role of the RSAO in the NMP, as outlined in Section 4.0 of this report.
3.4.5 Timely Assistance when Issues are Self-Identified

The working group also considered how to effectively and efficiently provide timely targeted assistance to RCPs with performance challenges. The working group focused on the current SA procedure for programs requesting assistance.
The working group and stakeholders noted that once programmatic challenges have been identified, the RCP should have a clear, concise method for finding assistance. The current process to request assistance is described in the section titled Programmatic Technical Assistance in SA-1001, and it does not leverage the cooperation within the NMP. For instance, in addition to the NRC, RCPs could request assistance from the OAS Board or other Agreement States.
In the last five years, five Agreement States self-identified programmatic challenges and requested assistance through informal channels outside of the SA-1001 process. Relying primarily on specific individuals' knowledge, the Agreement States, RSAOs, the Office of Nuclear Material Safety and Safeguards (NMSS), and the OAS Board were able to compile many of the necessary elements to assist these programs.
The stakeholders suggested and the working group agreed that revising the Programmatic Technical Assistance section of SA-1001, Implementation of Management Directive 5.7, Technical Assistance to Agreement States would be beneficial. The revision of this section should include self-identified programmatic issues, in addition to ones discovered during IMPEP reviews. The revision should describe how programs can request timely assistance through the OAS Board, RSAOs, other Agreement States, and NMSS. It should also identify how the request is acknowledged, assigned, tracked, and closed.
3.4.6 Proactive Assessment of NMP RCPs' Performance

The working group evaluated whether the current IMPEP process provides for a proactive assessment of the NMP RCPs' performance. Because IMPEP is a past-performance analysis process, it is difficult to extrapolate the information gathered during an IMPEP review to make assessments of future performance. The working group and stakeholders found it difficult to identify leading indicators to predict programs that may have challenges in the future. As noted in the Results and Recommendations section of this report, the data did reflect one item that bears noting.
Programs put on heightened oversight, in nearly all cases, showed a less than satisfactory determination in one or more indicators on the previous IMPEP review.
As the working group looked deeper into the current IMPEP processes, stakeholders were eager to assist in identifying potential improvements. Recommendations relating to the consistency of IMPEP implementation and the streamlining of processes are outlined for consideration in Section 4.0.
3.4.7 Consistency of RCP Reviews

Through discussions regarding the consistency of RCP reviews, the working group examined moving three of the four non-common performance indicators (the Sealed Source and Device Evaluation Program (SS&D), the Low-Level Radioactive Waste Disposal Program (LLRW), and the Uranium Recovery Program (UR)) into the common performance indicators. This would allow all Agreement State programs to be evaluated under the same six indicators.
SS&D, which has three sub-elements, could be addressed in the following performance indicators: Technical Staffing and Training (TST), Technical Quality of Licensing Actions (TQLA), and Technical Quality of Incident and Allegation Activities (TQIAA). LLRW and UR, which have five sub-elements (identical to the common performance indicators), would be addressed in all five common performance indicators. For example, the TST indicator would discuss the entire radiation control program staff. It would not be disjointed (i.e., spread across four different sections of the report) but would instead address the materials, SS&D, LLRW, and UR staff in one section of the report.
Evaluating all RCPs under the same common performance indicators would allow for more risk-informed reviews of the entire RCP. As of FY2023, the procedures call for equal weighting of the nine performance indicators, but the majority of LLRW and UR facilities are closed facilities and do not rise to a level of risk significance. During an IMPEP review, these non-common performance indicators are weighted the same as, for example, industrial radiography, which carries a much higher risk significance. Incorporating the SS&D, LLRW, and UR performance indicators into the common performance indicators would allow the team to make risk-informed decisions regarding the program's overall ability to protect public health and safety.
This approach would allow each Agreement State RCP to be evaluated using the same six indicators. The one NRC RCP, however, would be evaluated with five performance indicators because the non-common performance indicator Legislation, Regulations, and Other Program Elements (LROPE) only applies to Agreement States. All RCPs would be evaluated evenly using the same criteria regardless of what is incorporated in their agreements. The working group recommends management consider establishing a joint working group to modify Management Directive 5.6 and the associated State Agreement procedures so that all RCPs can be evaluated under the same common performance indicators, as outlined in Section 4.0.
3.4.8 IMPEP Training

With the expansion of Agreement States, there will be more IMPEP reviews. Given the need for more interactions, sustained performance, and the level of interest in IMPEP reviews, it would be beneficial to increase the workforce available for IMPEP reviews and training. Stakeholders remarked that participating on IMPEP teams expanded their knowledge of other RCPs and provided valuable information sharing between NMP programs.

The working group reviewed IMPEP team member and team leader training to determine whether these programs are successfully training current and future IMPEP reviewers and team leaders, and whether there is room for improvement.
Ideally, the NRC would like to host team member training annually. In September 2022, the NRC trained 36 new and existing team members, including 20 NRC staff and 16 staff from 13 different RCPs. Because individuals on IMPEP teams perform reviews somewhat infrequently, effective training to ensure consistent implementation of procedures is essential. Team member training has been successful in preparing individuals to review indicators, make decisions based on guidance, and assume their role on the IMPEP team. Even though IMPEP Team Leaders mentor new team members, the working group and stakeholders noted that taking IMPEP training beyond its basics would likely improve consistency within and across teams.

The NRC also hosts Team Leader training annually. In March 2023, the NRC held Team Leader training for 12 current Team Leaders and 8 new Team Leaders-in-Training, including 2 former and 2 current Agreement State Team Leaders-in-Training. Team Leader training has been successful in preparing Team Leaders and Team Leaders-in-Training to lead teams effectively, document findings, promote consistency, and foster counterpart interaction.
4.0 RESULTS AND RECOMMENDATIONS

4.1 Response to Questions Posed in the Working Group Charter

The working group's responses to the questions posed in the charter are provided below.
Considering NMP performance over approximately the last 5 years, have recent IMPEP reviews identified a trend toward more unsatisfactory performance?
Based on the review of the data, the working group found that although there was an increase in less than satisfactory performance, this decline was the direct result of two IMPEP reviews with unique circumstances and is not indicative of a broader trend.
Are RCPs reviewed consistently through IMPEP?
The working group found that the non-common indicators SS&D, LLRW, and UR may be a relatively small part of an Agreement States program, when compared to its licensing and inspection oversight program. Therefore, an UNSAT in one or more of these indicators can result in a disproportionate negative impact on the overall performance of the Agreement State. Applying a risk-informed approach to these non-common indicators was recommended by the working group. The working group determined that incorporating the SS&D, LLRW, and UR indicators into the common performance indicators would allow all RCPs to be evaluated using consistent criteria.
Are procedures and processes, including roles and responsibilities, sufficient to ensure effective and consistent IMPEP reviews?
The working group noted that IMPEP procedures are well defined. Stakeholders remarked that these processes and procedures do create effective IMPEP reviews, and that maintaining training is essential to keep reviews consistent.
Does the Annual Report to the Commission provide sufficient insights on performance trends?
The working group determined that the Annual Report is not designed to look at trends and could be improved by including additional tracking, trending, and statistical insights.
See Section 4.0, Recommendations, Action 1-3.
Considering the IMPEP process, can it support the identification of leading indicators for performance trends?
Because IMPEP is a past-performance analysis process, it is difficult to extrapolate the information gathered during an IMPEP review to make assessments of future performance. Although the working group and stakeholders found it difficult to identify leading indicators to predict programs that may have challenges in the future, it was noted that programs put on heightened oversight, in nearly all cases, showed a less than satisfactory determination of one or more indicators on the previous IMPEP review. The working group recommends management consider other ways to improve awareness of potential program performance concerns, such as improving periodic meetings to be more substantive and continuing to strengthen NMP relationships.
What process changes can improve awareness of performance challenges within the NMP?
The working group and stakeholders support multiple efforts to improve awareness of programs experiencing performance challenges. Recommendations were developed to incorporate changes in several areas; see Section 4.0, Recommendations, Actions 2-3, 2-5, 2-9, 2-12, and 2-13.
What tools are available (or needed) to detect a downward performance trend?
Existing metrics only look at past performance. The working group did not identify any existing tools used within the NMP that would detect or predict a downward performance trend. Stakeholders remarked that developing an NMP self-audit tool and creating more detailed periodic meetings may be ways to detect downward performance. See Section 4.0, Recommendations, Actions 2-9 and 2-12.
Can we develop leading indicators to identify declining program performance early?
Current methods measure past performance and, as such, are not effective predictors. No specific leading indicators in the existing IMPEP process were identified by the working group. However, the working group, consistent with stakeholder suggestions, determined that tools such as self-audits could reveal indications of declining performance.
Can we develop performance metrics to measure NMP performance, as allowed in SA-100?
Yes, the working group and stakeholders support developing performance metrics that represent the health of the NMP. See Section 4.0, Recommendations, Action 1-3.
4.2 Recommendations

Based on the activities of the working group, recommendations were made in two categories:

Enhancing awareness of RCPs' performance, including metrics used to identify programs with declining performance or performance challenges; and
Improving the IMPEP assessment of an RCP's performance.
The report also contains options that the working group considered but did not recommend for implementation. These options are found in Appendix C.
4.2.1 Enhancing Awareness of Radiation Control Programs' Performance

Consistent with the Charter, the working group evaluated the effectiveness of the IMPEP process in monitoring NMP performance. As a result of this evaluation, the working group developed three recommendations to address metrics for tracking performance, tools for identifying performance issues, and assistance to programs facing performance issues.
Recommendation #1: Identify and implement meaningful performance metrics to track the health of the NMP.
As a result of this review and input from stakeholder interviews and public meetings, the working group recommends updating the existing metrics to better reflect the overall health of the NMP. Additionally, the working group recommends that new performance tracking tools be developed for the annual report to the Commission. The actions listed below represent potential options for implementing this recommendation:
Action 1-1, Discontinue use of CBJ metric NM-23, Percentage of Materials Programs with More Than One Unsatisfactory Performance Indicator. This CBJ metric is narrow and does not include other important program measures such as the number of programs on MON, HO, or PROB. Additionally, the metric only applies to the IMPEP year in which it was measured, while a program's performance issues may persist.
Action 1-2, Create a new CBJ metric, Percentage of National Materials Programs on Enhanced Oversight or Probation. The working group recommends the CBJ metric threshold of less than 15 percent to provide a clear indication of the health of the NMP.
Action 1-3, Identify performance tracking tools in addition to the Annual Report to the Commission. The working group noted that the Annual Report already contains some high-level NMP performance and tracking information. The working group recommends management sponsor a joint NRC/OAS effort to assess the need for additional performance metrics and develop tracking tools.
Recommendation #2: Develop tools and strategies for identifying potential performance issues and facilitating prompt corrective actions.
The working group considered what tools and strategies, both new and existing, could identify performance issues in a timely manner and assist RCPs in making effective corrective actions. The actions listed below represent potential options for implementation of this recommendation.
Action 2-1, Continue to support the NMP through monthly NMP calls, Champions Chats, and other communication pathways to promote early communication and relationship building.
Action 2-2, Continue to support the NMP through the OAS Annual Meeting. Encourage States to attend and participate in the meeting. Stakeholders described the benefits of participation in the timely topical discussions arranged by the joint NRC and OAS Planning Committee and networking at this meeting.
Action 2-3, Assist the Agreement States/OAS with the creation of a pamphlet for new Radiation Control Program Directors (RCPDs). The information provided will enhance new RCPDs' understanding of IMPEP and the NMP. This can be done by offering to post the pamphlet on the State Communications portal and to print the pamphlets for distribution to new RCPDs. OAS had a poster on this topic at the 2023 OAS Annual Meeting.
Action 2-4, Encourage use of the forum on the State Communications portal. The forum facilitates conversations and open dialogue across the NMP. The State Communications portal has been redesigned and continues to be an effective method for timely communication of important topics related to the NMP. Support should be continued by providing resources to facilitate oversight and prompt updating of the portal.
Action 2-5, Develop an IMPEP Awareness Training for new RCPDs. The working group and stakeholders recognize that when an RCPD understands the IMPEP program and its significance, they are more likely to address challenges and performance issues and to seek assistance before these challenges result in a decline in their overall performance.
Action 2-6, Facilitate a counterpart meeting for all NMP inspectors. The Conference of Radiation Control Program Directors (CRCPD) annual conference and the OAS annual meeting are typically attended by the RCPDs; staff participation is much lower. To ensure the success of the NMP, it would be beneficial to hold an in-person counterpart meeting for NMP inspectors, possibly every 3 years.
Not every inspector will need to attend, but there should be representation from each Agreement State. Then the inspector(s) attending can brief their fellow inspectors and management. This will foster relationships and encourage open transparent communication regarding the resolution of common performance challenges.
Action 2-7, Facilitate regular NMP meetings for Agreement State license reviewers and inspectors using the NRC monthly Part 35 and the Commercial, Industrial, Research and Development, and Academic meetings model. These are excellent discussion forums that currently do not include Agreement State representatives. This will foster relationships and encourage open transparent communication regarding the resolution of common performance challenges.
Action 2-8, Encourage Agreement States to draft licensing guidance for new emerging technologies that they first encounter. The NRC may not have licensees using these new technologies and would appreciate the Agreement States providing a draft document.
Action 2-9, Establish a joint working group to develop a self-audit tool. Every RCP can improve if managers are aware of areas that could benefit from more attention. Early identification of challenges facilitates timely corrective actions and adds efficiency to the periodic meeting and IMPEP report process. This tool can be voluntary and strongly encouraged for the next five years allowing time to pilot this process to ensure it is effective and helpful in identifying challenges early.
Action 2-10, Establish a periodic review of the self-audit tool. The efficacy of the self-audit tool can be re-examined after five years (to give those programs on the five-year cycle time for input) to determine its effectiveness in identifying performance issues and corrective actions across the NMP. This will also allow for any needed course corrections or improvements to the self-audit tool.
Action 2-11, Encourage the use of a self-audit tool to focus discussions and refine the periodic meeting agenda. Stakeholders described the need for more substantive conversations during the periodic meeting without creating a formal mini-IMPEP. The use of a self-audit tool based on the IMPEP indicators would aid in these conversations.
Management should also give IMPEP teams flexibility when assessing programs that have self-identified challenges and implemented corrective actions.
Action 2-12, Establish a joint working group to revise SA-116, Periodic Meetings between IMPEP Reviews. Improvements may include:
o Focus on improving the periodic meeting agenda. Develop data-driven questions that do not require file review. Stakeholders have indicated that challenges with staffing and training are often a root cause of programmatic issues, so the revision should include a detailed discussion related to staffing and training. In addition, it should include a detailed review of indicators with a less than satisfactory rating.
o Tailor the management and/or staff level involved in the periodic meeting based on the needs of the RCP. Look for opportunities to engage Agreement States or the OAS Board, if needed.
o Shed time constraints of the periodic meeting and focus on results. Allow time to properly evaluate data or perform evaluations of less than satisfactory indicators to have more meaningful conversations. Typically, these meetings are less than one day. It may be more beneficial to extend the time and have more open collaborative discussions.
o Install a timeliness criterion for documentation of the periodic meeting summaries. Timeliness of documentation is crucial for effective and timely corrective actions and for keeping NRC management aware of issues early.
Action 2-13, Consider conducting an assessment of the RSAO role. The assessment should include clear expectations regarding the RSAO role for the NMP, the RSAO workload, and the organizational structure for the RSAO. This assessment should consider the following:
o Clarifying management's expectations regarding the RSAO role for the NMP. Enhancing open dialogue regarding self-assessments and metrics, and providing tools that the RSAO can point the Agreement State to for support, would go a long way toward predicting performance.
o Creating a single team of SAOs (e.g., a center of excellence) with a single point of contact and distributing Agreement State responsibilities across the entire NMP.
The working group and stakeholders indicated that this would create consistency in interactions between the SAO and the States. With the SAOs together, there could be improved backfilling when an SAO is out of the office. There could be more consistency in periodic meeting summaries as well as timeliness. Being on a team could also facilitate improved group dynamics.
Action 2-14, Stakeholders acknowledge that the RSAOs participate on many IMPEP teams each year, both as Team Leaders and as Team Members. The working group supports reducing the number of IMPEP reviews in which an RSAO participates as a Team Leader by continuing to train additional Team Leaders. The working group also recommends that the RSAO continue as a Team Member for the States to which they are assigned. Stakeholders recognize the important role the RSAO has during an IMPEP review under these circumstances. Additionally, the working group supports increasing the number of RSAOs as a method to decrease the number of IMPEP reviews they support as Team Members and to balance workload.
Action 2-15, Establish a universal process for RSAOs to communicate with each other.
Enhance the current methods for the NRC Regions and NMSS to track State questions, status, etc. Create a tool for all stakeholders to use.
Recommendation #3: Develop NMP strategies to assist RCPs with performance challenges.
The working group also considered how to effectively and efficiently provide timely targeted assistance to RCPs with performance challenges. The action described below represents the potential option for implementation of this recommendation.
Action 3-1, The working group recommends that management consider revising the Programmatic Technical Assistance section of SA-1001, Implementation of Management Directive 5.7, Technical Assistance to Agreement States. The revision of this section should include self-identified programmatic issues, in addition to ones discovered during IMPEP reviews. The revision should describe how programs can request timely assistance through the OAS Board, RSAOs, other Agreement States, and NMSS. It should also identify how the request is acknowledged, assigned, tracked, and closed.
4.2.2 Improving the IMPEP Assessment of a Radiation Control Program's Performance

The working group and stakeholders agreed that the essentials of IMPEP are sound but that there is room for improvement related to the consistency of IMPEP implementation. As a result of this evaluation, the working group developed two recommendations.
Recommendation #4: Modify and enhance IMPEP to ensure that reviews continue to be done in a consistent and risk-informed manner.
The following actions represent potential options for implementation of this recommendation.
Action 4-1, Consider evaluating all RCPs under the same common performance indicators and establish a joint working group to implement this action. Evaluating all Agreement State RCPs using the same criteria (e.g., six performance indicators) could allow for a holistic review, as well as provide a much more aligned focus and oversight of the NMP. This could be showcased in the annual report to the Commission and should result in an easier-to-comprehend set of data that highlights trends. As part of the modification from nine to six indicators, the working group recommends that management consider the following:
o Establishing joint working groups to modify MD 5.6, the respective SA procedures, the IMPEP questionnaire template, the scheduling letter, and the IMPEP report template.
o Incorporating TI-003, Evaluating the Impacts of the COVID-19 Public Health Emergency as Part of the Integrated Materials Performance Evaluation Program, into MD 5.6 to account for other disruptions (e.g., natural disasters/climate change (such as floods, tornadoes, and hurricanes), government shutdowns, cybersecurity issues, electricity crises, financial crises, and civil unrest/disruptions) that are outside of the program's control, rather than being narrowly focused on pandemics. This will allow additional flexibility in evaluating the program's response(s) to these events.
o For openness and transparency, if an RCP has the potential for probation, temporary suspension, or suspension, MD 5.6 should add a meeting between NRC senior management and the program's senior management prior to the MRB meeting. This meeting will allow the NRC to discuss next steps with senior program management.
Action 4-2, Enhance Team Leader training to include more scenarios. Scenarios should be informed by previous IMPEP reviews where teams debated indicator findings or recommendations.
Action 4-3, Establish quarterly or semi-annual team leader forums. These forums would allow discussions related to current IMPEP findings. For example, once a final IMPEP report has been issued, the team leader should prepare an overview of the IMPEP, noting successes and challenges. These can be shared with other Team Leaders at these team leader forums.
Action 4-4, Continue to have Team Leaders mentor new team members during their first IMPEP review. Team Leaders also mentor Team Leaders-in-Training, creating more consistency in reviews.
Action 4-5, Consider adding IMPEP qualifications to IMC 1248. IMPEP qualifications could encourage staff to get qualified for IMPEP reviews.
Action 4-6, Ensure team members have the opportunity to work with a varied range of Agreement State programs. IMPEP Team Members will be more engaged and will develop a broader understanding of the NMP. Stakeholders have described obtaining valuable and transferable information from each IMPEP experience that they are able to apply to their own programs.
Action 4-7, Periodically include additional training on report writing and how to deliver a high-level briefing for exit meetings and MRB meetings. Ask the NRC's Technical Training Center to deliver the Writing: Back to Basics training to IMPEP Team Leaders and Team Members (including Agreement State personnel). This will extend the training by a day, but it will be beneficial. Provide high-level briefing training for IMPEP Team Members to prepare for the MRB meeting.
Action 4-8, Consider holding semi-annual or annual MRB forums to allow MRB members to discuss items related to current IMPEP findings.
Action 4-9, Consider providing annual refresher training for MRB members.
Action 4-10, Establish Dedicated Training Staff. Assign a dedicated person to oversee and facilitate the IMPEP related training courses (e.g., IMPEP Team Leader training, IMPEP Team Member training, IMPEP refresher training, MRB member training, MRB annual refresher training).
Recommendation #5: Modify IMPEP processes to increase efficiencies and create consistency in the implementation of IMPEP.
The working group and stakeholders are recommending management consider areas within the entire IMPEP process that may be streamlined. Overall, the working group considers the current IMPEP processes to be effective. However, based on feedback from the interviews and public meetings, there are opportunities to increase efficiencies and streamline processes. The working group recommends the following actions as potential options for implementation of this recommendation:
Action 5-1, Create an IMPEP Module for WBL. For programs using WBL, an IMPEP module could be created to determine the number of priority 1, 2, and 3 inspections and initial inspections that were conducted overdue during the review period. The module could also identify the qualified staff (inspectors and license reviewers); the total number and list of inspections (noting the inspector and the type of licensee); the total number and list of licensing actions (noting the license reviewer, type of licensee, and type of action); the list of renewal licensing actions that have been pending for one year or more; the time to issue inspection reports/results; and the time to complete licensing actions (separated by new licenses, amendments, renewals, and terminations). The team could choose casework for each inspector and license reviewer based on these reports prior to arriving on-site. Data necessary for the Status of the Materials Inspection Program indicator could be gathered through this module and nearly completed before arriving on-site.
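The sketch below illustrates the kind of pre-review summary such a module could generate. It is purely hypothetical: WBL's actual data model and interfaces are not described in this report, so the record layout, field names, and sample entries are assumptions for illustration, not the WBL API.

```python
# Purely illustrative: the record layout and field names below are assumptions,
# not the actual WBL data model. The sketch shows the kind of overdue-inspection
# summary an IMPEP module could pre-generate for the Status of the Materials
# Inspection Program indicator.
from datetime import date

inspections = [  # hypothetical export of inspection records for the review period
    {"licensee": "Industrial Radiographer A", "priority": 1, "inspector": "Reviewer 1",
     "due": date(2022, 3, 1), "completed": date(2022, 6, 15)},
    {"licensee": "Academic Broad Scope B", "priority": 3, "inspector": "Reviewer 2",
     "due": date(2022, 9, 1), "completed": date(2022, 8, 20)},
]

def overdue_inspections(records):
    """Return inspections completed after their due date, with days overdue."""
    return [(r, (r["completed"] - r["due"]).days)
            for r in records if r["completed"] > r["due"]]

for record, days_late in overdue_inspections(inspections):
    print(f'Priority {record["priority"]} inspection of {record["licensee"]} '
          f'completed {days_late} days overdue (inspector: {record["inspector"]})')
```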
Action 5-2, Identify and, if possible, resolve impediments to having Agreement State IMPEP Team Members become qualified to review LROPE. Currently, Agreement State IMPEP Team Members are not allowed to review the LROPE indicator. Consistent with SA-107, much of the evaluation of this performance indicator consists of reviewing whether the State has submitted the appropriate State statutes, legislation, and regulations to the NRC for review. Importantly, the compatibility review of submitted State statutes and regulations is conducted separately by qualified NRC staff, outside of the IMPEP process. There is a need to qualify individuals to review the LROPE indicator.
As the number of Agreement State IMPEP Team Members increases, the working group sees this activity as beneficial to the NMP. Since Agreement States are regulatory partners, it is important to allow them to review this indicator. Implementing this recommendation would require a revision to SA-111, Implementation of Management Directive 5.10, Formal Qualifications for Integrated Materials Performance Evaluation Program Team Members and Team Leaders.
Action 5-3, Engage with the Office of Enforcement to reconsider allowing Agreement State IMPEP Team Members to be assigned the TQIAA indicator for the one NRC IMPEP review. The review of NRC allegation information is currently limited to NRC staff only.
Action 5-4, Consider the effects of implementing streamlined processes to allow for flexibility in scheduling review dates. Streamlining processes could lead to less on-site time for IMPEP teams. Team Leaders should be allowed the flexibility to consider a shorter review schedule, which would change the traditional travel schedule. Currently, IMPEP Team Members travel on Sunday and return on Friday. NRC management should consider allowing the IMPEP team the option to travel on Monday instead of Sunday, with the review scheduled for Tuesday through Friday. This would allow the team to work during normal business hours and not travel on a weekend, reducing travel time and cost while improving work/life balance.
Action 5-5, Reduce redundancies and streamline the IMPEP report. The working group and stakeholders pointed out several repeated items in the IMPEP report. Management should consider revising the IMPEP report template to eliminate redundancies. By incorporating this change and reducing the number of performance indicators to six, the draft IMPEP report template could be reduced by half, making the process much more efficient.
Action 5-6, Eliminate the proposed final report if there are no changes or comments from the program. The MRB would be provided the draft IMPEP report. Following the MRB, a final report would be issued. This would increase efficiencies.
Action 5-7, Items for the MRB Chair's consideration. For improved performance and efficiency, the working group recommends the following items for the MRB Chair to consider:
o Revise the MRB script to guide the discussion to the challenging performance indicators.
o Consider grouping all clearly satisfactory indicators into one short discussion so that the remainder of the time can be focused on those indicators that are less than satisfactory (e.g., challenging performance areas).
o Shorten the Team Leader's/Team Member's introduction of the performance indicator results (e.g., to approximately one minute), or have the Team Leader provide a very short, high-level review or overview of the indicators. This will allow more time for questions and discussion.
o Remove the requirement for MRB meeting minutes. The final IMPEP report is a standalone document that captures the findings at the meeting. Items discussed at the MRB that influence the outcome of the rating would not be lost by eliminating the MRB meeting minutes; they would be incorporated into the final IMPEP report.
o Encourage MRB members to provide questions to the IMPEP team prior to the MRB meeting, with enough lead time for the IMPEP team members to prepare quality answers. Because MRB meetings are open to the public, IMPEP team members should be prepared to provide the information the MRB Chair needs to make an informed decision and for the meeting to be successful. This allows the team to provide answers in the public forum and eliminates the need for team members to follow up later on questions for which they did not have an answer readily available.
o Reconsider the order of the questions from the MRB. The team may find it beneficial to start with technical questions and end with legal questions.
o Streamline the MRB process by not asking each MRB member whether they agree with the team's recommendation, because the MRB Chair makes the final determination. As is the practice today, each MRB member can ask questions and discuss each indicator. The working group recommends that the MRB members discuss the rating for each indicator and the overall program. Each MRB member shares their input with the MRB Chair, but the MRB Chair has the authority and responsibility to render the final decision.
APPENDIX A: WORKING GROUP MEMBERS
Sherrie Flaherty, Co-Chair, NRC Office of Nuclear Material Safety and Safeguards (NMSS)
Santiago Rodriguez, Co-Chair, State of New Mexico
Huda Akhavannik, NRC NMSS
Robert Johnson, NRC NMSS
Duncan White, Advisor, NRC NMSS
Keisha Cornelius, State of Oklahoma
David Crowley, State of North Carolina
Beth Shelton, State of Tennessee
Brian Harris, NRC Office of General Counsel (OGC)
Jen Scro, NRC OGC
Farrah Gaskins, NRC Region I
Tammy Bloomer, NRC Region I
Darren Piccirillo, NRC Region III
Lizette Roldán-Otero, NRC Region IV
Linda Howell, Advisor, NRC Region IV
APPENDIX B: INDIVIDUALS INTERVIEWED
MRB Members:
- Cathy Haney, OEDO
- Brian Harris, OGC
- Rob Lewis, NMSS
- Ray Lorson, Region I
- John Monninger, Region IV
- Mohammed Shuaibi, Region III
- Tammy Bloomer, Region I
- Jack Giessner, Region III
- Mary Muessle, Region IV
- Geoff Miller, Region IV
- Kevin Williams, NMSS
RSAOs:
- Jackie Cook, Region IV
- Randy Erickson, Region IV
- Monica Ford, Region I
- Farrah Gaskins, Region I
- Darren Piccirillo, Region III
IMPEP Team Leaders:
- Lizette Roldán-Otero, Region IV
- Geoffrey Warren, Region III
- Duncan White, NMSS
IMPEP Team Members:
- NRC Participants
o Latischa Hanson, Region IV
o Sara Forster, Region III
o Lisa Forney, Region I
o Betsy Ullrich, Region I
- State Participants
o Brian Goretzki (AZ)
o Phill Peterson (CO)
o Anjan (AJ) Bhattacharyya (KY)
o Josh Daehler (MA)
o Tyler Kruse (MN)
o David Stradinger (ND)
o Stephen James (OH)
o Muhammadali Abbaszadeh (TX)
APPENDIX C: OTHER OPTIONS CONSIDERED BY THE WORKING GROUP
C.1 RCP Draft IMPEP Report
The working group discussed the option for an RCP to use the results of the self-audit to replace the IMPEP questionnaire and generate portions of the draft IMPEP report. The self-audit could be used to guide the program in completing portions of the IMPEP report, allowing the program to be involved early and directly in the description of its own program. The IMPEP team would make any necessary changes to the report during the IMPEP review based on interviews and findings. In addition, this approach has the potential to shorten the length of time the IMPEP team would need to be on-site for the review.
While there are some benefits to this approach, the working group does not recommend its implementation. Having the RCP write the initial draft IMPEP report implies that the final version will be somewhat similar. In instances where the IMPEP team identifies several inconsistencies with the draft report it received, the program may be surprised by the IMPEP team's report that would be provided to the MRB. The working group also believes that this approach would likely create additional inconsistencies in IMPEP reports. Therefore, the working group does not recommend moving forward with this option.
C.2 Periodic Meetings
The working group heard suggestions to formalize the periodic meeting by having programs complete a questionnaire, similar to the IMPEP questionnaire, prior to the meeting. This would involve a thorough review by the program of the details for each indicator.
While the working group believes this approach may provide more details on the status of the program, the working group does not recommend it because such a questionnaire would likely be more burdensome to the program. Instead, the working group proposes that programs use other forms of assessments (i.e., the self-audit tool) that are less resource intensive but will still assist in gauging the health of a program at the periodic meetings. Stakeholders also informed the working group that the informal periodic meeting was generally effective when all parties participated in open and candid discussions. Therefore, the working group does not recommend moving forward with this option.
C.3 RSAO Role
The working group discussed multiple suggestions regarding changes to the RSAO role in the IMPEP process. The list below represents suggestions that the working group considered but does not recommend.
- Remove RSAOs from IMPEP reviews of their assigned States. Some stakeholders described a potential for bias when the RSAO serves on the team for a State to which they are assigned. The working group believes that RSAOs can deliver independent and objective opinions for all IMPEP reviews.
- Remove RSAOs as IMPEP Team Leaders and use them only as Team Members for IMPEP reviews. The working group noted that the IMPEP process could not support removing RSAOs as Team Leaders, given the number of qualified Team Leaders available and the number of reviews scheduled each year.
- Have RSAOs on IMPEP teams in a liaison role. RSAOs would not be assigned to review an indicator. Due to RSAO workload demands, the RSAO could attend IMPEP reviews for their respective States but not participate as a team member. In this capacity, the RSAO could assist the State as well as the IMPEP team without having to write up an indicator, thereby reducing their workload. The working group did not achieve consensus on this option; it did not believe that the workload reduction was significantly beneficial and noted that the RSAO can continue to support the State while reviewing an indicator.
Therefore, the working group does not recommend moving forward with these options.
C.4 Programs Needing Assistance
Some stakeholders described creating a specialized team, made up of NRC and Agreement State personnel, that would be able to go in and assist programs. Funding for this type of activity would be needed, and having staff stop their own work to be dispatched for an unspecified period would also have to be taken into consideration. This activity may have the appearance of a temporary suspension of the agreement. The working group does not recommend this approach; additional details are noted in the next section.
C.5 Consistency of IMPEP Reviews
The working group heard from stakeholders that designating a single team of individuals to perform all IMPEP reviews would create consistency. Although this may improve consistency in the implementation of IMPEP procedures, the working group considers that some of the value of IMPEP to the NMP is lost if it does not allow for wide NMP participation. The NMP collaboration and opportunities for learning from each other have been a longstanding positive factor in IMPEP and would be lost if only a few people performed IMPEP reviews. For this reason, along with the staffing and personnel issues in creating a single IMPEP team, the working group does not recommend moving forward with this option.
C.6 MRB
For programs with all satisfactory indicators on consecutive IMPEP reviews, consider the need for an MRB. The final IMPEP report would be issued and made public without an MRB.
The working group did not reach consensus on this option, and most members did not consider it a good one. Most of the working group felt that discarding the MRB for successful programs would diminish the significance of the IMPEP evaluation, and the working group recognizes that all programs should have the opportunity to meet with the MRB to discuss the status of their program. Some of the working group believed that Agreement States with all satisfactory indicators on consecutive IMPEP reviews should have the option to forgo an MRB meeting because it would be more efficient and less burdensome on the RCP.
In addition, the working group sees an advantage in holding an MRB for all reviews because other programs often listen to the MRB meetings. The opportunity to hear details regarding successful programs would be lost if the MRB were eliminated under these circumstances. Therefore, the working group does not recommend moving forward with this option.