Office of Nuclear Material Safety and Safeguards
Procedure Approval

Reviewing the Common Performance Indicator, Status of Materials Inspection Program
State Agreements (SA) Procedure SA-101

Issue Date: August XX, 2020
Review Date: August XX, 2025

Michael C. Layton, Director, NMSS/MSST (Date: 08/XX/20)
Lizette Roldan-Otero, Ph.D., Acting Branch Chief, NMSS/MSST/SALB (Date: 08/XX/20)
Joe O'Hara, Procedure Contact, NMSS/MSST/SALB (Date: 07/21/20)
Terry Derstine, Chair, Organization of Agreement States (Date: 08/XX/20)

ML20183A179

NOTE: Any changes to the procedure will be the responsibility of the NMSS Procedure Contact.

Copies of NMSS procedures are available through the NRC Web site at https://scp.nrc.gov


I. INTRODUCTION

This document describes the procedure for conducting reviews of the Agreement States and U.S. Nuclear Regulatory Commission (NRC) radiation control programs (Programs) for the common performance indicator, Status of Materials Inspection Program, specified in NRC Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP).

II. OBJECTIVES

A. To verify that initial inspections and inspections of Priority 1, 2, and 3 licensees are performed at the frequency prescribed in NRC Inspection Manual Chapter (IMC) 2800, Materials Inspection Program.

B. To verify that licensees working under reciprocity are inspected in accordance with the criteria prescribed in IMC 2800 or compatible policy developed by Agreement State radiation control programs using a similar risk-informed performance-based approach.

C. To verify that deviations from inspection schedules are approved by Program Management and that the reasons for the deviations are documented.

D. To verify there is a plan to perform any overdue inspections and reschedule any missed or deferred inspections. To determine that a basis has been established for not performing any overdue inspections or rescheduling any missed or deferred inspections.

E. To verify that inspection findings are communicated to licensees within 30 calendar days, or 45 calendar days for a team inspection, after inspection completion, as specified in IMC 0610, Nuclear Material Safety and Safeguards Inspection Reports, and IMC 2800.

III. BACKGROUND

Periodic inspections of licensed activities are essential to ensure that activities are conducted in compliance with regulatory requirements and consistent with good safety and security practices. Inspection frequency, designated by a priority code, is based on the relative risk of the radiation hazard of the licensed activity. For example, a Priority 1 licensee presents the greatest risk to the health and safety of workers, members of the public, and the environment; therefore, Priority 1 licensees require the most frequent inspections. Information regarding the number of overdue inspections is a significant measure of the status of a radiation control inspection program.


IV. ROLES AND RESPONSIBILITIES

A. IMPEP Review Team Leader (Team Leader)

1. In coordination with the IMPEP Program Manager, the Team Leader determines which team member is assigned lead review responsibility and assigns other team members to provide support, as necessary.
2. Communicates the team's findings to Program Management and ensures that the team's findings are in alignment with MD 5.6.

B. Principal Reviewer

1. Reviews relevant documentation, conducts management and staff discussions, and maintains a summary of all statistical inspection information received.
2. Calculates the percentage of Priority 1, 2, and 3, and initial inspections completed overdue in accordance with Appendix A, Overdue Inspection Calculation Worksheet, of this procedure.
3. Verifies that reciprocity inspections are completed in accordance with the NRC's IMC 2800.
4. Reviews inspection communications sent to licensees to verify that findings are communicated to the licensee in accordance with the NRC's IMC 2800.
5. Informs the Team Leader of the team's findings throughout the onsite review.
6. Presents the team's findings to the Program at the staff exit meeting.

7. Completes their portion of the IMPEP report for the Status of Materials Inspection Program performance indicator reviewed.

8. Attends the IMPEP Management Review Board meeting for the IMPEP review and is prepared to discuss the team's findings for the Status of Materials Inspection Program performance indicator (this can be done either in-person or remotely).

V. GUIDANCE

A. Scope

1. This procedure evaluates the quantitative performance of routine Priority 1, 2, and 3, and initial inspections of the Agreement State or NRC program and inspections of reciprocity licensees in accordance with IMC 2800 since the last IMPEP review.

The IMPEP Team should follow the guidance provided in SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), regarding discussions related to this indicator with inspectors, supervisors, and managers. If performance issues are identified by the reviewer(s) that lead to programmatic weaknesses, the IMPEP Team should seek to identify the root cause(s) of the issues, which can be used as the basis for developing recommendations for corrective actions. Appendix D of SA-100 contains criteria regarding the development of recommendations by the IMPEP team.

2. If performance deficiencies are identified, review team members should consider whether the root causes of these deficiencies affect more than the Status of Materials Inspection Program indicator. Issues impacting one performance indicator could also have a negative impact on performance with respect to other indicators. As a general matter, a performance deficiency, and its associated root causes, should be assigned to only the most appropriate indicator and not counted against multiple indicators.

B. Review Guidelines

1. Evaluate the response generated by the Agreement State or NRC Program to relevant questions in the IMPEP questionnaire. Depending on the level of detail of the information provided, the response to the questionnaire relative to this indicator should be used to focus the review.
2. The Principal Reviewer should be familiar with IMC 2800, which prescribes inspection frequencies. The Principal Reviewer should also be cognizant of any additional inspection guidance, such as Temporary Instructions, that may describe deviations in inspection frequencies.
3. The Principal Reviewer should use inspection data provided in the questionnaire and information provided during the on-site review.


4. To evaluate the status of materials and security inspections, the Principal Reviewer should evaluate the following information:

a. The number of Priority 1, 2, and 3, and initial inspections completed overdue during the review period and overdue at the time of the review;
b. The amount of time past the applicable inspection due dates for any Priority 1, 2, and 3, and initial overdue inspections;
c. The reason Priority 1, 2, and 3, and initial inspections were completed overdue or are overdue at the time of the review;
d. The safety or security significance of not performing or deferring any overdue inspections;
e. The timeliness of issuance of inspection findings to licensees;
f. The inspection frequencies used by an Agreement State. The reviewer should verify the Program's inspection frequencies are at least as frequent as those listed in IMC 2800. The reviewer should document any Agreement State inspections that are conducted at frequencies longer than those specified in IMC 2800, the Program's rationale for the longer frequencies, and any impacts to health, safety, security, or the environment. An Agreement State program should not be penalized for failing to meet internally developed inspection schedules that are more aggressive (i.e., licensees or license types that are more frequently inspected) than those specified in IMC 2800;
g. Overdue inspections are not determined based on the inspection frequencies established by any Agreement State. The inspection frequencies in IMC 2800 are used as the baseline metric for determining if an inspection is overdue. A number of Agreement States have more aggressive inspection schedules than those prescribed in IMC 2800.

However, in cases where an Agreement State's inspection frequency is less stringent than IMC 2800, the reviewer should note the difference(s) and determine if there are performance issues as a result. Several Agreement States have set less stringent frequencies for certain categories of licensees. The State needs to have a documented rationale for the difference(s), and the Management Review Board will make the final determination as to whether public health and safety are jeopardized based on the difference(s); and
h. The performance of reciprocity inspections in accordance with the guidance in IMC 2800, or the details of and justification for the Agreement State's or NRC's alternative reciprocity inspection policy.


C. The Principal Reviewer should evaluate the following during the onsite review:

1. Examine information on the status of routine Priority 1, 2, 3 and initial inspections and reciprocity inspections completed by the Program during the review period.
a. If available, the reviewer should examine the inspection information contained in the Program's database. If the Program uses the Web-Based Licensing (WBL) system, information can be obtained by running a query against the new licensing actions (i.e., to determine initial inspection due dates) and inspection activities (a sketch of such a query appears after this list); or,
b. If the Program does not have a database or such data cannot be easily retrieved or provided, to cross-reference and verify information, the reviewer should examine a representative number of Priority 1, 2, and 3 and reciprocity inspection records, as well as other relevant documents involving inspection findings, using the following guidance:
i. All inspections performed since the last IMPEP review are subject to review.

ii. The reviewer should sample as many inspections as possible to determine the rating for this indicator and note in the report that only a sampling was performed. This means that the team members will need to pull files and review information from inspection reports. The reviewer will need to document in the report the values and assumptions used for the overdue calculation based on the sampling. If possible, the reviewer should include in the report the total number of Priority 1, 2, and 3 and initial inspections conducted by the Program during the review period, as well as the number that were overdue for inspection at the time of the review.

iii. A risk-informed sample of the Program's inspections based on safety and security significance should be selected. The selected inspection casework should focus on the Program's highest-risk licensed activities. The use of risk-informed sampling, rather than random sampling, maximizes the effectiveness of the review of casework. The sampling should also ensure inclusion of the full range of Priority 1, 2, and 3 modalities licensed by the Agreement States and NRC (e.g., industrial, medical, academic), as well as a representative sample of security inspections of Category 1 and 2 risk-significant radioactive material and service provider licensees.
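For licensee databases such as WBL, the query described in item 1.a above can be sketched as follows. This is illustrative only and not part of the procedure: WBL's actual schema and query interface are not documented here, so the table and column names (licensing_actions, inspection_activities, license_number, and so on) are hypothetical. The idea is simply to join new licensing actions, which establish initial inspection due dates, against completed inspection activities.

    # Hypothetical sketch only: WBL's real schema is not documented in this
    # procedure, so every table and column name below is invented.
    import sqlite3

    QUERY = """
    SELECT lic.license_number,
           lic.issue_date,        -- initial inspection due 12 months later
           insp.completed_date    -- NULL means not yet inspected
    FROM licensing_actions AS lic
    LEFT JOIN inspection_activities AS insp
           ON insp.license_number = lic.license_number
    WHERE lic.action_type = 'NEW'
    """

    def initial_inspection_status(db_path):
        """Return (license_number, issue_date, completed_date) rows; a NULL
        completed_date flags an initial inspection not yet performed."""
        with sqlite3.connect(db_path) as conn:
            return conn.execute(QUERY).fetchall()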


2. Determine the percentage of overdue Priority 1, 2, and 3, and initial inspections for the review period. Appendix A of this procedure contains guidance for the overdue inspection calculation with a sample worksheet for use by the reviewer.
a. Routine inspections of Priority 1 and 2 licensees are considered overdue if the inspections exceed the IMC 2800 frequencies plus the following applicable maximum window:
i. Priority 1 inspections completed greater than six months past the inspection due date; and
ii. Priority 2 inspections completed greater than 12 months (one year) past the inspection due date.
b. Routine inspections of Priority 3 and 5 licensees and telephonic contacts of Priority T licensees are considered overdue if the inspections or contacts exceed the IMC 2800 frequencies plus 1 year.
c. Initial inspections are normally considered overdue if the inspections are performed greater than 12 months after the date of issuance of the license; however, if the licensee does not yet possess licensed material or has not yet performed any principal activities, the initial inspection may be rescheduled to within 18 months of license issuance. When determining the number of initial inspections performed or overdue, all initial inspections must be included. This includes initial inspections of all priority codes, including Priority 5.
d. Reciprocity inspections are evaluated separately and should not be included in the calculation.
e. The information and definitions in IMC 2800 should be used when making a calculation and determining the status of inspections in Appendix A. If the Agreement State program defines overdue inspections using different definitions than the NRC, the reviewer should note the differences in terminology or definitions in the IMPEP report. A minimal sketch of this overdue test follows this item.
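The windows in items 2.a through 2.c reduce to a simple date comparison. The following is a minimal sketch, not part of the procedure; it assumes the third-party python-dateutil package for calendar-month arithmetic, and it measures each grace window from the IMC 2800 due date.

    # Sketch of the overdue test in items 2.a-2.c.  Grace windows are months
    # past the IMC 2800 due date (not the Program's internal schedule).  For
    # "initial", pass the license issuance date as due_date (item 2.c).
    from datetime import date
    from dateutil.relativedelta import relativedelta

    GRACE_MONTHS = {1: 6, 2: 12, 3: 12, "initial": 12}

    def is_overdue(priority, due_date, completed=None, as_of=None):
        """True if the completion date (or the review date, when the
        inspection is still open) falls past due date plus grace window."""
        deadline = due_date + relativedelta(months=GRACE_MONTHS[priority])
        return (completed or as_of or date.today()) > deadline

    # Appendix A worksheet sample: a Priority 1 inspection due 1/1/14 (window
    # ends 7/1/14) and performed 9/1/14 is counted as completed overdue.
    print(is_overdue(1, date(2014, 1, 1), completed=date(2014, 9, 1)))  # True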
3. Attempt to ascertain the reason(s) for any overdue inspections. This can be accomplished through discussions with individual inspectors as well as Program management.
4. Include an assessment of the issuance of inspection findings. Inspection findings in most cases should be provided to licensees within 30 days of the exit meeting with the licensee (45 days for a team inspection), or within a time period specified in the compatible Agreement State procedure.


5. Evaluate the performance of reciprocity inspections in comparison to the criteria in IMC 2800 or the Agreement State's compatible alternative reciprocity procedure.
6. Review the Agreement State Program's inspection frequencies. While this indicator primarily focuses on quantitative performance, the reviewer should also include a qualitative evaluation of an Agreement State Program's inspection frequencies. If the Agreement State Program's inspection frequencies deviate from the frequencies established in IMC 2800, the reviewer should evaluate what, if any, health, safety, and/or security impacts have occurred as a result of the deviation. Additionally, the reviewer should ensure documentation exists that justifies why the deviation in inspection frequency exists.
7. Flexibility may be used in determining the rating for this indicator.

The reviewer should consider the status of the Program and any mitigating factors that may have prevented the Program from conducting inspections during the review period. The reviewer's assessment should include the examination of plans to perform any overdue inspections or reschedule any missed or deferred inspections. The reviewer should determine that a basis has been established by the Program for not performing any overdue inspections or rescheduling any missed or deferred inspections.

a. For example, if a State has no overdue inspections at the time of the review and has addressed the root cause of the overdue inspections, then there may not be any performance issue and as such, a finding of satisfactory may be appropriate (also taking into consideration the other factors for this indicator). However, if the State has not addressed the root cause of the overdue inspections or has not developed a management plan or other effort to address the issue, then a rating of satisfactory, but needs improvement, or unsatisfactory may be appropriate (also taking into consideration the other factors for this indicator). Additionally, review teams may make specific recommendations to address these types of performance issues.

D. Review Information Summary

At a minimum, the summary maintained by the Principal Reviewer should include the following information (a minimal sketch container for these data points follows the list):

1. Number of Priority 1, 2, and 3 inspections that were completed on time during the review period;
2. Number of Priority 1, 2, and 3 inspections that were completed overdue during the review period, and the range of time past due the inspections were completed;


3. Number of Priority 1, 2, and 3 inspections that are overdue at the time of the review, and how far past due those inspections are at the time of the review;
4. Number of initial inspections that were completed on time during the review period;
5. Number of initial inspections that were completed overdue during the review period, and the range of time past due the inspections were completed;
6. Number of initial inspections that are overdue at the time of the review, and how far past due those inspections are at the time of the review;
7. Number of reciprocity licensees for each year of the review period and the number of reciprocity inspections that were completed during each year of the review period; and
8. Number of inspection findings from Priority 1, 2, and 3, and initial inspections that were issued to the licensees more than 30 days, or 45 days for a team inspection, after the inspection exit meeting was held, and the amount of time past the due date that the late inspection findings were sent or are overdue. The Principal Reviewer should also document the reason any inspection findings were dispatched overdue.
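One convenient way to hold these eight data points during the review is a single record per Program. The sketch below is illustrative only; the field names are invented rather than prescribed by this procedure, and it records counts only (the overdue ranges and reasons called for in items 2, 3, 5, 6, and 8 would still be documented in the report narrative).

    # Hypothetical container tracking items 1-8 of the Review Information
    # Summary; field names are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class ReviewSummary:
        priority_on_time: int = 0             # item 1
        priority_completed_overdue: int = 0   # item 2
        priority_still_overdue: int = 0       # item 3
        initial_on_time: int = 0              # item 4
        initial_completed_overdue: int = 0    # item 5
        initial_still_overdue: int = 0        # item 6
        reciprocity_by_year: dict = field(default_factory=dict)  # item 7
        late_findings: int = 0                # item 8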

E. Evaluation Process

1. The reviewer should refer to Part III, Evaluation Criteria, of MD 5.6 for specific evaluation criteria. As noted in MD 5.6, the criteria for a satisfactory Program are as follows:
a. Less than 10 percent of initial and high priority licensees (Priority 1, 2, and 3) are inspected at frequencies greater than those prescribed in IMC 2800 or compatible Agreement State procedure.
b. Inspection findings are communicated to the licensee according to the criteria prescribed in IMC 2800 or compatible Agreement State procedure.
c. Reciprocity inspections are performed in a manner that meets the requirements identified in IMC 2800 and applicable guidance, or compatible Agreement State procedures; or the Agreement State program has developed and successfully implemented an alternative policy for reciprocity inspections in lieu of IMC 2800 and applicable guidance, using a similar risk-informed, performance-based approach for determining reciprocity licensees.


2. Appendix C, Examples of Less than Satisfactory Findings of Program Performance, of this procedure contains examples to assist the reviewer in identifying less than fully satisfactory findings of a Program's performance.
3. The IMPEP Team should follow the guidance provided in SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), regarding discussions related to this indicator with inspectors, supervisors, and managers.
4. If performance issues are identified, the reviewer should consider whether the root causes of these issues affect more than the Status of Materials Inspection Program Indicator. Issues impacting this performance indicator could have a negative impact on other performance indicators. As a general matter, a performance issue, and associated root causes, should be assigned to only the most appropriate performance indicator and not counted against multiple indicators.

F. Discussion of Findings with the Radiation Control Program

1. The IMPEP team should follow the guidance given in SA-100 for discussions of technical findings with staff, supervisors, and management. If performance issues are identified by the reviewer(s) that lead to programmatic weaknesses, the team should seek to identify the root cause(s) of the issues, which can be used as the basis for developing recommendations for corrective actions.

Appendix D of SA-100 contains criteria regarding the development of recommendations by the IMPEP team.

VI. APPENDICES

A. Overdue Inspection Calculation Worksheet
B. Frequently Asked Questions
C. Examples of Less than Satisfactory Findings of Program Performance; this appendix can be found on the state and tribal communications portal Web site: https://scp.nrc.gov/regtoolbox.html

VII. REFERENCES

Management Directives (MD) available at https://scp.nrc.gov.

NMSS SA Procedures available at https://scp.nrc.gov.


NRC Inspection Manual Chapters available at https://www.nrc.gov/reading-rm/doc-collections/insp-manual/manual-chapter/.

NRC Inspection Procedures available at https://www.nrc.gov/reading-rm/doc-collections/insp-manual/inspection-procedure/.

NRC Generic Communications available at https://www.nrc.gov/reading-rm/doc-collections/gen-comm/.

NRC/Agreement State Working Groups available at https://scp.nrc.gov.


VIII. ADAMS REFERENCE DOCUMENTS

For knowledge management purposes, listed below are all previous revisions of this procedure, as well as associated correspondence with stakeholders, that have been entered into the NRC's Agencywide Documents Access and Management System (ADAMS).

No. | Date | Document Title/Description | Accession Number
1 | 10/24/02 | STP-02-074, Opportunity to Comment on Draft Revisions to STP Procedure SA-101 | ML022970629
2 | 1/24/03 | Summary of Comments on SA-101 | ML031130704
3 | 4/4/03 | STP Procedure SA-101 | ML031080519
4 | 4/19/07 | FSME-07-037, Opportunity to Comment on Draft Revision to FSME Procedure SA-101 | ML071090427
5 | 6/14/07 | Summary of Comments on SA-101 | ML072160015
6 | 7/23/07 | FSME Procedure SA-101 | ML072160012
7 | 3/28/16 | STC-16-028, Closeout of Temporary Instructions TI-001 and 002 | ML16084A626
8 | 4/27/16 | Closeout Memo of Independent Review Panel/Materials Program Working Group Recommendation for TI 001 and 002 | ML16041A299
9 | 12/18/19 | STC-19-079, Opportunity to Comment on Interim SA-101 | ML20183A152, ML20183A153
10 | 12/18/19 | Interim NMSS Procedure SA-101 | ML19339H097
11 | | Resolution of Comments - Final NMSS Procedure SA-101 | ML20184A180
12 | | Final NMSS Procedure SA-101 | ML20183A328


Appendix A
OVERDUE INSPECTION CALCULATION WORKSHEET

Guidance for calculating the number of overdue inspections:

1. Inspections considered in the calculation are Priority 1, 2, and 3 inspections and all initial inspections. An inspection will be considered overdue if it falls under one of the following cases:
a. A Priority 1 inspection completed greater than 6 months past the inspection due date (18 months since the start of the last inspection);
b. A Priority 2 inspection completed greater than 12 months past the inspection due date (36 months since the start of the last inspection);
c. A Priority 3 inspection completed greater than 12 months past the inspection due date (48 months since the start of the last inspection); and
d. An initial inspection completed greater than 12 months from the date of license issuance, or greater than 18 months if the licensee did not possess licensed material in the first 12 months.
2. Inspection frequencies are always compared to the NRC inspection priorities listed in IMC 2800 rather than the Program's internal inspection frequencies.
3. Multiple overdue inspections for the same licensee are counted as a single event. Depending on the inspection priority, there may be more than one inspection for a specific licensee conducted during the review period. However, if more than one inspection is significantly overdue and/or not yet completed, the Principal Reviewer should count them as one missed or overdue inspection but should note examples of the overdue ranges for the IMPEP report. The IMPEP policy is to credit the Program for the inspections they perform yet keep track of how late overdue inspections were eventually conducted. Thus, inspections that should have been performed are not double or triple counted in the calculation, but the reviewer should document how late the overdue inspection was performed or if it is still overdue at the time of the review.

For example, if only one inspection was conducted for a Priority 1 licensee during a four-year period, for the purpose of the overdue inspection calculation, this would be considered one (1) overdue inspection, and the reviewer should note the number of months exceeding the 18-month grace period. Even though the inspection could be overdue 30 months, it would still be counted as one (1) overdue inspection.

4. The percentage of overdue inspections during the review period should be calculated as follows:


% overdue = 100 x (number of Priority 1, 2, and 3 and initial inspections not completed on time per NRC IMC 2800) / (number of Priority 1, 2, and 3 and initial inspections that should have been completed)

To determine the numerator and denominator:

% overdue = 100 x (PCO + PU + ICO + IU) / (PCO + PU + ICO + IU + PC + IC)

Where:

PCO = number of Priority 1, 2, and 3 inspections completed overdue during the review period
PU = number of Priority 1, 2, and 3 inspections overdue at the time of the review
PC = number of Priority 1, 2, and 3 inspections completed on time during the review period
ICO = number of initial inspections completed overdue during the review period
IU = number of initial inspections overdue at the time of the review
IC = number of initial inspections completed on time during the review period

5. The following is a sample calculation:

The Program performed 80 Priority 1, 2, and 3 inspections on time during the review period, and ten (10) Priority 1, 2, and 3 inspections were performed overdue during the review period. Additionally, at the time of the review there were two (2) Priority 1, 2, or 3 inspections that were still overdue. The Program performed ten (10) initial inspections on time during the review period and performed five (5) initial inspections overdue during the review period. At the time of the review, there was one (1) initial inspection that was still overdue.

PCO = 10
ICO = 5
PU = 2
IU = 1
PC = 80
IC = 10

So:

% overdue = 100 x (PCO + PU + ICO + IU) / (PCO + PU + ICO + IU + PC + IC)
          = 100 x (10 + 2 + 5 + 1) / (10 + 2 + 5 + 1 + 80 + 10)
          = 100 x 18/108
          = 16.7%

(A minimal script reproducing this calculation appears after item 6 below.)

6. The overdue inspection calculation is just one piece of information that the reviewer uses to determine the appropriate rating for this indicator. Regardless of how close a calculation is to 25 percent (or 10 percent), the reviewer should take the Program's overall performance involving the other aspects of this indicator (e.g., the root cause of the overdue inspections and the Program Management's actions to address the issues) into account when determining an appropriate rating for this indicator.
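For illustration only, the following minimal Python sketch, which is not part of the procedure, implements the formula from item 4 and reproduces the item 5 example; the argument names mirror the worksheet definitions above.

    # Appendix A overdue-percentage formula; argument names mirror the
    # worksheet definitions (PCO, PU, ICO, IU, PC, IC).
    def percent_overdue(pco, pu, ico, iu, pc, ic):
        """Percentage of Priority 1, 2, and 3 and initial inspections that
        were overdue out of all inspections that should have been completed."""
        overdue = pco + pu + ico + iu
        should_have_been_completed = overdue + pc + ic
        return 100 * overdue / should_have_been_completed

    # Item 5 example: 18 overdue out of 108 that should have been completed.
    print(round(percent_overdue(pco=10, pu=2, ico=5, iu=1, pc=80, ic=10), 1))  # 16.7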


INSPECTION STATUS REVIEWER WORKSHEET

State/NRC: ______________________
Time period covered by IMPEP Review: _____________________________

One entry per inspection.

Entry # | Licensee Name | License Number | Priority (1, 2, 3, or initial) | Last inspection date (or license issued date, if initial inspection) | Date due | Window (50% for Priority 1 and 2; 1 year for Priority 3; none for initial)
10 | Sample Company A | 12-2345 | 1 | 1/1/13 | 1/1/14 | 7/1/14
20 | Sample Company B | 23-4567 | Initial | 5/1/13 | 5/1/14 | N/A

INSPECTION STATUS REVIEWER WORKSHEET (cont.)

Entry # | Date Performed | Amount of time overdue | Date inspection completed | Date inspection findings issued | Report issued within 30 days (45 days for team inspection)? If not, days over | Notes
10 | 9/1/14 | 2 months | 9/1/14 | 9/15/14 | Yes |
20 | 7/1/14 | 2 months | 7/1/14 | 8/20/14 | No - 18 days |

Appendix B
FREQUENTLY ASKED QUESTIONS

Q1: Is there any leniency in counting overdue inspections of Priority 1 and 2 licensees beyond the NRC IMC 2800 frequency plus 50 percent?

A1: No. Priority 1 and 2 inspections completed beyond the 50 percent window should be considered overdue and documented as such in the calculation. Review teams may take other mitigating factors into consideration and describe them in the narrative portion of the report as appropriate.

Q2: If a Program inspects a Priority 1 licensee only once in a 3-year period, why do we only count that as one overdue inspection?

A2: IMPEP policy is to credit the Program for the inspections they perform yet keep track of how late overdue inspections were eventually conducted. Thus, inspections that should have been performed are not double or triple counted in the calculation, but the reviewer should document how late the overdue inspection was performed or if it is still overdue at the time of the review.

Q3: How important is the overdue inspection calculation to the rating for this indicator? For example, what if the number of overdue inspections turns out to be just under or over 25 percent?

A3: The overdue inspection calculation is just one piece of information that the review team uses to determine the appropriate rating for this indicator. Regardless of how close a calculation is to 25 percent (or 10 percent), the review team should take the Program's overall performance involving the other aspects of this indicator (e.g., the root cause of the overdue inspections and the Program Management's actions to address the issues) into account when determining an appropriate rating for this indicator.

Q4: What if the data necessary to perform the overdue calculation is not easy to get or determine?

A4: In this case, the review team should sample as many inspections as possible to help determine the rating for this indicator and note in the report that only a sampling was performed. This means that the team members will need to pull files and review information from inspection reports. The review team will need to document in the report the values and assumptions used for the overdue calculation based on the sampling. If possible, the review team should include in the report the total number of Priority 1, 2, and 3 and initial inspections conducted by the Program during the review period, as well as the number that were overdue for inspection at the time of the review.


Q5: What if a State deviates from the inspection frequencies prescribed in IMC 2800?

A5: Overdue inspections are not determined based on the inspection frequencies established by any Agreement State. The inspection frequencies in IMC 2800 are used as the baseline metric for determining if an inspection is overdue. A number of Agreement States have more aggressive inspection schedules than those prescribed in IMC 2800. However, in cases where an Agreement State's inspection frequency is less stringent than IMC 2800, the review team should note the difference(s) and determine if there are performance issues as a result. Several Agreement States have set less stringent frequencies for certain categories of licensees. The State needs to have a documented rationale for the difference(s), and the Management Review Board will make the final determination as to whether public health and safety are jeopardized based on the difference(s).

Q6: What if a State conducted many Priority 1, 2, and 3, and initial inspections overdue during the review period as a result of staff turnover, but has caught up on all the overdue inspections at the time of the review?

A6: If a State has no overdue inspections at the time of the review and has addressed the root cause of the overdue inspections, then there may not be any performance issue and as such, a finding of satisfactory may be appropriate (also taking into consideration the other factors for this indicator). However, if the State has not addressed the root cause of the overdue inspections or has not developed a management plan or other effort to address the issue, then a rating of satisfactory, but needs improvement, or unsatisfactory may be appropriate (also taking into consideration the other factors for this indicator). Additionally, review teams may make specific recommendations to address these types of performance issues.

Q7: For the initial inspections, are only Priority 1, 2, and 3 licensees counted in the calculation?

A7: No. When determining the number of initial inspections performed or overdue, all initial inspections must be included. This includes initial inspections of all priority codes, including Priority 5.


Appendix C
EXAMPLES OF LESS THAN SATISFACTORY FINDINGS OF PROGRAM PERFORMANCE

The effectiveness of a Program is assessed through the evaluation of the criteria listed in Section III, Evaluation Criteria, of MD 5.6. These criteria are NOT intended to be exhaustive but provide a starting point for the IMPEP review team to evaluate this indicator. The review team should also take into consideration other relevant mitigating factors that may have an impact on the Program's performance under this performance indicator. The review team should consider a less than satisfactory finding when the identified performance issue(s) is/are programmatic in nature, and not isolated to one aspect, case, individual, etc., as applicable.

This list is not all-inclusive and will be maintained and updated in the IMPEP Toolbox on the state communications portal Web site at https://scp.nrc.gov.

The following are examples of review findings that resulted (or could result) in a Program being found satisfactory, but needs improvement, for this indicator:

The Program conducted a total of 291 inspections of high priority licensees and 65 initial inspections during the review period. Of the 291 high priority inspections, the review team determined that 37 inspections were completed overdue by more than 25 percent of the inspection frequency prescribed in IMC 2800, and that one high priority inspection was overdue at the time of the review. Of the 65 initial inspections, the review team determined that 22 inspections were completed more than 12 months after license issuance and that no initial inspections were overdue at the time of the review. Overall, the review team calculated that the Program performed 16.8 percent of its inspections overdue during the review period. The team determined more than 10 percent, but less than 25 percent, of Priority 1, 2, and 3 and initial inspections were inspected at intervals exceeding the frequencies prescribed in IMC 2800.
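Applying the Appendix A formula to these figures, as one illustrative reconstruction (it assumes the single still-overdue high priority inspection counts in both the numerator and the denominator):

PCO = 37, PU = 1, ICO = 22, IU = 0, PC = 291 - 37 = 254, IC = 65 - 22 = 43

% overdue = 100 x (37 + 1 + 22 + 0) / (37 + 1 + 22 + 0 + 254 + 43) = 100 x 60/357 = 16.8%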

The review team evaluated the Program's timeliness of issuance of inspection findings. The Program has a goal of issuing inspection correspondence within 30 days of the final date of the inspection. The review team determined that 30 of the 40 inspection reports reviewed were issued within the 30-day goal. All but one of the inspections reviewed were clear inspections.

The following are examples of review findings that resulted (or could result) in a Program being found unsatisfactory for this indicator:

The Program conducted 70 high priority inspections during the review period. Thirty of these inspections were conducted overdue by more than 25 percent of the inspection frequency prescribed in IMC 2800. The Program performed 21 initial inspections during the review period, 13 of which were conducted overdue.

The team identified that the Program issued 18 of the 30 inspection reports more than 30 days after the inspection exit. All inspections except one were clear inspections. The team determined that the 17 clear inspection findings were issued late due to a backlog of work on the Program Supervisor's desk.


The Program granted 126 reciprocity permits but did not conduct any reciprocity inspections in three of the calendar years during the review period. In the year leading up to the review, the Program performed 3 reciprocity inspections.
