ML19353A763

Interim State Agreements (SA) Procedure SA-101, Reviewing the Common Performance Indicator, Status of Materials Inspection Program
Person / Time
Issue date: 12/18/2019
From: O'Hara J, NRC/NMSS/DMSST/ASPB
To: O'Hara J
References: SA-101


Text

Office of Nuclear Material Safety and Safeguards
Procedure Approval

Reviewing the Common Performance Indicator, Status of Materials Inspection Program
Interim State Agreements (SA) Procedure SA-101

Issue Date: December 18, 2019
Review Date: December 18, 2022

Michael C. Layton, Director, NMSS/MSST (original signed by Kevin Williams for Michael Layton), Date: 12/09/19
Paul Michalak, Branch Chief, NMSS/MSST/SALPB (original signed by Paul Michalak), Date: 12/06/19
Joe O'Hara, Procedure Contact, MSST/SALPB (original signed by Joe O'Hara), Date: 12/6/19
Terry Derstine, Chair, Organization of Agreement States (original signed by Terry Derstine), Date: 12/10/19

ML19353A763

NOTE: Any changes to the procedure will be the responsibility of the NMSS Procedure Contact. Copies of NMSS procedures are available through the NRC Web site at https://scp.nrc.gov

Interim Procedure SA-101: Reviewing the Common Performance Indicator, Status of Materials Inspection Program
Issue Date: 12/18/19

I. INTRODUCTION

This document describes the procedure for conducting reviews of the Agreement State and U.S. Nuclear Regulatory Commission (NRC) radiation control programs (Programs) for the common performance indicator, Status of Materials Inspection Program, specified in NRC Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP).

II. OBJECTIVES

A. To verify that initial inspections and inspections of Priority 1, 2, and 3 licensees are performed at the frequency prescribed in NRC Inspection Manual Chapter (IMC) 2800, Materials Inspection Program.

B. To verify that licensees working under reciprocity are inspected in accordance with the criteria prescribed in IMC 2800 or compatible policy developed by Agreement State programs using a similar risk-informed performance-based approach.

C. To verify that deviations from inspection schedules are approved by Program Management and that the reasons for the deviations are documented.

D. To verify there is a plan to perform any overdue inspections and reschedule any missed or deferred inspections. To determine whether a basis has been established for not performing any overdue inspections or rescheduling any missed or deferred inspections.

E. To verify that inspection findings are communicated to licensees within 30 calendar days, or 45 calendar days for a team inspection, after inspection completion as specified in IMC 0610, Nuclear Material Safety and Safeguards Inspection Reports, and IMC 2800.

III. BACKGROUND

Periodic inspections of licensed activities are essential to ensure that activities are conducted in compliance with regulatory requirements and consistent with good safety and security practices. Inspection frequency, designated by a priority code, is based on the relative risk of the radiation hazard of the licensed activity. For example, a Priority 1 licensee presents the greatest risk to the health and safety of workers, members of the public, and the environment; therefore, Priority 1 licensees require the most frequent inspections. Information regarding the number of overdue inspections is a significant measure of the status of a radioactive materials inspection program.

IV. ROLES AND RESPONSIBILITIES

A. IMPEP Review Team Leader (Team Leader)

1. In coordination with the IMPEP Program Manager, the Team Leader determines which team member is assigned lead review responsibility and assigns other team members to provide support, as necessary.
2. Communicates the team's findings to Program Management and ensures that the team's findings are in alignment with MD 5.6.

B. Principal Reviewer

1. Reviews relevant documentation, conducts management and staff discussions, and maintains a summary of all statistical inspection information received.
2. Calculates the percentage of Priority 1, 2, 3, and initial inspections completed overdue in accordance with Appendix A of this procedure.
3. Verifies that reciprocity inspections are completed in accordance with the NRC's IMC 2800.
4. Reviews inspection communications sent to licensees to verify that findings are communicated to the licensee in accordance with the NRC's IMC 2800.
5. Informs the Team Leader of their findings throughout the review.
6. Completes their portion of the IMPEP report for the performance indicator reviewed.
7. Attends the IMPEP Management Review Board meeting and is prepared to discuss their findings (this can be done either in-person or remotely).

V. GUIDANCE

A. Scope

1. The IMPEP Team should follow the guidance provided in SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), regarding discussions related to this indicator with inspectors, supervisors, and managers. If performance issues are identified by the reviewer(s) that lead to programmatic weaknesses, the IMPEP Team should seek to identify the root cause(s) of the issues which can be used as the basis for developing recommendations for corrective actions. Appendix D of

SA-100 contains criteria regarding the development of recommendations by the IMPEP team.

2. In terms of general guidance for the IMPEP review team, a finding of "satisfactory" should be considered when none or only a small number of the cases or areas reviewed involve performance issues/deficiencies (e.g., inspection, licensing, staffing); an "unsatisfactory" finding should be considered when a majority or a large number of cases or areas reviewed involve performance issues/deficiencies, especially if they are chronic, programmatic, and/or of high-risk significance; and a finding of "satisfactory, but needs improvement" should be considered when more than a small number of the cases or areas reviewed involve performance issues/deficiencies in high-risk-significant regulatory areas, but not to such an extent that the finding would be considered unsatisfactory.

3. This procedure evaluates the quantitative performance of routine Priority 1, 2, and 3 and initial inspections of the NRC or Agreement State program and inspections of reciprocity licensees in accordance with IMC 2800 since the last IMPEP review.
4. If performance deficiencies are identified, review team members should consider whether the root causes of these deficiencies affect more than the Status of Material Inspection Program Indicator. Issues impacting one performance indicator could also have a negative impact on performance with respect to other indicators. As a general matter, a performance deficiency, and associated root causes, should be assigned to only the most appropriate indicator and not counted against multiple indicators.

B. Evaluation Process

1. The Principal Reviewer should refer to Part III, Evaluation Criteria, of MD 5.6 for specific evaluation criteria. As noted in MD 5.6, the criteria for a satisfactory Program are as follows:
a. Less than 10 percent of initial and high priority licensees (Priority 1, 2, and 3) are inspected at frequencies greater than those prescribed in IMC 2800 or compatible Agreement State procedure.
b. Inspection findings are communicated to the licensee according to the criteria prescribed in IMC 2800 or compatible Agreement State procedure.
c. Reciprocity inspections are performed in a manner that meets the requirements identified in IMC 2800 and applicable guidance, or compatible Agreement State procedures; or the Agreement State

program has developed and successfully implemented an alternative policy for reciprocity inspections in lieu of IMC 2800 and applicable guidance, using a similar risk-informed, performance-based approach for determining reciprocity licensees.

2. The Principal Reviewer should examine any information on the status of routine Priority 1, 2, and 3 and initial inspections and reciprocity inspections completed by the NRC or Agreement State program during the review period.
a. If available, the Principal Reviewer should examine the inspection information contained in the Program's database. Information can be obtained through the Web Based Licensing (WBL) system by running a query against the licensing actions and inspection activities; or,
b. If the Program does not have a database or such data cannot be easily retrieved or provided, to cross-reference and verify information, the reviewer should examine a representative number of Priority 1, 2, and 3 and reciprocity inspection records, as well as other relevant documents involving inspection findings, using the following guidance:
i. All inspections performed since the last IMPEP review are subject to review.

ii. The Principal Reviewer should perform a risk-informed sample of the Program's inspections based on safety and security significance. The selected inspection casework should focus on the Program's highest-risk licensees. The use of risk-informed sampling, rather than random sampling, maximizes the effectiveness of the review of casework. The sampling should also ensure inclusion of the full range of Priority 1, 2, and 3 modalities licensed by the NRC and Agreement States (e.g., industrial, medical, academic), as well as a representative sample of security inspections of high-risk IAEA Category 1 and 2 sources and service provider licensees.

3. As part of the evaluation criteria for this indicator, the Principal Reviewer will determine the percentage of overdue Priority 1, 2, and 3, and initial inspections for the review period. Appendix A contains guidance for the overdue inspection calculation with a sample worksheet for use by the Principal Reviewer.
a. Routine inspections of Priority 1 and 2 licensees are considered overdue if the inspections exceed the IMC 2800 frequencies plus the following applicable maximum window:

i. Priority 1 inspections completed greater than six months past the inspection due date; and
ii. Priority 2 inspections completed greater than 12 months (one year) past the inspection due date.
b. Routine inspections of Priority 3 and 5 licensees and telephonic contact of Priority T licensees are considered overdue if the inspections or contact exceed the IMC 2800 frequencies plus 1 year.
c. Initial inspections are normally considered overdue if the inspections are performed greater than 12 months after the date of issuance of the license; however, if the licensee does not yet possess licensed materials or has not yet performed any principal activities, the initial inspection may be rescheduled to within 18 months of license issuance.
d. Reciprocity inspections are evaluated separately and should not be included in the calculation.
e. The Principal Reviewer should use the information and definitions in IMC 2800 when determining the status of inspections. If the NRC or Agreement State defines overdue inspections using different definitions, a reasonable attempt should be made to make the calculation using the information and definitions from IMC 2800. This may be achieved by reviewing inspection casework files and applying the information to the worksheet in Appendix A. If the reviewer is unable to calculate the status of inspections using the information and definitions in IMC 2800, the reviewer may use the NRC or Agreement State's data or information but must note the differences in terminology or definitions in the IMPEP report.
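The overdue tests in items a through d above can be sketched as a small classifier. This is an illustrative sketch only; the function name, argument names, and whole-month arithmetic are assumptions for illustration, not part of IMC 2800 or this procedure:

```python
from datetime import date

def is_overdue(priority, due_date, completed_date, license_issued=None,
               possessed_material_in_first_year=True):
    """Illustrative overdue test using the windows described above.

    priority is 1, 2, 3, or "initial". Dates are datetime.date objects;
    for an inspection still open, pass the review date as completed_date.
    Comparison is by whole calendar months, a simplifying assumption.
    """
    def months_between(d1, d2):
        return (d2.year - d1.year) * 12 + (d2.month - d1.month)

    if priority == "initial":
        # 12 months from license issuance; 18 if no licensed material yet
        window = 12 if possessed_material_in_first_year else 18
        return months_between(license_issued, completed_date) > window
    # Priority 1: due date + 6 months; Priority 2 and 3: due date + 12 months
    window = 6 if priority == 1 else 12
    return months_between(due_date, completed_date) > window

# A Priority 1 inspection due 1/2014 and completed 9/2014 (8 months late)
print(is_overdue(1, date(2014, 1, 1), date(2014, 9, 1)))  # True
```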
4. The Principal Reviewer should attempt to ascertain the reason(s) for any overdue inspections. This can be accomplished through discussions with individual inspectors as well as Program management.
5. The review should include an assessment of the issuance of inspection findings. Inspection findings in most cases should be provided to licensees within 30 days of the exit meeting with the licensee or 45 days of the exit meeting with the licensee for a team inspection, or a time period specified in the compatible Agreement State procedure.
6. The performance of reciprocity inspections of licensees should be evaluated in comparison to the criteria in IMC 2800 or alternative compatible Agreement State procedure.

7. While this indicator primarily focuses on quantitative performance, review of this indicator should also include a qualitative evaluation of an Agreement State Program's inspection frequencies. If the Agreement State Program deviates from the frequencies established in IMC 2800, the Principal Reviewer should evaluate what, if any, health, safety, and/or security impacts have occurred as a result of the deviation. Additionally, the Principal Reviewer should ensure documentation exists that justifies the deviation in inspection frequency.
8. In applying the criteria, flexibility may be used to make the determination of the rating for this indicator. The review team should consider the status of the Program and any mitigating factors that may have prohibited the Program from conducting inspections during the review period. The review team's assessment should include the examination of plans to perform any overdue inspections or reschedule any missed or deferred inspections. The Principal Reviewer should determine that a basis has been established by the Program for not performing any overdue inspections or rescheduling any missed or deferred inspections.

C. Review Guidelines

1. The response generated by the NRC or Agreement State to relevant questions in the IMPEP questionnaire should be used to focus the review.
2. The Principal Reviewer should be familiar with IMC 2800, which prescribes inspection frequencies. The Principal Reviewer should also be cognizant of any additional inspection guidance, such as Temporary Instructions, that may describe deviations in inspection frequencies.
3. The Principal Reviewer should use inspection data provided in the questionnaire and information provided during the on-site review. An Agreement State program should not be penalized for failing to meet internally developed inspection schedules that are more aggressive (i.e., licensees or license types that are inspected more frequently) than those specified in IMC 2800.

4. To evaluate the status of materials and security inspections, the Principal Reviewer should evaluate the following:
a. The number of Priority 1, 2, and 3, and initial inspections completed overdue during the review period and overdue at the time of the review;
b. The amount of time past the applicable inspection due dates for any Priority 1, 2, and 3, and initial overdue inspections;

c. The reason Priority 1, 2, and 3, and initial inspections were completed overdue or are overdue at the time of the review;
d. The safety or security significance of not performing or deferring any overdue inspections;
e. The timeliness of issuance of inspection findings to licensees;
f. The inspection frequencies used by an Agreement State. The Principal Reviewer should verify the Program's inspection frequencies are at least as frequent as those listed in IMC 2800. The Principal Reviewer should document any Agreement State inspection frequencies that are longer (i.e., less frequent) than those specified in IMC 2800, the Program's rationale for the lower frequency, and any impacts to health, safety, security, or the environment; and
g. The performance of reciprocity inspections in accordance with the guidance in IMC 2800, or the details of and justification for the NRC or Agreement State's alternative reciprocity inspection policy.

D. Review Information Summary

At a minimum, the summary maintained by the Principal Reviewer should include the following information:

1. Number of Priority 1, 2, and 3 inspections that were completed on time during the review period;
2. Number of Priority 1, 2, and 3 inspections that were completed overdue during the review period, and the range of time past due the inspections were completed;
3. Number of Priority 1, 2, and 3 inspections that are overdue at the time of the review, and the range of time past due the inspections are at the time of the review;
4. Number of initial inspections that were completed on time during the review period;
5. Number of initial inspections that were completed overdue during the review period, and the range of time past due the inspections were completed;
6. Number of initial inspections that are overdue at the time of the review, and the range of time past due the inspections are at the time of the review;

7. Number of reciprocity licensees for each year of the review period and the number of reciprocity inspections that were completed during each year of the review period; and
8. Number of inspection findings from Priority 1, 2, and 3 and initial inspections that were issued to the licensees more than 30 days, or 45 days for a team inspection, after the inspection exit meeting was held, and the amount of time past the 30/45-day date that the late inspection findings were sent or are overdue. The Principal Reviewer should also document the reason any inspection findings were issued late.
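The 30/45-day timeliness test referenced in item 8 can be sketched as a short helper. The function name, argument names, and sample dates are illustrative assumptions, not part of the procedure:

```python
from datetime import date

def report_days_late(exit_date, issued_date, team_inspection=False):
    """Days past the 30-day (45 for a team inspection) issuance window.
    Zero or negative means the findings were issued on time.
    Illustrative sketch only; not an official calculation."""
    allowed = 45 if team_inspection else 30
    return (issued_date - exit_date).days - allowed

# Hypothetical dates: findings issued 45 days after the exit meeting
print(report_days_late(date(2014, 1, 1), date(2014, 2, 15)))  # 15
```

A positive result would be recorded in the summary along with the reason the findings were issued late.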

E. Discussion of Findings with Radiation Control Program

The IMPEP team should follow the guidance given in SA-100 for discussion of technical findings with staff, supervisors, and management. If performance issues are identified that lead to programmatic weaknesses, the team should seek to identify the root cause(s) of the issues, which can be used as the basis for developing recommendations for corrective actions.

VI. APPENDICES

A. Overdue Inspection Calculation Worksheet
B. Frequently Asked Questions
C. Examples of Less than Satisfactory Findings of Program Performance

VII. REFERENCES

Management Directives (MD) available at https://scp.nrc.gov.

NMSS SA Procedures available at https://scp.nrc.gov.

NRC Inspection Manual Chapters available at https://www.nrc.gov/reading-rm/doc-collections/insp-manual/manual-chapter/.

NRC/Agreement State Working Groups available at https://scp.nrc.gov.

VIII. ADAMS REFERENCE DOCUMENTS

For knowledge management purposes, listed below are all previous revisions of this procedure, as well as associated correspondence with stakeholders, that have been entered into the NRC's Agencywide Documents Access and Management System (ADAMS).

No. | Date     | Document Title/Description                                                     | Accession Number
1   | 10/24/02 | STP-02-074, Opportunity to Comment on Draft Revisions to STP Procedure SA-101  | ML022970629
2   | 1/24/03  | Summary of Comments on SA-101                                                  | ML031130704
3   | 4/4/03   | STP Procedure SA-101                                                           | ML031080519
4   | 4/19/07  | FSME-07-037, Opportunity to Comment on Draft Revision to FSME Procedure SA-101 | ML071090427
5   | 6/14/07  | Summary of Comments on SA-101                                                  | ML072160015
6   | 7/23/07  | FSME Procedure SA-101                                                          | ML072160012
7   |          | NMSS Procedure SA-101                                                          | ML19339H097

Appendix A
OVERDUE INSPECTION CALCULATION WORKSHEET

Guidance for calculating the number of overdue inspections:

1. Inspections considered in the calculation are Priority 1, 2, and 3 inspections and all initial inspections. An inspection will be considered overdue if it falls under one of the following cases:
a. A Priority 1 inspection completed greater than 6 months past the inspection due date (18 months since the start of the last inspection);
b. A Priority 2 inspection completed greater than 12 months past the inspection due date (36 months since the start of the last inspection);
c. A Priority 3 inspection completed greater than 12 months past the inspection due date (48 months since the start of the last inspection); and
d. An initial inspection completed greater than 12 months from the date of license issuance, or greater than 18 months if the licensee did not possess licensed material in the first 12 months.
2. Inspections are always compared to NRC Priorities in IMC 2800.
3. Multiple overdue inspections for the same licensee are counted as a single event. Depending on the Priority, there may be more than one inspection for a specific licensee conducted during the review period. However, if more than one inspection is significantly overdue and/or not yet completed, the Principal Reviewer should count them as one missed or overdue inspection but should note examples of the overdue ranges for the IMPEP report.

For example, if only one inspection was conducted for a Priority 1 licensee during a four-year period, for the purpose of the overdue inspection calculation, this would be considered one (1) overdue inspection and the reviewer should note the number of months exceeding the 18-month period. Even though the inspection could be overdue 30 months, it would still be counted as one (1) overdue inspection.
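The one-count-per-licensee convention in item 3 can be sketched as follows. The data shape (a licensee identifier paired with months overdue) is a hypothetical illustration, not a format prescribed by this procedure:

```python
def count_overdue_licensees(overdue_inspections):
    """Count overdue inspections with multiple overdue inspections for
    the same licensee collapsed to a single event, keeping the worst-case
    months overdue per licensee for the IMPEP report narrative."""
    worst = {}
    for licensee, months_overdue in overdue_inspections:
        worst[licensee] = max(worst.get(licensee, 0), months_overdue)
    return len(worst), worst

# Licensee "A" has two overdue inspections; they count as one event
count, ranges = count_overdue_licensees([("A", 30), ("A", 6), ("B", 2)])
print(count)   # 2
print(ranges)  # {'A': 30, 'B': 2}
```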

4. The percentage of overdue inspections during the review period should be calculated as follows:

% overdue = 100 x (Number of Priority 1, 2, and 3 and initial inspections not completed on time per NRC IMC 2800) / (Number of Priority 1, 2, and 3 and initial inspections that should have been completed)

To determine the numerator and denominator:

% overdue = 100 x (PCO + PU + ICO + IU) / (PCO + PU + ICO + IU + PC + IC)

Where:

PCO = number of Priority 1, 2, and 3 inspections completed overdue during the review period
PU = number of Priority 1, 2, and 3 inspections overdue at the time of the review
PC = number of Priority 1, 2, and 3 inspections completed on time during the review period
ICO = number of initial inspections completed overdue during the review period
IU = number of initial inspections overdue at the time of the review
IC = number of initial inspections completed on time during the review period
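Using the six quantities defined above, the Appendix A formula can be written as a small helper; the function name is an illustrative assumption:

```python
def percent_overdue(PCO, PU, ICO, IU, PC, IC):
    """Percentage of Priority 1, 2, and 3 and initial inspections overdue:
    overdue and still-open inspections over all inspections that should
    have been completed, per the Appendix A formula."""
    overdue = PCO + PU + ICO + IU
    total = overdue + PC + IC
    return 100 * overdue / total

# Values from the sample calculation in item 5 below: 18/108
print(round(percent_overdue(PCO=10, PU=2, ICO=5, IU=1, PC=80, IC=10), 1))  # 16.7
```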

5. The following is a sample calculation:

The Program performed 80 Priority 1, 2, and 3 inspections on time during the review period, and ten (10) Priority 1, 2, and 3 inspections were performed overdue during the review period. Additionally, at the time of the review there were two (2) Priority 1, 2, or 3 inspections that were still overdue. The Program performed ten (10) initial inspections on time during the review period and performed five (5) initial inspections overdue during the review period. At the time of the review, there was one (1) initial inspection that was still overdue.

PCO = 10    ICO = 5
PU = 2      IU = 1
PC = 80     IC = 10

So:

% overdue = 100 x (PCO + PU + ICO + IU) / (PCO + PU + ICO + IU + PC + IC)
          = 100 x (10 + 2 + 5 + 1) / (10 + 2 + 5 + 1 + 80 + 10)
          = 100 x 18/108 = 16.7%

INSPECTION STATUS REVIEWER WORKSHEET

State/NRC ______________________

Time period covered by IMPEP Review _____________________________

One entry per inspection

Entry | Licensee Name  | License Number | Priority (1, 2, 3, or initial) | Last inspection date (license issued date, if initial) | Date due | Window date (50% window for Priority 1 and 2; 1-year window for Priority 3; no window for initial)
0     | Sample Company | 12-2345        | 1                              | 1/1/13                                                 | 1/1/14   | 7/1/14
0     | Sample Company | 23-4567        | Initial                        | 5/1/13                                                 | 5/1/14   | N/A

INSPECTION STATUS REVIEWER WORKSHEET (cont.)

Entry | Date performed | Amount of time overdue | Date inspection completed | Date inspection findings issued | Report issued within 30 days (45 days for team inspection); if not, days over | Notes
0     | 9/1/14         | 2 months               | 9/1/14                    | 9/15/14                         | Yes                                                                           |
0     | 7/1/14         | 2 months               | 7/1/14                    | 8/20/14                         | No - 18 days                                                                  |

Appendix B
FREQUENTLY ASKED QUESTIONS

Q1: Is there any leniency in counting overdue inspections of Priority 1 and 2 licensees beyond the NRC IMC 2800 frequency plus 50 percent?

A1: No. For Priority 1 and 2 inspections completed beyond the 50 percent window, the inspection should be considered overdue and documented as such in the calculation. Review teams may take other mitigating factors into consideration and describe them in the narrative portion of the report as appropriate.

Q2: If a Program inspects a Priority 1 licensee only once in a 3-year period, why do we only count that as one overdue inspection?

A2: IMPEP policy is to credit the Program for the inspections they perform yet keep track of how late overdue inspections were eventually conducted. Thus, inspections that should have been performed are not double or triple counted in the calculation, but the reviewer should document how late the overdue inspection was performed or if it is still overdue at the time of the review.

Q3: How important is the overdue inspection calculation to the rating for this indicator? For example, what if the number of overdue inspections turns out to be just under or over 25 percent?

A3: The overdue inspection calculation is just one piece of information that the review team uses to determine the appropriate rating for this indicator. Regardless of how close a calculation is to 25 percent (or 10 percent), the review team should take the Program's overall performance involving the other aspects of this indicator (e.g., the root cause of the overdue inspections and Program Management's actions to address the issues) into account when determining an appropriate rating.

Q4: What if the data necessary to perform the overdue calculation is not easy to get or determine?

A4: In this case, the review team should sample as many inspections as possible to help determine the rating for this indicator and note in the report that only a sampling was performed. This means that the team members will need to pull files and review information from inspection reports. The review team will need to document in the report the values and assumptions used for the overdue calculation based on the sampling. If possible, the review team should include in the report the total number of Priority 1, 2, and 3 and initial inspections conducted by the Program during the review period, as well as the number that were overdue for inspection at the time of the review.


Q5: What if a State deviates from the inspection frequencies prescribed in IMC 2800?

A5: Overdue inspections are not determined based on the inspection frequencies established by any Agreement State. The inspection frequencies in IMC 2800 are used as the baseline metric for determining if an inspection is overdue. A number of Agreement States have more aggressive inspection schedules than those prescribed in IMC 2800. However, in cases where an Agreement State's inspection frequency is less stringent than IMC 2800, the review team should note the difference(s) and determine if there are performance issues as a result. Several Agreement States have set less stringent frequencies for certain categories of licensees. The State needs to have a documented rationale for the difference(s), and the Management Review Board will make the final determination if public health and safety are jeopardized based on the difference(s).

Q6: What if a State conducted many Priority 1, 2, and 3 and initial inspections overdue during the review period as a result of staff turnover, but has caught up on all the overdue inspections at the time of the review?

A6: If a State has no overdue inspections at the time of the review and has addressed the root cause of the overdue inspections, then there may not be any performance issue and as such, a finding of satisfactory may be appropriate (also taking into consideration the other factors for this indicator). However, if the State has not addressed the root cause of the overdue inspections or has not developed a management plan or other effort to address the issue, then a rating of satisfactory, but needs improvement, or unsatisfactory may be appropriate (also taking into consideration the other factors for this indicator). Additionally, review teams may make specific recommendations to address these types of performance issues.

Q7: For the initial inspections, are only Priority 1, 2, and 3 licensees counted in the calculation?

A7: No. When determining the number of initial inspections performed or overdue, all initial inspections must be included. This includes initial inspections of all priority codes, including Priority 5.


Appendix C
EXAMPLES OF LESS THAN SATISFACTORY FINDINGS OF PROGRAM PERFORMANCE

The effectiveness of a Program is assessed through the evaluation of the criteria listed in Section III, Evaluation Criteria, of MD 5.6. These criteria are NOT intended to be exhaustive but provide a starting point for the IMPEP review team to evaluate this indicator. The review team should also take into consideration other relevant mitigating factors that may have an impact on the Program's performance under this performance indicator. The review team should consider a less than satisfactory finding when the identified performance issue(s) is/are programmatic in nature, and not isolated to one aspect, case, or individual, as applicable.

This list is not all inclusive and will be maintained and updated in the IMPEP Toolbox on the state communications portal website at https://scp.nrc.gov.

A. The following are examples of review findings that resulted (or could result) in a Program being found satisfactory, but needs improvement for this indicator:

1. The Program conducted a total of 291 inspections of high priority licensees and 65 initial inspections during the review period. Of the 291 high priority inspections, the review team determined that 37 inspections were completed overdue by more than 25 percent of the inspection frequency prescribed in IMC 2800, and that one high priority inspection was overdue at the time of the review. Of the 65 initial inspections, the review team determined that 22 inspections were completed more than 12 months after license issuance and that no initial inspections were overdue at the time of the review.

Overall, the review team calculated that the Program performed 16.8 percent of its inspections overdue during the review period. The team determined more than 10 percent, but less than 25 percent, of Priority 1, 2, and 3 and initial inspections were inspected at intervals exceeding the frequencies prescribed in IMC 2800.

2. The review team evaluated the Program's timeliness of issuance of inspection findings. The Program has a goal of issuing inspection correspondence within 30 days of the final date of the inspection. The review team determined that 30 of the 40 inspection reports reviewed were issued within the 30-day goal. All inspections reviewed except one were clear inspections.

B. The following are examples of review findings that resulted (or could result) in a Program being found unsatisfactory for this indicator:

1. The Program conducted 70 high priority inspections during the review period. Thirty of these inspections were conducted overdue by more than 25 percent of the inspection frequency prescribed in IMC 2800. The Program performed 21 initial inspections during the review period, 13 of which were conducted overdue.
2. The team identified that the Program issued 18 of the 30 inspection reports more than 30 days after the inspection exit. All inspections except one were clear inspections. The team determined that the 17 clear inspection findings were issued late due to a backlog of work on the Program Supervisor's desk.
3. The Program granted 126 reciprocity permits but did not conduct any reciprocity inspections in three of the calendar years during the review period. In the year leading up to the review, the Program performed 3 reciprocity inspections.
