Interim Procedure (IP) SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program
ML19324F028
Person / Time
Issue date: 01/20/2020
From: Michael Layton
NRC/NMSS/DMSST/ASPB
To: Poy S
References
SA-108
Download: ML19324F028 (12)


Text

Office of Nuclear Material Safety and Safeguards Procedure Approval

Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program, Interim State Agreements (SA) Procedure SA-108

Issue Date: December 20, 2019
Review Date: December 20, 2022

Michael C. Layton, Director, NMSS/MSST (original signed by Kevin Williams), Date: 12/20/19
Paul Michalak, Branch Chief, NMSS/MSST/SALPB (original signed), Date: 12/17/19
Stephen Poy, Procedure Contact, NMSS/MSST/SALPB (original signed), Date: 12/17/19
Terry Derstine, Chair, Organization of Agreement States (original signed), Date: 12/19/19

ML19324F028

NOTE: Any changes to the procedure will be the responsibility of the NMSS Procedure Contact.

Copies of NMSS procedures will be available through the NRC Web site at https://scp.nrc.gov.

I. INTRODUCTION

This document describes the procedure for conducting reviews of Agreement State and U.S. Nuclear Regulatory Commission (NRC) radiation control programs as specified in NRC Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP).

II. OBJECTIVES

To verify the adequate implementation of the three sub-elements under this indicator: (a) Technical Staffing and Training, (b) Technical Quality of the Product Evaluation Program, and (c) Evaluation of Defects and Incidents Regarding Sealed Sources and Devices (SS&Ds).

III. BACKGROUND

Adequate technical evaluations of SS&D designs are essential to ensure that SS&Ds will maintain their integrity and that the design is adequate to protect public health and safety.

  • NUREG-1556, Volume 3, Consolidated Guidance About Materials Licenses: Applications for Sealed Source and Device Evaluation and Registration, provides information on conducting SS&D reviews and establishes useful guidance for review teams. The three sub-elements noted above will be evaluated to determine if the SS&D program is satisfactory. Agreement States with authority for SS&D evaluation programs that are not performing SS&D reviews are required to commit in writing to having an SS&D evaluation program in place before performing evaluations.

IV. ROLES AND RESPONSIBILITIES

A. Team Leader

In coordination with the IMPEP Program Manager, determines which team member is assigned lead review responsibility for this performance indicator.

B. SS&D Reviewer

1. Selects documents for review for each of the three sub-elements (e.g., training records, SS&D evaluations, event reports); reviews relevant documentation; conducts staff discussions; and maintains a summary of the review for this indicator.

2. Coordinates the review of the indicator with other reviewers, if needed.
3. Informs the Team Leader of the team's findings throughout the onsite review.
4. Presents the team's findings to the Program at the staff exit meeting.
5. Completes their portion of the IMPEP report for the Sealed Source and Device Evaluation Program performance indicator.


6. Attends the Management Review Board meeting for the review and is prepared to discuss their findings (this can be done either in person or remotely).

V. GUIDANCE

A. Scope

This guidance applies to the three sub-elements to be reviewed under this indicator.

1. Evaluation of SS&D staffing and training should be conducted in a manner similar to, but not necessarily a part of, the Common Performance Indicator: Technical Staffing and Training, but focused on the training and experience necessary to conduct SS&D activities. The minimum qualifying criteria for SS&D staff authorized to sign registration certificates should be specified by the program and should be used in the review.

2. Review the technical quality of completed SS&D evaluations issued by the Agreement State or NRC for adequacy, accuracy, completeness, clarity, specificity, and consistency.
3. Reviews of SS&D incidents should be conducted in a manner similar to, but not necessarily a part of, the Common Performance Indicator: Technical Quality of Incident and Allegation Activities, to detect possible manufacturing defects and the root causes of these incidents. The incidents should be evaluated to determine if other products may be affected by similar problems. Actions and notifications to Agreement States, NRC, and others should be conducted as specified in the Office of Nuclear Material Safety and Safeguards (NMSS) State Agreements (SA) Procedure SA-300, Reporting Material Events.

4. This guidance specifically excludes SS&D evaluations of non-Atomic Energy Act materials (e.g., naturally occurring radioactive material (NORM)).

B. Evaluation Procedures

1. The principal reviewer should refer to MD 5.6, Part II, Performance Indicators, and Part III, Evaluation Criteria, Non-Common Performance Indicator: Sealed Source and Device Evaluation Program, for the SS&D evaluation program criteria. These criteria should apply to program data for the entire review period. A finding of satisfactory is appropriate when a review demonstrates the presence of the following conditions:
a. The SS&D program meets the criteria for a satisfactory finding for the performance indicator, Technical Staffing and Training, as described in Section III.B.1 of the MD 5.6 Directive Handbook.


b. Procedures compatible with NMSS Procedure SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program, are implemented and followed.
c. Concurrence review of the technical reviewer's evaluation is performed by management or staff having proper qualifications and training.
d. Product evaluations address health and safety issues; are thorough, complete, consistent, and of acceptable technical quality; and adequately address the integrity of the products under normal conditions of use and likely accident conditions.
e. Registrations clearly summarize the product evaluation and provide license reviewers with adequate information regarding licensed possession and use of the product.
f. Deficiency letters clearly state regulatory positions and are used at the proper time.
g. Completed registration certificates, and the status of obsolete registration certificates, are clear and are promptly transmitted to the Agreement States, NRC, and others, as appropriate.
h. The SS&D reviewers ensure that registrants have developed and implemented adequate quality assurance and control programs.
i. There is a means for enforcing commitments made by registrants in their applications and referenced by the program in the registration certificates.
j. No potentially significant health and safety issues linked to a specific product evaluation were identified during the review.
k. The SS&D reviewers routinely evaluate the root causes of defects and incidents involving the devices subject to the SS&D program and take appropriate actions, including modifications of the SS&D sheets and notifications to the Agreement States, NRC, and others, as appropriate.
2. The minimum training and qualification requirements for reviewers should be documented and compatible with MD 5.6, Part II, Non-Common Performance Indicator: Technical Staffing and Training. The reviewer should determine whether the training and experience of all SS&D personnel meet these or equivalent requirements.
a. For NRC, SS&D training and qualification requirements are documented in NRC Inspection Manual Chapter (IMC) 1248, Formal Qualification Programs for Federal and State Material and Environmental Management Programs.


b. Agreement States should have established, documented training and qualification requirements that are equivalent to NRC IMC 1248, or should have implemented Appendix A of NMSS Procedure SA-103, Reviewing the Common Performance Indicator, Technical Staffing and Training.
3. All SS&D evaluations completed since the last IMPEP review are candidates for review.
4. The reviewer should select a representative sample based on the number and the type of evaluations performed during the review period. The selected sample should represent a cross-section of the Agreement State's or NRC's completed evaluations and include as many different reviewers and categories (e.g., new registrations, amendments, inactivations, or reactivations) as practical (an illustrative selection sketch appears at the end of this subsection).
5. The reviewer should include any work performed on behalf of the program under review by others (i.e., an Agreement State, NRC, or a contractor) to ensure the technical quality of the work. The reviewer should also ensure that any individuals performing work on a program's behalf meet the program's training and qualification requirements.

NOTE: Because the work is being performed at the discretion of the program under review, any weaknesses or deficiencies that the review team identifies will affect the appropriate sub-element rating(s) and could ultimately affect the overall indicator rating for the program under review.

6. If the initial review indicates an apparent weakness on the part of a reviewer(s), or problems with respect to one or more type(s) of SS&D or event evaluations, additional samples should be reviewed to determine the extent of the problem or to identify a systematic weakness. The findings, if any, should be documented in the report. If previous reviews indicated a programmatic weakness in a particular area, additional casework in that area should be evaluated to assure that the weakness has been addressed.
7. The reviewer should determine whether or not a backlog exists, based on the criteria established by the program, and if the backlog has any impact on health and safety.
8. The review of incidents involving SS&Ds should be conducted in accordance with the guidance provided in Section V of NMSS Procedure SA-105, Reviewing the Common Performance Indicator, Technical Quality of Incident and Allegation Activities.


9. For Agreement States, the reviewer should also determine if the program has received notification from the NRC about potential generic SS&D issues discovered during trend analysis of Nuclear Material Events Database (NMED) events and identified in accordance with NRC Policy and Procedure Letter 1.57, NMSS Generic Assessment Process. The reviewer should determine: whether such notifications had been received under this process; the effectiveness of the State's response to these notifications; the adequacy of the response when compared to the actions that would reasonably be expected of other evaluation programs within the national program; and the program's effort to notify Agreement States and NRC of corrective actions through the issuance of a revised certificate.
10. In cases where an Agreement State may have SS&D evaluation authority but is not performing SS&D reviews, the reviewer should verify that the program has committed in writing to having an evaluation program, as described in Section (C)(2) of Part II, MD 5.6, in place before performing evaluations.
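The sampling approach in items 4 and 6 above can be pictured as stratified selection with targeted expansion when a weakness surfaces. The following is a minimal, illustrative Python sketch under assumed inputs: the casework records, their field names, and the per-stratum counts are hypothetical and are not prescribed by this procedure or MD 5.6.

```python
import random
from collections import defaultdict

# Hypothetical casework record: a dict with "certificate_no", "reviewer",
# and "action_type" (e.g., "new", "amendment", "inactivation", "reactivation").

def select_sample(casework, per_stratum=2, seed=None):
    """Pick a cross-section of completed actions that covers every
    reviewer/action-type combination present in the review period."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for record in casework:
        strata[(record["reviewer"], record["action_type"])].append(record)
    sample = []
    for records in strata.values():
        rng.shuffle(records)
        sample.extend(records[:per_stratum])  # a few cases from each stratum
    return sample

def expand_sample(casework, sample, weak_reviewer, extra=3, seed=None):
    """If the initial review suggests a weakness for one reviewer, pull
    additional cases by that reviewer to gauge the extent of the problem."""
    rng = random.Random(seed)
    remaining = [r for r in casework
                 if r["reviewer"] == weak_reviewer and r not in sample]
    rng.shuffle(remaining)
    return sample + remaining[:extra]
```

In practice the reviewer applies judgment rather than a fixed algorithm; the sketch only illustrates the idea of covering as many reviewers and action categories as practical and then deepening the sample where issues appear.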

C. Review Guidelines

1. The responses to questions relevant to this indicator in the IMPEP questionnaire should be used to focus the review.
2. The reviewer should be familiar with the latest revision of NUREG-1556, Vol. 3, which provides guidance for SS&D evaluations.
3. Any issues identified in the last IMPEP review should be resolved in accordance with Section V.H.4, NMSS Procedure SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP).
4. For SS&D evaluations, the reviewer should evaluate the following:
a. Technical correctness with regard to all aspects of evaluations. The checklist in the latest revision of NUREG-1556, Vol. 3, or an equivalent document, may be used to verify the full range of considerations;
b. Completeness of applications and proper signature by an authorized official;
c. Records documenting significant errors, omissions, deficiencies, or missing information (e.g., documents, letters, file notes, and telephone conversation records), and whether the decision-making process, including any significant deficiencies related to health and safety noted during the evaluation, is adequately documented in the records;
d. The adequacy of the limitations and/or other considerations of use;


e. The conduct of the concurrence review, as defined in the Glossary of MD 5.6;
f. Acceptance of variances or exceptions to industry standards in accordance with NUREG-1556, Vol. 3, or equivalent guidance;
g. Guidance, checklists, regulations, and policy memoranda to ensure consistency with current accepted practice, standards, and guidance;
h. Appropriate use of signature authority for the registration certificates.
i. Thorough technical evaluations of the SS&D designs are essential to ensure that the SS&Ds will maintain their integrity and that the design is adequate to protect public health and safety. NUREG-1556, Volume 3, Consolidated Guidance About Materials Licenses: Applications for Sealed Source and Device Evaluation and Registration, provides information on conducting the SS&D reviews and establishes useful guidance for IMPEP teams. Under this guidance, three sub-elements (Technical Staffing and Training, Technical Quality of the Product Evaluation Program, and Evaluation of Defects and Incidents Regarding SS&Ds) are evaluated to determine if the SS&D program is satisfactory.

Agreement States with authority for SS&D evaluation programs that are not performing SS&D reviews are required to commit in writing to having an SS&D evaluation program in place before performing evaluations.

The following sub-elements will be considered when determining if the SS&D evaluation program is adequate:

(i) Technical Staffing and Training

(1) Evaluation of the SS&D program staffing and training should be conducted in the same manner as the evaluation conducted with respect to Common Performance Indicator 1 (refer to Section II.B.1 of this handbook).

(2) The SS&D program evaluation by the IMPEP review team will focus on training and experience commensurate with the conduct of the SS&D reviews as described in IMC 1248 or compatible Agreement State procedure.

(ii) Technical Quality of the Product Evaluation Program

Adequate technical evaluations of the SS&D designs are essential to ensure that the SS&Ds used by both licensees and persons exempt from licensing will maintain their integrity and that the design features are adequate to protect public health and safety. The technical quality of the product evaluation program should be assessed by the IMPEP review team on the basis of an in-depth review of a representative cross-section of evaluations performed on various types of products and actions. To the extent possible, the review team should capture a representative cross-section of completed actions by each of the Agreement State or NRC SS&D reviewers.

(iii) Evaluation of Defects and Incidents Regarding SS&Ds Reviews of the SS&D incidents should be conducted in the same manner as the evaluation conducted by the IMPEP review team with respect to Common Performance Indicator 5 (refer to Section II.B.5 of this handbook) to detect possible manufacturing defects and the root causes for these incidents. The incidents should be evaluated to determine if other products may be affected by similar problems. Appropriate action should be taken and notifications made to the Agreement States, NRC, and others, as appropriate, in a timely manner.

D. Review Information Summary

The summary maintained by the reviewer for preparation of the final report will include, at a minimum:

1. The applicant's name;
2. The registration certificate number;
3. The type of action (e.g., new registration, amendment, inactivation, or reactivation);
4. The date of issuance;
5. The SS&D type; and
6. A narrative of any comments.

The summary of review information does not appear in the final report. However, it is a good practice for the reviewer to maintain this information to support the reviewer's presentation to the Management Review Board (MRB). A minimal record structure along these lines is sketched below.
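As an illustration only, the minimum summary information in items 1 through 6 above could be captured in a simple record structure such as the following Python sketch. The field names and example values are hypothetical; the procedure prescribes only the minimum content, not any particular format.

```python
from dataclasses import dataclass

@dataclass
class ReviewSummaryEntry:
    """One line of the reviewer's working summary (Section V.D).
    Field names are illustrative, not prescribed by the procedure."""
    applicant_name: str
    certificate_number: str
    action_type: str       # e.g., "new registration", "amendment",
                           # "inactivation", or "reactivation"
    date_of_issuance: str  # date the action was issued
    ssd_type: str          # type of sealed source or device
    comments: str = ""     # narrative of comments, if any

# Example entry with placeholder values:
entry = ReviewSummaryEntry(
    applicant_name="Example Manufacturer, Inc.",
    certificate_number="XX-0000-D-000-S",
    action_type="amendment",
    date_of_issuance="2019-06-01",
    ssd_type="gauge device",
)
```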

E. Discussion of Findings with the Agreement State's Radiation Control Program or NRC


1. The IMPEP team should follow the guidance in SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), for discussions of technical findings with inspectors, supervisors, and management. If performance issues are identified by the reviewer(s) that lead to programmatic weaknesses, the reviewer(s) should seek to identify the root cause(s) of the issues which can be used as the basis for developing recommendations for corrective actions. As noted in Section II.A.3, SA-100 contains criteria regarding the development of recommendations by the IMPEP team.
2. In terms of general guidance for the IMPEP review team: a finding of "satisfactory" should be considered when none, or only a few or a small number, of the cases or areas reviewed involve performance issues/deficiencies (e.g., inspection, licensing, staffing); an "unsatisfactory" finding should be considered when a majority or a large number of the cases or areas reviewed involve performance issues/deficiencies, especially if they are chronic, programmatic, and/or of high-risk significance; and a finding of "satisfactory, but needs improvement" should be considered when more than a few or a small number of the cases or areas reviewed involve performance issues/deficiencies in high-risk-significant regulatory areas, but not to such an extent that the finding would be considered unsatisfactory. This decision logic is illustrated in the sketch below.
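For illustration, the qualitative rating guidance above can be read as a simple decision rule. The Python sketch below uses assumed numeric thresholds, since MD 5.6 and this procedure give only qualitative criteria ("a few," "a majority"); it sketches the logic and is not an authoritative scoring method, and the final determination remains a team judgment.

```python
def suggest_finding(cases_reviewed, cases_with_issues, high_risk_issues,
                    chronic_or_programmatic=False):
    """Map the qualitative rating guidance onto illustrative thresholds.
    The fractions and counts below are assumptions, not values from MD 5.6."""
    if cases_reviewed == 0:
        raise ValueError("no cases reviewed")
    issue_rate = cases_with_issues / cases_reviewed
    # "A majority or a large number" of cases with issues, especially
    # chronic or programmatic ones, points toward unsatisfactory.
    if issue_rate >= 0.5 or (chronic_or_programmatic and issue_rate >= 0.25):
        return "unsatisfactory"
    # "More than a few" issues in high-risk-significant regulatory areas,
    # short of the unsatisfactory threshold.
    if high_risk_issues >= 3:
        return "satisfactory, but needs improvement"
    # "None or only a few" issues across the cases reviewed.
    return "satisfactory"

# Example: 2 issues in 20 cases, 3 of them in high-risk areas (hypothetical).
print(suggest_finding(20, 2, 3))  # -> "satisfactory, but needs improvement"
```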

VI. APPENDIX

Appendix A - Examples of Less than Satisfactory Programs

VII. REFERENCES

1. Management Directives (MD) available at https://scp.nrc.gov.
2. NMSS SA Procedures available at https://scp.nrc.gov.
3. NUREG-1556, Volume 3, Rev. 1, Consolidated Guidance About Materials Licenses: Applications for Sealed Source and Device Evaluation and Registration.

4. Policy and Procedure Letter 1.57, NMSS Generic Assessment Process.

VIII. AGENCYWIDE DOCUMENTS ACCESS AND MANAGEMENT SYSTEM (ADAMS) REFERENCE DOCUMENTS

For knowledge management purposes, all previous revisions of this procedure, as well as associated correspondence with stakeholders, that have been entered into ADAMS are listed below.

No. | Date | Document Title/Description | Accession Number
1 | 2/27/04 | STP-04-011, Opportunity to Comment on Draft STP Procedure SA-108 | ML061640162
2 | 6/20/05 | STP Procedure SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program, Redline/Strikeout Version | ML061640169
3 | 6/20/05 | Summary of Comments on SA-108 | ML061640173
4 | 6/20/05 | STP Procedure SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program | ML040620291
5 | 6/30/05 | STP-05-049, Final STP Procedure SA-108 | ML051810473
6 | 7/14/09 | FSME-09-051, Opportunity to Comment on Draft Revision of FSME Procedures SA-108 and SA-109 | ML091330602
7 | 7/14/09 | FSME Procedure SA-108, Draft Revision with tracked changes | ML091330103
8 | 1/22/10 | Final FSME Procedure SA-108 | ML092740005
9 | 1/22/19 | FSME Procedure SA-108, Resolution of Comments | ML092740069
10 | 1/22/19 | FSME Procedure SA-108, Draft Revision with tracked changes | ML092740014
11 | 12/20/19 | Interim Procedure SA-108: Reviewing the Non-Common Performance Indicator, Sealed Source & Device Evaluation Program | ML19324F028

Appendix A

EXAMPLES OF LESS THAN SATISFACTORY FINDINGS OF PROGRAM PERFORMANCE

NOTES:

The effectiveness of a program is assessed through the evaluation of the criteria listed in Section III, Evaluation Criteria, of MD 5.6. These criteria are NOT intended to be exhaustive but provide a starting point for the IMPEP review team to evaluate this indicator. The review team should also take into consideration other relevant mitigating factors that may have an impact on the program's performance under this performance indicator. The review team should consider a less than satisfactory finding when the identified performance issue(s) is/are programmatic in nature and not isolated to one aspect, case, individual, etc., as applicable.

This list is not all-inclusive and will be maintained and updated in the IMPEP Toolbox on the state communications portal at https://scp.nrc.gov.

The following are examples of review findings that resulted (or could result) in a program being found "satisfactory, but needs improvement" for this indicator.

TECHNICAL STAFFING AND TRAINING

1. The team found that the program did not have sufficient qualified staff to complete the SS&D reviews in a timely manner. The program had only one reviewer qualified to conduct the sealed source and device evaluations, and a qualified manager to conduct the concurrence reviews. The one qualified reviewer was also responsible for other activities and had only a limited amount of time to spend on the reviews. As a result, the reviews were not processed in a timely manner and were rushed, resulting in errors in the reviews performed. However, no health and safety issues were identified with the reviews. This has cross-jurisdictional health and safety implications.
2. During the review period, the program hired technical review staff that did not have the scientific or technical backgrounds that would equip them to receive technical training related to the review of SS&Ds. As a result, an evaluation for a device was issued without reviewing all of the technical features associated with the design and integrity of the device.

SEALED SOURCE AND DEVICE PROGRAM

1. The team found cases where the SS&D evaluations reviewed did not address the integrity of the products and important health and safety concerns with respect to thoroughness, completeness, consistency, clarity, technical quality, and adherence to existing guidance in product evaluations. Specifically, the evaluations did not fully address deficiencies with prototype testing. As a result, sealed sources and devices containing radioactive material were approved that did not demonstrate the product would maintain its integrity during normal use and likely accident conditions. Making this determination is essential when deciding whether to approve a sealed source or device.
2. The program had 10 defect and incident events involving devices subject to the SS&D program, all related to a particular irradiator. The program did not fully evaluate the root causes of all defects and incidents involving devices subject to the SS&D program. Specifically, the program did not evaluate 3 of the events related to the design defect issue of the irradiator, including a root cause evaluation. As a result, the staff did not determine whether the incidents were generic and would require either a design change to the device or a notification to users to make them aware of a potential safety concern. Additionally, the program was unable to demonstrate that it had a process to evaluate defects and incidents.

The following are examples of review findings that resulted (or could result) in a program being found "unsatisfactory" for this indicator.

TECHNICAL STAFFING AND TRAINING

1. The team found that the program did not have qualified staff to complete the SS&D reviews in a timely manner. The program had no qualified reviewers to conduct the sealed source and device evaluations; however, evaluations were being performed. As a result, the reviews were not adequately performed, and an integrity concern resulted for one of the devices approved.
2. During the review period, the number of qualified SS&D reviewers decreased from 10 to 3. The program currently does not have enough qualified reviewers to handle the typical SS&D volume. As a result, actions have been completed without a concurrence review of the technical reviewer's evaluation.
3. The program's SS&D training program does not meet most of the criteria of IMC 1248 and NMSS Procedure SA-103 for SS&D reviewers. The training program did not fully address directed review of selected SS&D casework, regulatory requirements, and industry codes and standards, as needed to meet the criteria of IMC 1248.

SEALED SOURCE AND DEVICE PROGRAM

1. The team found, in most of the cases reviewed, that the SS&D evaluations did not address the integrity of the products and important health and safety concerns with respect to thoroughness, completeness, consistency, clarity, technical quality, and adherence to existing guidance in product evaluations. Specifically, evaluations did not fully address deficiencies with prototype testing. As a result, sealed sources and devices containing radioactive material were approved that did not demonstrate the product would maintain its integrity during normal use and likely accident conditions. Making this determination is essential when deciding whether to approve a sealed source or device.

2. The program had 10 defect and incident events involving devices subject to the SS&D program, all related to a particular irradiator design. The program did not fully evaluate the root causes of all defects and incidents involving devices subject to the SS&D program. Specifically, the program did not evaluate 9 of the events related to the design defect issue of the irradiator, including a root cause analysis. As a result, sealed sources and devices containing radioactive material were approved that did not demonstrate the product would maintain its integrity during normal use and likely accident conditions. Making this determination is essential when deciding whether to approve a sealed source or device. Additionally, the program was unable to demonstrate that it had a process to evaluate defects and incidents.
