ML20244A280

State Agreement (SA) Procedure 108 Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program
Issue date: 09/15/2020
From: Stephen Poy, NRC/NMSS/DMSST
To: Poy S
Shared Package: ML20183A179
References: SA-108



Office of Nuclear Material Safety and Safeguards
Procedure Approval

Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program
State Agreements (SA) Procedure SA-108

Issue Date: September 15, 2020
Review Date: September 15, 2025

Kevin Williams, Director, NMSS/MSST (digitally signed 09/08/2020)
Lizette Roldan-Otero, Ph.D., Acting Branch Chief, NMSS/MSST/SALB (digitally signed 09/08/2020)
Stephen Poy, Procedure Contact, NMSS/MSST/SALB (digitally signed 09/08/2020)
David Crowley, Chair, Organization of Agreement States

NOTE: Any changes to the procedure will be the responsibility of the NMSS Procedure Contact. Copies of NMSS procedures are available through the NRC Web site at https://scp.nrc.gov.

I. INTRODUCTION

This document describes the procedure for conducting reviews of Agreement State and U.S. Nuclear Regulatory Commission (NRC) radiation control programs as specified in NRC Management Directive (MD) 5.6, Integrated Materials Performance Evaluation Program (IMPEP).

II. OBJECTIVES

To verify the adequate implementation of the three sub-elements under this indicator:

A. Technical Staffing and Training,
B. Technical Quality of the Product Evaluation Program, and
C. Evaluation of Defects and Incidents Regarding Sealed Source & Devices.

III. BACKGROUND

Adequate technical evaluations of Sealed Source & Device (SS&D) designs are essential to ensure that SS&Ds will maintain their integrity and that the design is adequate to protect public health and safety. NUREG-1556, Volume 3, Consolidated Guidance About Materials Licenses: Applications for Sealed Source and Device Evaluation and Registration, provides information on conducting SS&D reviews and establishes useful guidance for review teams. The three sub-elements noted above are evaluated to determine if the SS&D program is satisfactory. Agreement States with authority for SS&D evaluation programs that are not performing SS&D reviews are required to commit in writing to having an SS&D evaluation program in place before performing evaluations.

IV. ROLES AND RESPONSIBILITIES

A. IMPEP Review Team Leader (Team Leader)

1. In coordination with the IMPEP Program Manager, the Team Leader determines which team member is assigned lead review responsibility for this performance indicator.
2. Communicates the team's findings to Program Management and ensures that the team's findings are in alignment with MD 5.6.
3. This procedure allows for the option not to review an SS&D Evaluation Program that has not performed any evaluations since the last IMPEP review, provided there have been no changes or issues since the last IMPEP review that would impact the safety of the SS&Ds within the Program's oversight.

B. SS&D Reviewer

1. Selects documents for review for each of the three sub-elements (e.g., training records, SS&D evaluations, event reports), reviews relevant documentation, conducts staff discussions, and maintains a summary of the review for this indicator.

2. Coordinates the review of the indicator with other reviewers, if needed.
3. Informs the Team Leader of the team's findings throughout the onsite review.
4. Presents the team's findings to the Program at the staff exit meeting.
5. Completes their portion of the IMPEP report for the Sealed Source and Device Evaluation Program performance indicator.
6. Attends the Management Review Board (MRB) meeting for the IMPEP review; presents and discusses the team's findings for the Sealed Source and Device Evaluation Program performance indicator (this can be done either in person or remotely).

V. GUIDANCE

A. Scope

Guidance applies to the three sub-elements to be reviewed under this indicator.

1. Evaluate the SS&D staffing and training in a manner similar to, but not necessarily a part of, the Common Performance Indicator: Technical Staffing and Training, but focused on the training and experience necessary to conduct SS&D activities. The minimum qualifying criteria for SS&D staff authorized to sign registration certificates should be specified by the program and should be used in the review.
2. Review the technical quality of completed SS&D evaluations for adequacy, accuracy, completeness, clarity, specificity, and consistency of the evaluations issued by the Agreement State or NRC.
3. Review the SS&D incidents in a manner similar to, but not necessarily a part of, the Common Performance Indicator: Technical Quality of Incident and Allegation Activities, to detect possible manufacturing defects and the root causes of these incidents. The incidents should be evaluated to determine if other products may be affected by similar problems. Actions and notifications to Agreement States, NRC, and others should be conducted as specified in the Office of Nuclear Material Safety and Safeguards (NMSS) State Agreements (SA) Procedure SA-300, Reporting Material Events.
4. Review of SS&D evaluations of non-Atomic Energy Act materials (e.g., naturally occurring radioactive material (NORM)) will be specifically excluded.

B. Review Guidelines

1. Evaluate the response generated by the Program to relevant questions in the IMPEP questionnaire. Depending on the level of detail of the information provided, the response to the questionnaire relative to this indicator may be useful to focus the review.
2. Identify any issues in the last IMPEP review that should be resolved in accordance with Section V.H.4 of NMSS Procedure SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP).
3. All SS&D evaluations completed since the last IMPEP review are candidates for review.
4. For SS&D evaluations, the reviewer should evaluate the following:
a. Select a representative sample based on the number and the type of evaluations performed during the review period. The selected sample should represent a cross-section of the Agreement State's or NRC's completed evaluations and include as many different reviewers and categories (e.g., new registrations, amendments, inactivations, or reactivations) as practical.
b. Select work performed on behalf of the program under review by others (i.e., an Agreement State, NRC, or a contractor) to ensure the technical quality of the work. The reviewer should also ensure that any individuals performing work on a program's behalf meet the program's training and qualification requirements.

NOTE: Because the work is being performed at the discretion of the program under review, any weaknesses or deficiencies that the review team identifies will affect the appropriate sub-element rating(s) and could ultimately affect the overall indicator rating for the program under review.

c. If the initial review indicates an apparent weakness on the part of SS&D personnel, or problems with respect to one or more type(s) of SS&D or event evaluations, review additional samples to determine the extent of the problem or to identify a systematic weakness. The findings, if any, should be documented in the report. If previous reviews indicated a programmatic weakness in a particular area, additional casework in that area should be evaluated to assure that the weakness has been addressed.

5. Determine whether a backlog exists, based on the criteria established by the program, and whether the backlog has any impact on health and safety.
6. Review the technical correctness of all aspects of evaluations. The checklist in the latest revision of NUREG-1556, Volume 3, or an equivalent document, may be used to verify the full range of considerations.


7. Review the completeness of applications and proper signature by an authorized official.
8. Review records (e.g., documents, letters, file notes, and telephone conversation records) to identify significant errors, omissions, deficiencies, or missing information. Verify that the decision-making process, including any significant deficiencies related to health and safety, is noted during the evaluation and adequately documented in the records.
9. Review the adequacy of the limitations and other considerations of use.
10. Evaluate the method used for the concurrence review with regard to SS&D reviewers who may not be fully qualified SS&D reviewers and may not have full signature authority.
11. Review the acceptance of variances or exceptions to industry standards in accordance with NUREG-1556, Volume 3, or equivalent guidance.
12. Review the guidance, checklists, regulations, and policy memoranda to ensure consistency with current accepted practice, standards and guidance.
13. Ensure the appropriate use of signature authority for the registration certificates.
14. Ensure that thorough technical evaluations of the SS&D designs are conducted, because the SS&D design must be adequate to protect public health and safety. NUREG-1556, Volume 3, Consolidated Guidance About Materials Licenses: Applications for Sealed Source and Device Evaluation and Registration, provides information on conducting SS&D reviews and establishes useful guidance for IMPEP teams.
15. Reviews of Technical Staffing and Training should focus on the following:
a. Review the minimum training and qualification requirements for the Program's SS&D personnel. The qualifications should be documented and compatible with MD 5.6, Part II, Non-Common Performance Indicator: Technical Staffing and Training. The reviewer should determine whether the training and experience of all SS&D personnel meet these or equivalent requirements.
b. Agreement States should either have established, documented training and qualification requirements that are equivalent to NRC Inspection Manual Chapter (IMC) 1248, Formal Qualification Programs for Federal and State Material and Environmental Management Programs, or have implemented NMSS Procedure SA-103, Reviewing the Common Performance Indicator, Technical Staffing and Training.


c. NRC should follow the SS&D training and qualification requirements documented in NRC IMC 1248.
16. Reviews of Technical Quality of the Product Evaluation Program should focus on the following:
a. Review the technical evaluations of the SS&D designs to ensure that the SS&Ds used by both licensees and persons exempt from licensing will maintain their integrity and that the design features are adequate to protect public health and safety. The technical quality of the product evaluation program should be assessed by the IMPEP review team on the basis of an in-depth review of a representative cross-section of evaluations performed on various types of products and actions. To the extent possible, the review team should capture a representative cross-section of completed actions by each of the Agreement State or NRC SS&D reviewers.
17. Reviews of Evaluation of Defects and Incidents Regarding SS&Ds should focus on the following:
a. Review incidents involving SS&Ds in accordance with the guidance provided in Section V of NMSS Procedure SA-105, Reviewing the Common Performance Indicator, Technical Quality of Incident and Allegation Activities. This review is used to detect possible manufacturing defects and the root causes for these incidents with regard to potential generic SS&D issues. The incidents should be evaluated to determine if other products may be affected by similar problems. Appropriate action should be taken, and notifications made to the Agreement States, NRC, and others, as appropriate.

b. The reviewer will evaluate the Agreement State's response to notifications from the NRC regarding generic SS&D issues, with respect to: the effectiveness of the State's response to these notifications; the adequacy of the response when compared to the actions that would reasonably be expected of other evaluation programs within the national program; and the program's effort to notify Agreement States and NRC of the corrective actions by the issuance of a revised certificate.
c. Identify cases where an Agreement State has SS&D evaluation authority but is not performing SS&D reviews. The reviewer should verify that the program has committed in writing to having an evaluation program, as described in Section (C)(2) of Part II of MD 5.6, in place before performing evaluations.

C. Review Information Summary

The summary maintained by the reviewer for preparation of the final report will include, at a minimum:

1. The applicant's name;
2. The registration certificate number;
3. The type of action (e.g., new registration, amendment, inactivation, or reactivation);
4. The date of issuance;
5. SS&D Type; and
6. Narrative of the comments, if any.

The summary of review information will not appear in the final IMPEP report. However, it is good practice for the reviewer to maintain this information to support the reviewer's presentation to the MRB.
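For reviewers who choose to keep this summary electronically, the sketch below shows one minimal way to capture the six fields listed above as a simple record. This is illustrative only and not part of SA-108; the structure, field names, and example values are hypothetical and simply mirror items 1 through 6 of Section V.C.

```python
# Illustrative sketch only: nothing in SA-108 prescribes this format or these
# field names; they mirror the minimum summary items 1-6 of Section V.C.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ReviewSummaryEntry:
    applicant_name: str        # 1. The applicant's name
    certificate_number: str    # 2. The registration certificate number
    action_type: str           # 3. Type of action (e.g., new registration, amendment)
    date_of_issuance: str      # 4. The date of issuance
    ssd_type: str              # 5. SS&D type
    comments: List[str] = field(default_factory=list)  # 6. Narrative comments, if any


# Hypothetical example entry; real entries would come from the casework sampled.
entry = ReviewSummaryEntry(
    applicant_name="Example Manufacturer, Inc.",
    certificate_number="XX-0000-S-000-S",
    action_type="amendment",
    date_of_issuance="2019-06-30",
    ssd_type="sealed source",
    comments=["No deficiencies identified."],
)
print(entry)
```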

D. Evaluation Process

1. The principal reviewer should refer to MD 5.6, Part II, Performance Indicators, and Part III, Evaluation Criteria, Non-Common Performance Indicator: Sealed Source and Device Evaluation Program, for the SS&D evaluation program criteria. These criteria should apply to program data for the entire review period. A finding of satisfactory is appropriate when a review demonstrates the presence of the following conditions:

a. The SS&D program meets the criteria for a satisfactory finding for the performance indicator, Technical Staffing and Training, as described in Section III.B.1 of the MD 5.6 Directive Handbook.
b. Procedures compatible with NMSS Procedure SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program, are in place.
c. Concurrence review of the technical reviewer's evaluation is performed by management or staff having proper qualifications and training.
d. Product evaluations address health and safety issues; are thorough, complete, consistent, and of acceptable technical quality; and adequately address the integrity of the products under normal conditions of use and likely accident conditions.


e. Registrations clearly summarize the product evaluation and provide license reviewers with adequate information with regard to license possession and use of the product.
f. Deficiency letters clearly state regulatory positions and are used at the proper time.
g. Completed registration certificates, and the status of obsolete registration certificates, are clear and are promptly transmitted to the Agreement States, NRC, and others, as appropriate.
h. The SS&D reviewers ensure that registrants have developed and implemented adequate quality assurance and control programs.
i. There is a means for enforcing commitments made by registrants in their applications and referenced by the program in the registration certificates.
j. There are no potentially significant health and safety issues identified from the review that were linked to a specific product evaluation.

k. The SS&D reviewers routinely evaluate the root causes of defects and incidents involving the devices subject to the SS&D program and take appropriate actions, including modifications of the SS&D sheets and notifications to the Agreement States, NRC, and others, as appropriate.

Note: Examples of Less than Satisfactory Findings of Program Performance can be found in the IMPEP Toolbox on the State Communications Portal Web site. These examples may assist the reviewer in identifying less than fully satisfactory findings of a Program's performance.

E. Discussion of Findings with the Radiation Control Program

1. The reviewer should follow the guidance given in NMSS Procedure SA-100, Implementation of the Integrated Materials Performance Evaluation Program (IMPEP), for discussing technical findings with staff, supervisors, and management.
2. If the IMPEP review team identifies programmatic performance issues, the IMPEP review team should seek to identify the root cause(s) of the issues, which can be used as the basis for developing recommendations for corrective actions. NMSS Procedure SA-100 contains criteria regarding the development of recommendations by the IMPEP team.

VI. REFERENCES

Management Directives (MD) available at https://scp.nrc.gov

NMSS SA Procedures available at https://scp.nrc.gov

NUREG-1556, Volume 3, Consolidated Guidance About Materials Licenses: Applications for Sealed Source and Device Evaluation and Registration, available at https://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr1556/v3/

Policy and Procedure Letter 1.57, NMSS Generic Assessment Process (Agencywide Documents Access and Management System (ADAMS) Accession No. ML020170155)

NRC Inspection Manual Chapters available at https://www.nrc.gov/reading-rm/doc-collections/insp-manual/manual-chapter/

IMPEP Toolbox (e.g., examples of a less than satisfactory program) on the State Communications Portal Web site, available at https://scp.nrc.gov/impeptools.html

VII. ADAMS REFERENCE DOCUMENTS

For knowledge management purposes, all previous revisions of this procedure, as well as associated correspondence with stakeholders, that have been entered into ADAMS are listed below.

No. | Date | Document Title/Description | Accession Number
1 | 2/27/04 | STP-04-011, Opportunity to Comment on Draft STP Procedure SA-108 | ML061640162
2 | 6/20/05 | STP Procedure SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program, Redline/Strikeout Version | ML061640169
3 | 6/20/05 | Summary of Comments on SA-108 | ML061640173
4 | 6/20/05 | STP Procedure SA-108, Reviewing the Non-Common Performance Indicator, Sealed Source and Device Evaluation Program | ML040620291
5 | 6/30/05 | STP-05-049, Final STP Procedure SA-108 | ML051810473
6 | 7/14/09 | FSME-09-051, Opportunity to Comment on Draft Revision of FSME Procedures SA-108 and SA-109 | ML091330602
7 | 7/14/09 | FSME Procedure SA-108, Draft Revision with tracked changes | ML091330103
8 | 1/22/10 | Final FSME Procedure SA-108 | ML092740005
9 | 1/22/19 | FSME Procedure SA-108, Resolution of Comments | ML092740069
10 | 1/22/19 | FSME Procedure SA-108, Draft Revision with tracked changes | ML092740014
11 | 12/20/19 | Interim Procedure SA-108: Reviewing the Non-Common Performance Indicator, Sealed Source & Device Evaluation Program | ML19324F028
12 | 7/6/20 | Resolution of Comments | ML20188A165
13 | 9/15/20 | Final NMSS Procedure SA-108 | ML20244A280