ML24157A374

OLMC-330, Evaluation of the Operator Licensing Program, June 2024
Issue date: 07/03/2024
From: James Anderson, NRC/NRR/DRO/IOLB
References: OLMC-330

OLMC-330 June 2024 1

EVALUATION OF THE OPERATOR LICENSING PROGRAM

A. Purpose

To establish the procedures and guidelines necessary to evaluate the Operator Licensing Program and ensure that it consistently implements the requirements in Title 10 of the Code of Federal Regulations (10 CFR) Part 55, Operators' Licenses; the guidance in NUREG-1021, Operator Licensing Examination Standards for Power Reactors; and other policy documents.

B. Overview

Annual evaluation of the Operator Licensing Program typically involves audits of two written examinations and operating tests in each NRC region to ensure consistent quality, level of difficulty, administration, and grading. Examination audits are conducted in accordance with OLMC-320, Review of Initial Licensing Examinations. Guidelines for developing the annual audit schedule are in Section C.1 of this OLMC.

The evaluation also includes a detailed review of the operator licensing function of one regional office each year so that each region is reviewed once every 4 years in accordance with OLMC-310, Regional Office Review Procedure. Prior to the regional office review, the region performs a self-assessment of similar scope to the OLMC-310 review. The detailed reviews assess nine functional areas: (1) examination administrative requirements, (2) written examinations, (3) operating tests, (4) operator requalification program, (5) regional operations, (6) resource utilization, (7) regional and program office communications, (8) regional differences, and (9) cross-regional examination participation. Guidelines for scheduling the regional office review and selecting the team are in Section C.2 of this OLMC.

The results of examination audits, the regional office review and the regional self-assessment, and informal reviews (i.e., administrative reviews) are reviewed annually as discussed in Section C.3 of this OLMC to identify topics for training and potential changes to the examination standards.

This OLMC also provides guidelines for cross-regional examination participation and addressing regional differences in Section C.4.


C. Procedures

1. Examination Audits

a. The Operator Licensing and Human Factors Branch (IOLB) in NRR will develop a draft audit schedule for the upcoming year, typically by no later than mid-December of the current year. Examination audits will be performed for two examinations from each regional office each year; however, under extenuating circumstances, as determined by the IOLB branch chief in consultation with the Director of the Division of Reactor Oversight, only one examination review may be performed in a given year for a regional office. IOLB will use the examination schedule information in RRPS and the following guidelines to determine the examination samples for the coming year and assign reviewers:

- IOLB will limit its review to exams in which there are no members of IOLB serving as the chief examiner.

- IOLB will avoid auditing the same chief examiner and facility two years in a row unless other options are not available or practical. Each chief examiner should receive approximately the same number of reviews.

- Availability of the reviewers.

- Qualifications of the reviewers. The assigned reviewers must be qualified as an examiner for the technology of the plant where the examination will be administered.

- On-site examination reviews are preferred to remote reviews. At least one on-site audit will be performed during examination administration, preferably at the region that is scheduled for the regional office review.

b. Once IOLB has developed a draft audit schedule for the upcoming year, the schedule will be sent to the regional operator licensing branch chiefs for review.

c. The regional operator licensing branch chiefs may request selection of different examinations for review. Selection of another examination will be based on the region's justification for the request (e.g., already a large number of observers, resource strain) and current resource and scheduling availability.

d. When the audit schedule is finalized, the Chief, IOLB, will ensure it is provided to the assigned reviewers, regional branch chiefs, and chief examiners via email and posted on SharePoint.

Once finalized, the schedule may still be revised based on examination performance, scheduling conflicts, or other unforeseen circumstances. Revisions to the schedule will be provided via email and posted on the SharePoint site.

To minimize the burden on the facility licensee, the region and IOLB should avoid arranging for additional NRC observers to attend examinations scheduled for an onsite audit.

e. The Chief, IOLB, may identify focus areas for the reviewers at any time during the year.

f. Each examination review will be performed and documented in accordance with OLMC-320.

2. Regional Office Review

Generally, by no later than mid-December, the Chief, IOLB, will perform the following.

a. Identify the team leader for the next year's regional office review. The team leader will generally be a senior examiner from IOLB and will ensure that the office review is scheduled, conducted, and documented in accordance with OLMC-310.

b. Contact the appropriate regional branch chief to arrange an optimal time for the review and decide whether the review will be hybrid, fully in-person, or remote.

The date of the review should be selected to support the completion of interviews with the region's OL branch staff prior to the completion of the review. The office review may be performed fully on-site or in a hybrid manner (for example, the team members perform their tasks at their home office, and then the team lead, program office branch chief, and select team members travel to the regional office to complete the review). The type of review will be stated on the annual audit/office review schedule, as will the names of the team lead and other team members, once determined. Additionally, the office review may be performed remotely if circumstances exist that prevent travel to the region's office or otherwise preclude any in-person activities.

c. Identify IOLB staff and regional staff to participate in the next year's regional office review. When possible, the team will include an experienced chief examiner from another regional office and an operator licensing assistant (OLA) from IOLB or another regional office.


d. Request a DRO management representative to participate in the office review and update the DRO calendar with the dates of the office review.

3. Annual Review

a. Approximately mid-March each year (i.e., when all audits from the prior calendar year are complete), the Chief, IOLB, will assign one or more reviewers to read the audit reports, informal review (appeal) results, office review report from the previous year, and the region's self-assessment report.1 The purpose of the review is to identify strengths and weaknesses in the Operator Licensing Program, which may be used to identify (1) focus areas for upcoming audits and office reviews, (2) training topics for the biennial examiners' conference, and (3) necessary revisions to program guidance documents.

b. The reviewer will use the guidance in Appendix A to perform the review.

4. Cross-Regional Examination Participation Guidelines

a. Each regional office is encouraged to support another regional office's examination at least once per year as resources allow (refer to OLMC-120, National Examination Schedule). Ideally, the regional offices would rotate support to other regional offices from year to year.

b. If significant regional differences are identified by an examiner participating in an examination in a different region, then the examiner should prepare a report on interaction (ROI) for review and resolution in accordance with OLMC-160, Report on Interaction Process. The ROI should describe the difference(s), the potential impact on the Operator Licensing Program, and any recommendations for resolution.

c. Examiners are encouraged to share best practices identified in other regions with the rest of the examiners (e.g., by discussing them during the biweekly teleconference).

1 In accordance with OLMC-310, the team leader for the regional office review will obtain the region's self-assessment report as part of the records collected and reviewed during the office review.


Appendix A
ANNUAL REVIEW PERFORMANCE

1. Review all completed audit reports, informal review (appeal) results, the office review report, and the region's self-assessment report from the previous calendar year.

2. Categorize all observations, including strengths, to identify common themes or trends.

Generally, to classify similar observations as a theme, they must occur at least three times in at least two regions. Other criteria may be used to identify a theme, but the basis should be documented in the annual review memo. If all occurrences are identified in a single region, this would typically not be considered a theme for the purposes of the annual review memo.
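The general theme rule above can be expressed as a small check. This is an illustrative sketch only; the function name, the thresholds-as-parameters, and the list-of-region-labels data shape are assumptions for the example, not program requirements.

```python
from collections import Counter

def is_theme(observation_regions, min_total=3, min_regions=2):
    """Apply the general rule above: similar observations qualify as a
    theme when they occur at least three times in at least two regions.
    (Sketch; input is one region label per observation - an assumption.)"""
    by_region = Counter(observation_regions)
    return sum(by_region.values()) >= min_total and len(by_region) >= min_regions

# Three occurrences, but all in a single region: typically not a theme.
print(is_theme(["Region I", "Region I", "Region I"]))   # False
# Three occurrences spread across two regions: a theme.
print(is_theme(["Region I", "Region II", "Region I"]))  # True
```

As the text notes, other criteria may justify a theme with a documented basis; the thresholds here model only the default rule.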

a. The following categories may be useful in categorizing findings and observations but are not meant to be all-inclusive; additional categories may be identified as necessary.

i. Written Examination

1. Sample plan: appropriate examination overlap between RO and SRO sections, the record of rejected K/As provides appropriate reasons to reject the K/A, and test outline sampling requirements are met. The written examination has an appropriate balance of coverage.

2. Examiner comments: sufficient and appropriate comments from draft to final approved written examination on Form 2.3-5.

3. K/A mismatch: question meets the intent of the K/A. For multiple-part questions, all parts relate to the K/A.

4. Question appropriate for license level: only reactor operator (RO) level of knowledge on the RO portion of the examination, and at least one part of all senior reactor operator (SRO)-only questions tests SRO-level knowledge.

5. Question appropriate for assigned tier: questions meet the tier guidance in NUREG-1021. Tier 3 is not an extension of Tier 1 or 2.

6. Level of difficulty: questions are in a range from level 2 to 4 on a 1 - 5 point difficulty scale.

7. Appropriate use of open references: references do not provide a direct lookup to answer any question or eliminate a distractor for any question.

8. Psychometrics: includes NUREG-1021 ES-4.2, Sections C and D, flaws that are not contained in other categories (for example, backwards logic, cueing, subsets, etc.).

9. Question attendant information: the questions contain the required attendant information, such as distractor analysis and training and operating procedure references.


ii. Operating Test - JPMs

1. Onsite administration: for JPMs, includes any observations of note for an onsite audit, such as exam security issues and scheduling changes (for example, substituting a JPM not validated during the validation week).

2. Examiner comments: sufficient and appropriate comments that detail the changes from draft JPMs to the final approved JPMs on Form 2.3-3.

3. JPM outlines: JPM outlines meet the criteria in Form 3.2-1, Administrative Topics Outline, and Form 3.2-2, Control Room/In-Plant Systems Outline. JPMs have an appropriate balance of coverage.

4. JPMs appropriate for the license level: the scope and depth of coverage required is appropriate for the applicant's license level.

5. Level of difficulty: JPMs are in a range of 2 - 4 on a 1 - 5 point difficulty scale.

6. Alternate path criteria: there are 4 - 6 alternate path JPMs for ROs and SRO-Is and 2 - 3 alternate path JPMs for SRO-Us, with each designated alternate path JPM meeting the guidance of ES-3.2, Section E. JPMs designated as regular (non-alternate path) do not meet the alternate path criteria.

7. Task standard: task standards are sufficiently defined in accordance with ES-3.2, Section D.1.a.

8. Critical steps: pre-identified critical steps meet the definition of a critical step and are required to be accomplished at the right time to complete the task. There are no unidentified critical steps.

9. JPM guide attributes: the JPM guide is consistent between applicant handouts, examiner sheets, and procedure step references; contains the validation time and is marked time critical if applicable; and includes appropriate examiner cues.
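The quantitative criteria in items 5 and 6 above (difficulty 2 - 4, alternate-path counts by license level) are simple range checks. A minimal sketch follows; the license-level label strings and function names are assumptions for illustration, not part of the audit procedure.

```python
# Ranges from items 5 and 6 above. The label strings ("RO", "SRO-I",
# "SRO-U") are assumed identifiers for this example.
ALT_PATH_RANGE = {"RO": (4, 6), "SRO-I": (4, 6), "SRO-U": (2, 3)}
DIFFICULTY_RANGE = (2, 4)  # on a 1 - 5 point scale

def alt_path_count_ok(license_level: str, count: int) -> bool:
    """Check the alternate-path JPM count for a license level (item 6)."""
    lo, hi = ALT_PATH_RANGE[license_level]
    return lo <= count <= hi

def difficulty_ok(rating: int) -> bool:
    """Check that a JPM difficulty rating falls in the 2 - 4 range (item 5)."""
    lo, hi = DIFFICULTY_RANGE
    return lo <= rating <= hi

print(alt_path_count_ok("RO", 5))     # True
print(alt_path_count_ok("SRO-U", 4))  # False
print(difficulty_ok(1))               # False
```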

iii. Operating Test - Simulator Scenarios

1. Onsite administration: document any observations/findings during the administration, including schedule changes, use of a spare scenario, exam security issues, etc.

2. Examiner comments: sufficient and appropriate comments that detail the changes from draft scenarios to the final approved scenarios on Form 2.3-3.

3. Varied initial conditions: scenarios have varied initial conditions for realism and to minimize exam predictability and are documented on Form 3.3-1, Scenario Outline. (ES-3.3, Section A.2)

4. Varied transients and malfunctions: there is adequate balance of coverage per ES-2.3, Section C.4. Includes scenario overlap with the written exam and JPMs.


5. Number of transients and events for each applicant: Form 3.4-1, Events and Evolutions Checklist.

6. Scenario quantitative attributes: each applicant met the quantitative attributes in ES-3.3 and targets in ES-3.4, or the chief examiner properly documented the basis for acceptability.

7. Scenario guide attributes: Forms 3.3-1, Scenario Outline, and 3.3-2, Required Operator Actions, have the expected verifiable applicant actions, examiner cues, and procedure references. The guides include the critical task descriptions and are consistent between the two forms.

8. Critical task criteria: pre-identified critical tasks meet the criteria in ES-3.3, Section C, and there are no planned tasks meeting critical task criteria that are unidentified.

iv. Grading

1. Written exam grading sheets properly annotated: the applicant's missed questions on the written examination are clearly marked. The final grade, including any post-examination answer key changes, is accurately reflected on the applicant's answer sheets.

2. Appropriate documentation of performance deficiencies (PDs): PDs are documented and include the expected applicant action, the applicant's actual action, why the action is incorrect, and the potential consequences of the applicant's action. The documentation includes applicants' responses to follow-up questions, if necessary. SPDs and CPDs meet the guidance contained in ES-3.6 of NUREG-1021.

3. Appropriate assignment of PDs to rating factors: performance deficiencies are appropriately assigned to the correct rating factors with sufficient basis to document the assignment.

4. Accurate grading tabulation: grading is accurately recorded on Forms 5.1-2 and 5.1-3 and on the applicant's annotated written examination answer sheet.

5. Resolution of post-examination comments: the response to post-examination comments is appropriate, and the issues identified could not reasonably have been identified by the region prior to administration.

6. Changes to examination or answer key after administration: changes are appropriate and do not impact exam validity. Additionally, there was no opportunity for the region to identify the change prior to administration.

v. Other Documents and Records

1. Completeness of examination package in ADAMS: all required files are in the correct ADAMS exam package under the correct file names.

2. Appropriate ADAMS profiling: appropriate release date for examination packages, user access, and filing in the correct ADAMS folder.

3. Necessary signatures provided: all administrative forms have the required facility and NRC signatures.


4. Completeness of operator docket files: the operator docket files contain the as-administered examination copies of both the written and operating test sections. The marked-up answer sheet is included in the applicant's docket folder. Form 5.1-2 is included and appropriately filled out. (ES-5.1, Section A)

5. Completeness of the examination report: the report adequately documents applicant performance and the results, any examination security issues, and the resolution of post-examination comments. (OLMC-510)

3. After categorizing all findings and observations, determine if there are any themes or trends. Generally, to identify a theme, the category must have at least three occurrences in at least two regions. To identify trends, look at the last two years of data in each category and determine if there is an upward or downward trend.
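The two-year trend comparison described above can be sketched as follows. The data shape (per-category occurrence counts keyed by year) and the function name are assumptions for illustration only.

```python
def trend(counts_by_year):
    """Compare the last two years of occurrence counts for one category
    and report 'upward', 'downward', or 'flat'. (Sketch; the
    year-keyed dict input is an assumption.)"""
    # Sort by year and take the two most recent (year, count) pairs.
    (_, earlier), (_, later) = sorted(counts_by_year.items())[-2:]
    if later > earlier:
        return "upward"
    if later < earlier:
        return "downward"
    return "flat"

print(trend({2023: 4, 2024: 7}))  # upward
print(trend({2023: 7, 2024: 4}))  # downward
```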

4. Document themes (which can include strengths) and trends in a memo to the Chief, IOLB, with the regional OL chiefs on distribution, including any recommendations.

Although appeal results may contain personally identifiable information (PII), do not include any PII in the annual review report.

5. Place the review results memo in the ADAMS Main Library under the NRR/NRR-DRO/NRR-DRO/IOLB/Annual Review folder. Designate the document as Non-Publicly Available, with a sensitivity classification of B.1: Non-Sensitive. Check the NRC-Users = Viewer box and provide other access as needed (such as DRO Admins).

6. Inform the examiners of the availability of the document during the biweekly telephone call.