ML24347A124

NUREG 1021 Revision 12 Effectiveness Review Two Year Report 2024
Issue date: 12/31/2024
From: Maurin Scheetz
NRC/NRR/DRO/IOLB
References
NUREG-1021 Revision 12

NUREG-1021 Revision 12 Effectiveness Review Two-Year Report, September 2024

In May of 2022, the operator licensing program office (NRR/DRO/IOLB) initiated an effectiveness review of the major changes introduced in Revision 12 of NUREG-1021, Operator Licensing Examination Standards for Power Reactors, which went into effect on March 17, 2022. As a result of the Differing Professional Opinion (DPO) process and DPO-2021-002 (Agencywide Documents Access and Management System (ADAMS) Accession No. ML23059A216), the original effectiveness review plan was revised in August 2023 to include additional subtasks for the review of applicant and operator performance related to their knowledge of generic fundamentals topics. The updated effectiveness review plan is available in ADAMS at Accession No. ML23236A119.

The review focuses on the following major changes:

1. The addition of generic fundamentals questions on the written examination and the discontinuation of the standalone NRC Generic Fundamentals Examination (GFE)

2. The revised critical task methodology

3. The identification and documentation of critical and significant performance deficiencies on the simulator operating test

4. Licensee implementation and use of the National Academy for Nuclear Training education and experience guidelines in Revision 2 of ACAD-10-001, Guidelines for Initial Training and Qualification of Licensed Operators, as the standard set of eligibility requirements for accredited training programs (ML21144A141). Specifically, the review focuses on a new allowance in Revision 2 of ACAD-10-001 to accept SRO-instant applicants with non-engineering degrees. Facility licensees can now assess the degree content of related science degrees for acceptable amounts of engineering, math, chemistry, and physics content to determine if applicants can enter an initial license training program. Related to this change, NUREG-1021 Revision 12 no longer specifies that applicants must spend three months as an extra person on shift to be eligible for a senior reactor operator (SRO) or reactor operator (RO) license.

5. The restructuring of the NUREG into topic-based sections

The purpose of the effectiveness review is to monitor the application of new instructions and guidance in NUREG-1021, Revision 12, and determine if additional actions are needed, such as additional training or clarifications to support implementation of the major changes made in the revision. The review includes an assessment of the implementation and outcomes of each major change.

In September 2023, the program office prepared an interim report to share observations from the first year of Revision 12 examinations (ML23304A006) and made several adjustments to the scope of the effectiveness review because of these observations. Outstanding recommendations from the interim report are included in this report (i.e., considerations to improve guidance and instructions in NUREG-1021). After issuing the interim report, the staff collected examination data for a second year. In September 2024, the staff analyzed all data from the first two years of examinations under NUREG-1021 Revision 12 and compiled this two-year report. This report is organized by the eight tasks directed by the effectiveness review plan as follows:

1. Generic Fundamentals Integration (Task 1)

2. Generic Fundamentals Knowledge (Task 2)

3. Revised Critical Task Methodology (Tasks 3 and 4)

4. Critical Performance Deficiencies and Significant Performance Deficiencies (Tasks 3 and 5)

5. Eligibility Requirements (Task 7)

6. Restructured NUREG (Task 8)

7. Overall Impact of NUREG-1021 Revision 12 (Tasks 6 and 8)

This report shares the program office's findings and conclusions for effectiveness review plan task numbers 1, 3, 5, 7, and 8 based on information collected from the development, administration, and grading of initial licensing examinations during the first two years of Revision 12. The program office will continue to collect data and share findings in support of tasks 2, 4, and 6 through 2027.

Assessment Areas and Findings

Generic Fundamentals Integration

Task 1
o Review the integration of generic fundamentals topics into the site-specific examination and verify that the instructions in NUREG-1021 are being used as intended; identify any areas for improving clarity in the instructions/guidance related to these "new" questions.
o Evaluation Period: CY 2022 and CY 2023

Data Collected

Title 10 of the Code of Federal Regulations (10 CFR) 55.41, Written Examination: Operators, and 10 CFR 55.43, Written Examination: Senior Operators, require that the written operator licensing examinations for reactor operators (ROs) and senior reactor operators (SROs) include questions concerning various mechanical components, principles of heat transfer, thermodynamics, and fluid mechanics. These regulations also require that the written examinations address fundamentals of reactor theory, including the fission process, neutron multiplication, source effects, control rod effects, criticality indications, reactivity coefficients, and poison effects. Written examination questions that test knowledge of these concepts are referred to as generic fundamentals (GF) questions. GF questions originate from knowledge and ability (K/A) statements in Section 5, Components, and Section 6, Theory, of the vendor or technology-specific K/A catalog (herein referred to as the applicable K/A catalog).

To assess the integration of GF questions on the site-specific written examination and the adequacy of the instructions in NUREG-1021 for sampling generic fundamentals knowledge statements from the applicable K/A catalog, the staff reviewed written examination outlines and tracked the number of generic fundamentals questions for all the examinations administered during the first two years of NUREG-1021 Revision 12. According to the instructions in NUREG-1021, GF K/As are sampled in Tier 2 and Tier 4. Every examination must have six generic fundamentals questions in Tier 4 (all from Section 6 of the applicable K/A catalog), and, because of the random sampling method, there is a likelihood that a Components K/A is selected for one or more of the Tier 2 systems. This is used to test a concept about a component relevant to the selected system. In other words, some written examinations will have more than six GF questions if the random sampling process results in testing a Components K/A in Tier 2.

To monitor the quality of the GF questions, the staff reviewed results from 14 examinations chosen for an in-depth review via the operator licensing program office's program for the review of initial licensing examinations (also known as the exam audit program) in calendar years 2022 and 2023. The exam audit program is conducted in accordance with Operator Licensing Manual Chapter (OLMC)-320 (ML22216A198). For the 14 examinations that were audited, IOLB examiners conducted a detailed review of each generic fundamentals question against the criteria in NUREG-1021, Form 2.3-5, Written Examination Review Worksheet, and observed how chief examiners documented their review of proposed questions against the same criteria.

The staff also monitored whether GF questions were involved in any applicant requests for informal NRC staff review during calendar years 2022 and 2023.

Results and Findings

For the first two years of NUREG-1021 Revision 12 examinations, the staff verified that every written examination included the new category of GF questions in accordance with NUREG-1021, examination standard (ES)-4.1. On average, there were 6.8 questions per examination testing generic fundamentals K/As (i.e., from Sections 5 and 6 of the applicable K/A catalog). The staff finds that the sampling instructions for determining how many GF questions are on the examination are adequate and are being followed properly during the development of the written examination outlines.

NUREG-1021 contains instructions to review the written examination for psychometric and job content flaws; these are flaws that impact the quality of GF questions. All GF questions were reviewed on 14 examinations from 2022 and 2023 as part of the operator licensing program office's exam audit program. Nine of the 96 GF questions audited, or approximately 9% of the GF questions in the sample, contained psychometric or job content flaws. Psychometric and job content flaws were also observed on questions in other sections of the written examinations, and the audited examinations remained valid for licensing purposes. Three notable observations were collected from the staff's review of the quality of GF questions:

1. An audit of a 2022 examination found four of six GF questions were taken from the NRC GFE bank and not modified to be site specific.

2. In two separate exam audits, reviewers cited a similar concern regarding the operational validity of some K/A statements taken from Sections 5 and 6 of the applicable K/A catalog, especially when questions for these K/As originated from the NRC GFE bank. For example, one reviewer found that a question written for the theory K/A statement to define fission product poisons had a low level of difficulty (i.e., LOD 1) and questioned the operational validity of the question. In another example, the reviewer found that even after modification, an NRC GFE bank question was not operationally valid because the question stem provided plant status information, including a statement that net positive suction head (NPSH) is lowering for a given pump; the reviewer pointed out that there is no indication of NPSH in the main control room, which lowered the operational validity of this question. NUREG-1021 ES-4.2 section B.2.d states to ensure that Tier 4 questions focus on operationally valid knowledge of reactor theory and/or thermodynamics. Although the instructions in NUREG-1021 are adequate to develop operationally valid questions with an LOD greater than one, there is a need for additional scrutiny of questions derived from the NRC GFE bank.

3. Several GF questions that were derived from the NRC GFE bank contained psychometric flaws, or the question was not applicable to the specific plant. For example, a Tier 2 question that tested a component K/A concerning thermocouples was taken from the NRC GFE bank. The question tested what a thermocouple would display in the main control room during an open circuit condition. The NRC GFE bank question involved a simplified circuit taken from a National Academy for Nuclear Training student guide on sensors and detectors. During post-examination comments, the examination team learned that thermocouples at the site involve a more complex circuit, and during an open circuit condition, the site thermocouples respond differently than the one tested in the bank question. One NRC chief examiner provided feedback to the program office that they found the plausibility analysis for the GF questions submitted by licensees during recent examinations was low; they cautioned against using NRC GFE bank questions as is, because these questions were not developed using the specific instructions contained in the NUREG-1021 examination standards.

Because of these observations, the staff finds that additional scrutiny is needed when developing and reviewing questions that are derived from the NRC GFE bank. Exam developers and reviewers must thoroughly check that these questions meet the NUREG-1021 requirements for psychometrics and operational validity, and the questions need to be technically accurate for the specific site. If it is not possible to write a question that is operationally valid with a level of difficulty greater than one for the Section 5 or 6 K/A, the process in NUREG-1021 ES-4.2 for deviating from the approved outline (e.g., K/A rejection and reselection) should be used. The staff finds that the instructions for sampling and developing GF questions are adequate and that the instructions could be enhanced with additional guidance for applying further scrutiny to questions derived from the NRC GFE bank.

There were no requests for informal NRC staff review of GF questions related to an applicant written examination failure during CY 2022 and CY 2023. In summary, the staff finds that the integration of GF questions did not result in additional requests for informal staff reviews.

Recommendations

Because the review identified areas to improve GF questions, the staff recommends sharing the observations and findings above with both NRC examiners and industry examination developers at upcoming operator licensing conferences and public meetings. Specifically, NRC GFE bank questions need to be checked for psychometric flaws and operational validity before using them on the site-specific examination. Examples of how to make GF questions appropriate for use on the site-specific written examination are available in the Operator Licensing Program Feedback (OLPF) database (ML23101A094) as the OLPF titled Gen 61. The process for K/A replacement can be used when it is not possible to write an operationally valid question with a level of difficulty greater than one. The staff recommends adding a caution to a future revision of NUREG-1021 about using NRC GFE bank questions.

Conclusion

The program office concludes that generic fundamentals questions have been effectively integrated into the site-specific written examination and that the instructions in NUREG-1021 are adequate to develop generic fundamentals-style questions for use on the site-specific written examination. Task 1 is complete.

Generic Fundamentals Knowledge

Task 2
o Verify that generic fundamentals knowledge is being maintained through initial and continuing training programs.
o Evaluation Period: April 2022 - September 2023; this period is designated as the pre-evaluation period because most applicants for initial licenses took the NRC GFE.

Data Collected

The staff collected data on individual applicant performance on the GF questions, including whether those applicants previously took the NRC GFE, for the designated pre-evaluation period of April 2022 - September 2023. This data will be used for comparison with performance on GF questions over two subsequent evaluation periods that extend into September of 2027. This will help the program office capture any changes over time in GF question performance in the years following the discontinuation of the NRC GFE, which occurred in September of 2021.

To verify that GF knowledge is being maintained through continuing training programs, the staff reviewed trends from the NRC's Annual Reports on the Effectiveness of Training at Operating Power Reactors for CY18-21, CY22, and CY23 (ML22154A474, ML23151A696, and ML24142A265, respectively) and periodic operating experience communications to determine if any inspection findings or industry events could be attributed to a deficiency in generic fundamentals knowledge. The NRC's annual report on the effectiveness of training in the industry includes information about inspection findings associated with the cross-cutting aspect H.9 for training, reported industry events, and NRC observations of the training program accreditation process.

Results and Findings

Initial Training

To assess the level of GF knowledge during the pre-evaluation period (April 2022 through September 2023), the staff reviewed the performance of 625 applicants on their given set of GF questions across 56 initial licensing examinations developed using Revision 12. During the pre-evaluation period, 449 of 625 applicants (72%) previously took the NRC GFE; in other words, most applicants took the NRC GFE.

Table 1 below summarizes GF performance during the pre-evaluation period.

Table 1: Applicant and facility licensee performance on GF questions during the pre-evaluation period, April 2022 - September 2023 (sample size: 56 exams, 625 applicants)

Percent of applicants scoring <70%:                          10.7%
Percent of applicants scoring <50%:                           2.9%
Percent of facility examinations with average scores <70%:    1.7%

Average performance on the generic fundamentals questions was 87.5% correct with a standard deviation of 14%.

The staff looked for any difference in average performance between applicants that took the NRC GFE and applicants that did not, and found no significant difference. Applicants that previously took the NRC GFE had slightly higher scores on the GF questions (M = 88%, SD = 13%) than those that had not previously taken the NRC GFE (M = 86%, SD = 15%); however, a t-test found no statistically significant difference between the two groups, t(622) = 1.56, p = .117.
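The group comparison above can be sketched as an independent-samples t-test with pooled variance. The following Python sketch is illustrative only: it uses hypothetical, randomly generated score samples with roughly the reported group statistics, not the staff's actual applicant data, and the group sizes (449 GFE takers, 176 non-takers) are derived from the 449-of-625 figure above.

```python
# Illustrative only: hypothetical score samples with roughly the reported
# group statistics (M = 88%, SD = 13% vs. M = 86%, SD = 15%).
import math
import random

random.seed(0)
gfe_takers = [min(1.0, max(0.0, random.gauss(0.88, 0.13))) for _ in range(449)]
non_takers = [min(1.0, max(0.0, random.gauss(0.86, 0.15))) for _ in range(176)]

def pooled_t(a, b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    mean_a, mean_b = sum(a) / na, sum(b) / nb
    var_a = sum((x - mean_a) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (nb - 1)
    pooled_var = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    t = (mean_a - mean_b) / math.sqrt(pooled_var * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

t_stat, df = pooled_t(gfe_takers, non_takers)
print(f"t({df}) = {t_stat:.2f}")
```

With the staff's real data, a t statistic of 1.56 at these degrees of freedom corresponds to p = .117, above the conventional .05 threshold, which is what supports the report's conclusion of no significant difference.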

Out of 625 applicants, 23 applicants (approximately 3.7%) answered half or fewer of their GF questions correctly, and during this time, only one facility licensee examination had an average GF score of less than 70%. Thresholds for revisiting the sampling methodology for GF questions were not triggered during the pre-evaluation period.1 The program office will continue to monitor, in accordance with the effectiveness review plan, how GF knowledge is being maintained in initial license training programs.
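As a quick arithmetic check, the percentages quoted for the pre-evaluation period follow directly from the reported counts:

```python
# Figures transcribed from this report's pre-evaluation period data.
applicants_total = 625
took_gfe = 449        # applicants who previously took the NRC GFE
half_or_fewer = 23    # applicants answering half or fewer GF questions correctly

print(f"{took_gfe / applicants_total:.0%}")        # 72% previously took the GFE
print(f"{half_or_fewer / applicants_total:.1%}")   # ~3.7% scored at or below 50%
```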

Continuing Training

The staff identified one event described in an inspection report (ML23122A168) that involved generic fundamentals knowledge for reactivity. During a reactor startup on December 21, 2022, control room operators exceeded a plant procedure limit for start-up rate due to excessive control rod withdrawal. The licensee determined the direct cause to be inadequate monitoring of diverse indication and the failure to establish critical parameters. The licensee's corrective action referenced use of their systematic approach to training process to address the use of diverse indications and critical parameters and to improve oversight intervention. As a result, the program office associates aspects of this event with the following knowledge statements for criticality found in NUREG-1122, Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Pressurized Water Reactors, Section 6.1, Reactor Theory: 192008 Reactor Operational Physics: K1.01, K1.05, and K1.08 (ML20260H083). Since the operators involved in this event were licensed using examination standards that preceded NUREG-1021 Revision 12, this is being captured by the program office as a data point for Task 2 in assessing the use of licensed operator continuing training to maintain operator knowledge of GF topics.

Using the summary of findings related to training shared in the NRC's annual Report on the Effectiveness of Training at Operating Power Reactors for CY21, CY22, and CY23, the staff reviewed each finding to determine if any could be attributed to a weakness in GF knowledge; no findings could be attributed to a weakness in GF knowledge. This is summarized in Table 2 below.

1 Thresholds to reexamine the sampling method for GF questions are specified in the revised NUREG-1021 Revision 12 Effectiveness Review Plan (ML23236A119).

Table 2: Reactor Oversight Program Findings Related to Training

Year   Total Findings   Findings Related to H.9   Percent Related to H.9   H.9 Findings Attributable to GF Knowledge
2023   466              6                         1.2%                     none
2022   425              6                         1.4%                     none
2021   278              7                         2.5%                     none

Additionally, because of DPO-2021-002 (ML23059A216), the NRR operator licensing staff was directed to conduct additional activities as a result of the change in NUREG-1021 Revision 12 to eliminate the NRC GFE and integrate GF questions onto the site-specific examination. First, the staff was directed to evaluate NRC inspectors' ability to monitor and disposition potential generic fundamentals knowledge weaknesses in licensed operators, should they occur. The staff determined that inspectors, particularly resident inspectors, needed training and updated guidance so that the NRC could adequately monitor licensed operator performance. The staff provided training to regional inspectors and all operator licensing examiners during the Spring 2023 counterpart meetings. The training informed inspectors about the changes in fundamentals testing for operators, how to identify potential knowledge deficiencies in licensed operators, and how to disposition potential issues. The staff recently revised Inspection Procedure 71111.11, Licensed Operator Requalification Program and Licensed Operator Performance (ML21257A202), to include guidance for inspectors to observe and follow up on any potential observations of weaknesses in operator knowledge of generic fundamentals, which are defined as reactor theory, thermodynamics, and understanding of how components such as valves, breakers, and controllers work; the revised inspection procedure will be issued on January 1, 2025. The inspection procedure guidance is in accordance with the Reactor Oversight Process and the Memorandum of Agreement Between the Institute of Nuclear Power Operations and the U.S. Nuclear Regulatory Commission (ML23026A093) and does not add any new processes or revise any inspection requirements.

The program office will continue to monitor inspection results and operating experience trends for any indication of deficiency in GF knowledge.

Revised Critical Task Methodology

Task 3
o Determine if the instructions for identifying critical tasks are clear and verify proper and consistent application of the critical task methodology; identify any areas for improving clarity in the instructions/guidance related to critical tasks.
o Evaluation period: April 2022 - April 2024

Data Collected

To determine if the instructions for identifying critical tasks are clear and to verify proper and consistent application of the critical task methodology, the staff reviewed results from examinations chosen for an in-depth review as part of the program office's exam audit program. The exam audit program is conducted in accordance with OLMC-320, IOLB Review of Initial Licensing Examinations (ML22216A198), and 14 initial licensing operating tests were audited for calendar years 2022 and 2023. For these 14 operating tests, the staff (1) conducted a detailed review of each critical task against the critical task methodology in ES-3.3, General Testing Guidelines for Dynamic Simulator Scenarios; (2) tracked the number of times that alternative boundary conditions (ABCs) were used; and (3) analyzed whether boundary conditions were arbitrary.

Results and Findings

Critical Task Methodology

Overall, the staff found that critical tasks met the critical task methodology in NUREG-1021 Revision 12, with areas for improvement in applying the CT methodology during operating test development and review. There were two occurrences of a task in a scenario guide that met the critical task criteria but was not marked in the scenario guide as a critical task. For example, the task "isolate a ruptured steam generator" is an Emergency Operating Procedure (EOP) directed action essential to the event's overall mitigative strategy; therefore, it should have been marked as critical in the simulator guides where it appeared. NRC examiners did not observe any performance deficiencies associated with this task during administration of the operating tests; thus, this is an administrative issue only for the exams where it occurred. The staff finds that CT identification is an area for improvement and recommends additional training be given to examination developers and reviewers so that critical tasks are not missed in future operating tests. For example, when stepping through the EOP(s) during scenario development, use a questioning attitude for the major steps of the EOP and compare them to the critical task methodology in ES-3.3 so that tasks that are critical are properly pre-identified in the simulator guide. This can also be verified during onsite validation week.

NUREG-1021 ES-3.3 section C.2 provides instructions for determining the measurable performance standard for a CT including the following:

Boundary conditions ensure that examiners have agreed on limits for what is acceptable for task completion and what constitutes task failure. When bounding CTs, in addition to defining how the task is met, it can be helpful to ask how an applicant or operator could fail the task.

NUREG-1021 ES-3.3 section C.2 also provides examples of ABCs, including the following type of ABC: exiting or transitioning from the procedure that first directs CT accomplishment.

The staff found two occasions where the boundary condition was not achievable, and therefore it was not possible for an applicant to fail the CT. For one CT, the boundary condition involved thermal-hydraulic instabilities, and the reviewer questioned whether the simulator would model these instabilities. The other boundary condition for a CT involved the completion of a step in an EOP. The task had to be performed before the crew completed the EOP step; however, the scenario was set up so that the crew could not accomplish the EOP step without completing the critical task first. In other words, it was not possible to fail the CT.

The instructions in NUREG-1021 for CT boundary conditions do not directly state that it must be possible to fail the CT during the scenario, nor do they direct that the boundary condition must be possible to meet or exceed during the scenario. The staff finds that the instructions in ES-3.3 could be improved to state that it must be possible to reach the boundary conditions during the operating test and fail the CT. NRC examiner training on this finding is also recommended.

Alternate Boundary Conditions

NUREG-1021 ES-3.3 section C.2 includes guidance for establishing boundary conditions for a CT and delineates two types of boundary conditions: preferred boundary conditions and alternative boundary conditions. The program office promulgated clarifying guidance about the proper use of alternative boundary conditions, including that the conditions must not be arbitrary (see OLPF 3.3.5).

The staff collected the following information, shown in Table 3, on the use of alternate boundary conditions (ABCs) from the 14 operating tests reviewed under the exam audit program:

Table 3: Alternate Boundary Conditions Found During Examination Audits in 2022 and 2023

Exam No.   Number of CTs with ABCs   Number of Arbitrary ABCs
1          3                         0
2          2                         1
3*         4                         4
4          1                         0
5          0                         N/A
6          2                         0
7          6                         1
8          1                         0
9          0                         N/A
10         4                         0
11         3                         0
12         0                         N/A
13         0                         N/A
14         1                         0

* This examination was developed before additional guidance and training about the proper use of alternate boundary conditions was provided to NRC examiners and facility licensees.

In general, an operating test consists of three to five scenarios with at least two CTs per scenario; therefore, an operating test includes a range of six to ten CTs. Based on the data from exam audits, the staff finds that ABCs are used less than 40% of the time. NUREG-1021 does not limit the number of ABCs that can be used for the operating test, but alternate boundary conditions are not the preferred type of boundary condition.
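The "less than 40%" figure can be bounded from the Table 3 data together with the typical CT count per operating test. The following tally is illustrative: the per-exam ABC counts are transcribed from Table 3, the six-to-ten CT range per test is the report's, and the exact CT denominator for each audited test is not given.

```python
# CTs bounded with an alternate boundary condition (ABC) on each of the
# 14 audited operating tests, transcribed from Table 3.
abc_cts_per_exam = [3, 2, 4, 1, 0, 2, 6, 1, 0, 4, 3, 0, 0, 1]

total_abc_cts = sum(abc_cts_per_exam)        # 27 ABC-bounded CTs in total
min_total_cts = 6 * len(abc_cts_per_exam)    # lower bound: 6 CTs per test -> 84
max_total_cts = 10 * len(abc_cts_per_exam)   # upper bound: 10 CTs per test -> 140

# Even against the smallest plausible CT count, ABC usage stays below 40%.
print(f"{total_abc_cts / max_total_cts:.0%} to {total_abc_cts / min_total_cts:.0%}")
```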

As shown in Table 3 above, arbitrary ABCs were identified on three examination audits; all were time-based ABCs without an obvious basis for the times chosen. In two cases, the basis for the ABC was documented as agreement by management without any additional information, such as why the cited time period was reasonable or whether the time period was calculated in some fashion. Another examination audit found an exam with four instances of arbitrary ABCs for critical tasks. On this operating test, the four ABCs were similar in that each specified critical task completion within a specified time period. Documentation for the basis of the selected time periods was missing, though there was a statement that the facility licensee examination developer and the NRC chief examiner agreed on the time-based ABCs. NUREG-1021, ES-3.3 allows the use of a time-based ABC; the time must be reasonable and agreed upon by the NRC chief examiner and the facility licensee. The NUREG does not explicitly state to document the basis for time-based ABCs on the simulator guide. The simulator guide should contain additional information about the basis for time-based ABCs; documenting the basis for time-based boundary conditions should increase clarity and consistency for CTs, which should also prevent the use of arbitrary boundary conditions.

Finally, during the transition period after NUREG-1021 Revision 12 was published, but before it was used for grading Revision 12 examinations, the program office received several questions about the new and revised instructions for critical task identification, the elements of critical tasks, and grading performance deficiencies. As a result, the program office promulgated additional clarification for the critical task methodology in two separate reports on interactions to answer these questions. The program office also published these clarifications in the OLPF database as the OLPFs numbered 3.3.1 through 3.3.5. These clarifications will be incorporated into NUREG-1021 as part of the normal NUREG revision process.

Recommendations

The program office has the following recommendations as a result of the findings for this task:

1. Provide training to NRC examiners on the findings and observations for application of the CT methodology identified in this section of the report. Communicate best practices for documenting ABCs on simulator operating test forms with NRC examiners.
2. On the next NUREG-1021 revision, add specific guidance and instructions in appropriate examination standards for the following:
a. Guidance for identifying all CTs in operating test scenarios. For example, when stepping through the EOP(s) during scenario development, use a questioning attitude for the major steps of the EOP and compare them to the critical task methodology in ES-3.3 so that critical tasks are properly pre-identified in the simulator guide. This can also be verified during onsite validation week.


b. Instruction that it must be possible for applicants to fail a CT; that is, the boundary conditions must be achievable, and it must be possible both to accomplish the CT and to fail the CT.

c. Instructions to document the basis for time-based boundary conditions.

d. Incorporate the clarifications found in OLPFs 3.3.1 through 3.3.5 into the next NUREG-1021 revision.

Conclusion

The program office concludes that the instructions for identifying critical tasks are adequate and that the CT methodology is generally being applied consistently, with some exceptions identified. The staff identified several areas for improving the instructions and guidance related to critical tasks. Task 3 is complete. The program office will continue to collect data and assess the application of the critical task methodology for AP1000 simulator operating tests under Task 4 of the review, which remains open until additional AP1000 operating tests are administered.

Critical Performance Deficiencies and Significant Performance Deficiencies

Tasks 3 and 5

o Verify proper and consistent application of grading of critical performance deficiencies; identify any areas for improving clarity in the instructions/guidance related to critical performance deficiencies.

o Verify that the instructions for identifying and grading significant performance deficiencies are clear and verify proper and consistent application of grading instructions; identify any areas for improving clarity in the instructions/guidance related to significant performance deficiencies.

o Evaluation period: April 2022 - April 2024

Data Collected
To verify proper and consistent grading of critical performance deficiencies (CPDs) and significant performance deficiencies (SPDs), the staff reviewed all documented CPDs and SPDs from the 75 initial licensing operating tests administered during the first two years of NUREG-1021 Revision 12 (April 2022 - April 2024). For each CPD and SPD, the staff reviewed the operating test comments on the affected applicant's Form 3.6-4 and compared the grading documentation to the instructions in NUREG-1021 ES-3.6, Grading and Documenting Operating Tests, section B. The staff reviewed all requests for informal staff reviews during this period to determine if any involved CPD- or SPD-related contentions.

The staff also reviewed feedback about the new and modified instructions in NUREG-1021 Revision 12 for CPDs and SPDs, received informally and through the program office's report-on-interaction process.

Results and Findings
There were 11 CPDs and 27 SPDs during the two-year period in which 75 operating tests were given. The staff found one instance in which the documentation of an applicant's CPD and related PDs diverged from the instructions in NUREG-1021: multiple performance deficiencies were combined and graded as a CPD because the consequences of the PDs resulted in the creation of a new CT. NUREG-1021 ES-3.6 B does not instruct the grader to combine PDs into a CPD. Instead, each PD should be treated separately and assessed against the criteria for an SPD or CPD. In this case, the program office agreed that the crew created a new critical task. The creation of this new critical task was the CPD by itself. The observed performance deficiency (the applicant's action or inaction) was that the applicant failed to move through EOPs in a timely manner to ensure the safety injection was terminated prior to meeting the entrance criteria for FR-P.1. The post-scenario CT is to terminate safety injection prior to meeting the entrance criteria for FR-P.1. It was a CT because it is an action essential to an event's overall mitigative strategy. The staff found that the applicant's competency-based scores should have been lower than those documented. The staff discussed this error with the cognizant region. As a result, the region initiated a report on interaction (ROI). Through that ROI, the program office had the opportunity to answer questions and provide feedback to NRC examiners on hypothetical CPD and SPD grading cases that involved multiple related PDs and more than one member of the crew.

From the review of the 27 SPDs assigned to applicants during the first two years of Revision 12 examinations, the staff found that in almost all cases, each documented SPD met the grading criteria, although the reviewer noted that sometimes it was not clear why the SPD was assigned to a specific competency rating factor. Additionally, at one facility, multiple applicants had similar SPDs related to the diagnosis of plant indications. Some of these SPD write-ups did not include the actual performance deficiency that was observed and instead focused more on how SPD criteria were met.

Because there were a few instances of incorrect application of the instructions in NUREG-1021 ES-3.6, the staff identified several enhancements to improve the instructions in NUREG-1021 for grading CPDs and SPDs.

There were no informal staff reviews associated with CPDs or SPDs during the period of this report.

Recommendations
The program office recommends enhancing the instructions in NUREG-1021 with the guidance provided in the ROI for grading CPDs and SPDs, and sharing the results of this task, including the specific CPD and SPD findings, with the NRC examiners at the next examiner training conference.

Conclusion
The program office concludes that the instructions for identifying and grading CPDs and SPDs are adequate and that corrective actions were taken to ensure consistent application of the guidance. Tasks 3 and 5 are complete.

Eligibility Requirements
Task 7

o Assess the use of new allowances in ACAD-10-001 for SRO-instant applicants to be eligible for licensing class by possessing a "related sciences degree." Assess any reductions to time spent on shift as under-instruction. Determine the impact, if any, of the changes made to eligibility requirements.

o Evaluation period: June 2022 - June 2024

Data Collected
In support of the effectiveness review, NRC chief examiners shared with the program office their observations of facility licensees' use of the new allowance for SRO-instant applicants to possess a related science degree as a prerequisite to entering initial license class, as well as of changes made to time spent on shift as under-instruction during initial training. Data for this task was collected from responses to the questionnaire that each chief examiner received approximately 4 weeks after exam administration.

One question on the questionnaire asked whether any applicants possessed a related science degree, as defined in ACAD-10-001, Guidelines for Initial Training and Qualification of Licensed Operators, Revision 2. ACAD-10-001 Revision 2 was required to be implemented by facility licensees with accredited training programs no later than May 31, 2022.

Therefore, information was collected from initial licensing examinations from June 2022-June 2024.

Another question on the questionnaire asked whether the facility licensee made a reduction to the number of hours that applicants complete as the extra-person-on-shift (EPOS) during initial license training. This second question was used to identify any inconsistencies in how licensees implement changes to EPOS hours. Under NUREG-1021, Revision 11, there was specific guidance that applicants should have a minimum of three months of EPOS. The change in Revision 12 to cite ACAD-10-001 as the primary source of eligibility criteria for training programs accredited by the National Nuclear Accreditation Board removed the NRC requirement for a specific amount of time for EPOS (i.e., three months, which equates to 480 - 540 hours depending on licensee work schedules). Operator Licensing Program Feedback No. 2.2.15 clarifies that:

absent a commitment to ensure applicants spend at least 3 months as an extra person on shift, stations with an accredited training program would need to determine the amount of time to schedule applicants as the extra person on shift using the systems approach to training process (e.g., by considering the time needed for applicants to successfully complete on-the-job training tasks and under instruction watches).

This data collection began in January 2023 and ended in June 2024; specifically, the staff tracked any reductions to the previous requirement of 3 months of EPOS time observed by chief examiners during reviews of operator licensing applications and initial licensing training program documents. Although a reduction in the number of EPOS hours alone does not indicate a deficiency or issue, the staff chose to note where the hours had been reduced because it would be indicative of a change to the licensee's training program. Changes (including reductions to EPOS hours) should be supported by the systems approach to training process.

Results and Findings

Related Science Degrees
From June 2022 through June 2024, 74 initial licensing examinations were administered for 883 applicants. In this sample, there were 15 observations of SRO-instant applicants possessing a degree meeting the related science categorization in ACAD-10-001; this is less than 2% of applicants. NRC chief examiners reported that facility licensees are making determinations about related science degrees in accordance with the guidance in ACAD-10-001. Examples of related science degrees observed include: a four-year automotive technology degree, a Bachelor of Science (B.S.) degree in energy systems technology, a B.S. in computer technology, a B.S. in mathematics, a B.S. in nursing, and an applicant with both an associate's degree in electrical engineering and a B.S. in technical management. Finally, each of the 15 applicants passed their licensing examination and received an SRO license.

As a result, the program office finds that use of this provision is infrequent (i.e., less than 2% of applicants over two years) and that facility licensees are following the guidance in ACAD-10-001 for making these determinations. Further, the staff did not observe an adverse impact to the licensing process for these applicants because each SRO-instant applicant with a related science degree passed their initial licensing examination.

The program office finds that facility licensee use of this new allowance is adequate.

Extra Person On Shift
In the 52 initial licensing examinations administered by the NRC from January 2023 to June 2024, the staff observed one reduction to the previous requirement for 3 months of extra-person-on-shift (EPOS) time. One facility licensee lowered their time requirement for EPOS from 3 months to 6 weeks or a minimum of 240 hours; this change was determined by the amount of time applicants needed to complete on-the-job training requirements on their qualification cards. Most applicants at this site had 270 hours recorded in their training record.

The staff also observed that a fleet utility updated fleet training procedures so that three months of EPOS time is built into the initial licensing training program and trainees are required to stand a certain number of under-instruction watches in licensed positions during the three-month period. EPOS time is tracked by the number of watches in each position instead of hours per applicant. This was done to allow flexibility for candidates should an unanticipated situation arise that is beyond the candidate's control.

The program office cannot conclude that reductions made to EPOS time were done using the systems approach to training without further review of documentation detailing how these changes were made.

Recommendations
The program office recommends that NRC region-based operator licensing staff review facility licensees' documentation of training program changes to ensure that any reductions made to EPOS time were done in accordance with the facility licensee's systems approach to training process and that the changes are adequate. This should be performed during initial licensing examination audits. Results or findings should be shared with the operator licensing program office.

Conclusion
Based on the observations shared above during this two-year period, the staff concludes that the change made to SRO eligibility requirements for use of related sciences degrees is being implemented as intended. The NRC regions should consider additional inspection and review of reductions to EPOS time, when reductions are observed, to ensure that changes are made using the systems approach to training. This can occur irrespective of the effectiveness review; therefore, Task 7 is complete.

Restructured NUREG
Task 8

o Assess the ease of use of NUREG-1021 Revision 12 as a result of the changes made for streamlining, and the added value of these changes.

o Evaluation period: CY 2022, CY 2023, and the first 6 months of CY 2024

Data Collected
To assess the ease of use of the restructured format of NUREG-1021 Revision 12, the program office conducted both quantitative and qualitative analyses. First, the staff compared the number of hours charged for Revision 12 examinations with the number of hours charged for Revision 11 examinations. Specifically, the staff collected the total number of Cost Activity Code (CAC) hours charged for exam development, administration, and documentation from April 2022 through June 2024 in each region and compared that to the total hours charged to the same set of CACs in years 2017 - 2021. The staff performed a separate analysis comparing the CAC hour totals in 2017 - 2018 with the hours in 2019 - 2020 to rule out any differences in hours charged due to the COVID-19 pandemic, which would have impacted the Revision 11 data. The resulting data was analyzed for differences to determine the potential impact on the usability of Revision 12 based on the restructured format.

Second, the staff collected qualitative information about the restructuring of NUREG-1021 through a survey (July 2023) and regular solicitation of feedback throughout the first year that Revision 12 examinations were being developed, reviewed, administered, and graded.

Results and Findings
From the review of CAC hours for exam development, the data shows that the number of hours for exam development can vary greatly from exam to exam. This is expected due to the multiple variables of exam development (e.g., facility- or NRC-developed exam, exam author proficiency, training and qualifications). A comparison of hours for exam development in 2017 - 2021 (Revision 11 exams) and April 2022 - June 2024 (Revision 12 exams) indicates an increase in the average number of hours for Revision 12 exams. To understand if use of NUREG-1021 Revision 12 resulted in a significant increase in the number of hours required for exam development, a Pareto analysis was performed to compare the spread of exam hours reported during the exam timeframe. The intent of the Pareto analysis is to identify the frequency of the exam development hours reported, based on the number of exams. The results identified that the overall number of hours of exam development-related activities was concentrated in the lower bands of the hours charged.

The results showed that 79% of Revision 12 exams were developed charging less than 283 hours, compared to 75% of Revision 11 exams developed charging less than 250 hours.

From this, it is reasonable to conclude that, outside of a small percentage of outlying exams with higher exam development hours, the majority of exam development charges did not change as a result of Revision 12. A comparison of CAC hours for exam administration under Revision 11 and Revision 12 indicates the overall number of hours of exam administration-related activities reduced by an average of 9% per exam across the regions for Revision 12 exams when compared to Revision 11 exams. Furthermore, the standard deviation of hours for Revision 12 exam administration lowered by approximately 50%, possibly indicating that use of Revision 12 resulted in more consistent hours for exam administration between exams.
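The threshold and spread comparisons described above reduce to a cumulative-share computation and a standard-deviation comparison. The following Python sketch illustrates the method only; the per-exam hour values are hypothetical, since the actual CAC data are not reproduced in this report.

```python
import statistics

# Hypothetical per-exam CAC hour totals (illustrative only; not actual data).
rev11_dev_hours = [180, 210, 225, 240, 260, 300, 410]
rev12_dev_hours = [170, 200, 215, 235, 255, 275, 390]

def share_below(hours, threshold):
    """Fraction of exams whose charged hours fall below the threshold."""
    return sum(1 for h in hours if h < threshold) / len(hours)

# Pareto-style check: what share of exams sits in the lower hour bands?
print(f"Rev 11 exams under 250 h: {share_below(rev11_dev_hours, 250):.0%}")
print(f"Rev 12 exams under 283 h: {share_below(rev12_dev_hours, 283):.0%}")

# Spread comparison: a lower sample standard deviation would suggest
# more consistent effort from exam to exam.
print(f"Rev 11 std dev: {statistics.stdev(rev11_dev_hours):.1f} h")
print(f"Rev 12 std dev: {statistics.stdev(rev12_dev_hours):.1f} h")
```

With the real CAC data substituted for the hypothetical lists, the same two functions reproduce the 79%/75% threshold comparison and the standard-deviation reduction cited above.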

A comparison of CAC hours for exam documentation under Revision 11 and Revision 12 indicates that the overall number of hours for exam documentation reduced by an average of 11% per exam across the regions for Revision 12 exams when compared to Revision 11 exams. A Pareto analysis was not necessary for the exam administration and documentation CACs based on the average reduction in hours identified.

For qualitative analysis, feedback was collected from NRC examiners via survey questions about using NUREG-1021 Revision 12 to gauge its effectiveness and efficiency in the administration of initial examinations. The survey results showed that, on average, the regional examiners considered administering and documenting an exam under Revision 12 to require approximately the same level of effort as under Revision 11. Initially, exam development activities related to identifying and defining critical tasks were considered somewhat more burdensome, including using the grading process (PD, SPD, CPD).

However, this perception diminished as examiner proficiency and familiarity increased through use of the new process. Overall, feedback on the ease of use and effectiveness of the NUREG-1021 Revision 12 changes indicated that the changes resulted in approximately the same level of effort in exam-related activities.

Recommendations
The program office does not have any recommendations in this task area.

Conclusion
An analysis of the data collected for Task 8 showed an overall reduction in CAC hours related to exam administration and documentation and no significant change to the exam development hours during the NUREG-1021 Revision 12 implementation timeline. Additionally, examiner feedback indicates no adverse results in the implementation of Revision 12 regarding ease of use for exam development.

Based on the above, it is reasonable to conclude that NUREG-1021 Revision 12, with respect to ease of use for operator licensing implementation, did not adversely impact the exam development process. The program office concludes that the restructuring of NUREG-1021 was an effective use of resources, yielding an easier-to-use guidance document. Task 8 is considered met and further analysis is not required.

Overall Impact of NUREG-1021 Revision 12
Tasks 6 and 8

o Determine the overall impact of the Revision 12 changes on the pass/fail rate for NRC initial licensing examinations.

o Conduct other assessments of value

o Evaluation period: April 2022 - June 2024

Data Collected
For this task area, the program office collected data from 82 NUREG-1021 Revision 12 examinations administered from April 1, 2022, to June 30, 2024. Specifically, the staff tracked written examination scores and the pass/fail results for the operating test. This information was compared to historical data from the 2020 - 2021 exams implemented under NUREG-1021 Revision 11 to analyze for adverse trends specific to the implementation of Revision 12.

Additionally, the staff tracked the number of ROIs, appeals, and hearings related to NUREG-1021 Revision 12 and compared this to the total received during implementation of NUREG-1021 Revision 11. The staff also collected feedback from the region-based examiners via a survey (July 2023) on the overall effectiveness of the major changes in NUREG-1021 Revision 12 and on what areas, if any, should be considered for improvement in the next revision.

Results and Findings
The overall pass rate of the NRC exams for the April 2022 to June 2024 period was 97.1%, with an overall failure rate of 2.9%. This failure rate is slightly lower than the historical average failure rate of 3.5%.

The written examination pass rate is 97.5%, which closely matches the historical average pass rate of 97.0%. The operating test pass rate is 99.7%, which closely matches the historical average pass rate of 99.6%. See Table 4 for a summary of these metrics.

Table 4: Overall Pass and Failure Rates for Revision 11 and Revision 12 licensing examinations

Pass and Failure Rates      Rev 11 Exams    Rev 12 Exams
Pass Rate                   96.5%           97.1%
Failure Rate                3.5%            2.9%
Written Exam Pass Rate      97.0%           97.5%
Operating Test Pass Rate    99.6%           99.7%

A more detailed review of written examination scores during the evaluation period shows the average RO, SRO, and overall (combined RO and SRO) grades were 89.1%, 87.8%, and 89.7%, respectively. The historical averages for the RO, SRO, and overall sections of the written exam are 90.3%, 85.8%, and 90.1%, respectively. Upon review, these results are similar to the previous performance results of NUREG-1021; therefore, the results are well within an acceptable level of deviation for the given data collected. See Table 5 for reference.

Table 5: Overall Written Examination Averages, Revision 11 and Revision 12 written examinations

Overall Written Examination Averages    Rev 11 Exams    Rev 12 Exams
RO Portion                              90.3%           89.1%
SRO Portion                             85.8%           87.8%
Overall Exam                            90.1%           89.7%
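The comparison behind Table 5 amounts to small per-portion score deltas between revisions. A brief Python sketch of that arithmetic, using the averages published in the table (the report leaves the "acceptable level of deviation" judgment qualitative, so no numeric threshold is asserted here):

```python
# Written-exam averages from Table 5 (percent).
rev11 = {"RO": 90.3, "SRO": 85.8, "Overall": 90.1}
rev12 = {"RO": 89.1, "SRO": 87.8, "Overall": 89.7}

# Delta between revisions for each portion of the written exam;
# every shift is about two points or less, consistent with the
# report's conclusion that results are similar across revisions.
deltas = {k: round(rev12[k] - rev11[k], 1) for k in rev11}
print(deltas)
```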

A total of 72 ROIs were received from March 2017 to March 2022, an average of approximately 1.36 ROIs per month before Revision 12 was effective. From March 2022 to June 2024, 33 ROIs were received, approximately 1.1 ROIs per month after Revision 12 was effective.

While the number of ROIs has varied per year, the overall average does not indicate an abnormally high number of ROIs as a result of using NUREG-1021 Revision 12.

Under NUREG-1021 Revision 11, from 2017 to March 2022, there was an average of 6 administrative reviews per year. Under Revision 12, from March 2022 to June 2024, there was an average of 5.2 administrative reviews per year. There has only been one hearing in the last ten years; this occurred in 2013, under Revision 9. The program office finds that the number of reviews per year is stable and there is not an increase in reviews associated with the implementation of NUREG-1021 Revision 12.

With respect to the feedback collected from regional examiners who took the NUREG-1021 Revision 12 survey, there were a total of 28 responses. Of those who responded, there were numerous comments regarding the operator examination process, including requests for revisiting the performance deficiency grading methodology, the use of critical tasks, and the competency areas and rating factors. Additional comments were received on the Tier 4 generic fundamentals question requirements changes made in Revision 12, which are being reviewed as part of the effectiveness review. Additional feedback was received on post-scenario documentation regarding the grading of potential critical task failures or significant performance deficiencies.

There were two positive comments pertaining to Revision 12, including feedback stating that the overall organization is much better than in previous revisions. The staff also received positive feedback on the improved usability of Revision 12 from members of industry during the February 2024 National Operator Licensing Workshop.

The data supports that the changes made in NUREG-1021 Revision 12 did not result in a significant deviation in average exam scores or pass rates. The program office should continue to collect data in accordance with the effectiveness review plan.

The program office will continue to track pass and failure rates for NUREG-1021 Revision 12 operator licensing examinations in conjunction with Task 2 for generic fundamentals knowledge through Sept 2027.

Summary

The operator licensing program office initiated an effectiveness review to monitor the application of new instructions and guidance in NUREG-1021, Revision 12, and determine if additional actions are needed, such as additional training or clarifications to support implementation of the major changes made in the revision. In summary, the program office finds that the changes introduced in NUREG-1021 Revision 12 for:

1. Sampling GF topics on the site-specific written examination and the discontinuation of the standalone NRC GFE;

2. The revised critical task methodology;

3. The identification and documentation of critical and significant performance deficiencies on the simulator operating test;

4. Licensee implementation and use of Revision 2 of ACAD-10-001, as the standard set of eligibility requirements for accredited licensed operator training programs; and

5. The re-structuring of the NUREG into topic-based sections

are being implemented adequately and without any impact to passing rates on the NRC initial licensing examination. The staff identified areas to improve application of the instructions and guidance in NUREG-1021 and subsequently enhance the examination standards in the future.

A summary of the staff's recommendations in this report is included below:

1. Enhance the guidance for developing written examination questions in NUREG-1021 to include additional guidance for developing Tier 2 and Tier 4 GF questions from the NRC GFE bank to make sure that they are free of psychometric flaws, technically accurate, and have an adequate level of difficulty.

2. Share the two-year report results and findings from Task 1 above on GF question integration with NRC examiners and industry examination writers to improve the quality of GF questions in the interim period before NUREG-1021 is updated with this enhancement.

3. Provide training to NRC examiners on findings and observations for application of the CT methodology identified in this section of the report. Communicate best practices for documenting ABCs on simulator operating test forms with NRC examiners.

4. Communicate the findings and results from the task for grading CPDs and SPDs with the NRC examiners at the next examiner training conference.

5. On the next NUREG-1021 revision, add specific guidance and instructions in the appropriate examination standards for the following:

a. The need to identify all CTs in operating test scenarios. For example, when stepping through the EOP(s) during scenario development, use a questioning attitude for the major steps of the EOP and compare them to the critical task methodology in ES-3.3 so that tasks that are critical are properly pre-identified in the simulator guide. This can also be verified during onsite validation week.

b. Instructions that it must be possible for the applicants to fail a CT; that is, the boundary conditions must be achievable. Clarify that it must be possible both to accomplish the CT and to fail the CT.

c. Instructions to document the basis for time-based boundary conditions.

d. Incorporate clarifications found in OLPFs 3.3.1 through 3.3.5.

e. Enhance the guidance for grading critical performance deficiencies and significant performance deficiencies with the examples provided in the ROI on grading SPDs and CPDs.

6. NRC region-based examiners should review additional documentation during initial licensing examination audits to ensure that any reductions made to EPOS time were done in accordance with the facility licensee's systems approach to training process and that the changes are adequate. Results or findings should be shared with the operator licensing program office.

Finally, the program office will continue to collect and analyze data for the tasks related to operator knowledge of generic fundamentals and use of the CT methodology for AP1000 simulator operating tests in accordance with the effectiveness review plan and issue annual reports.