ML22272A047

10/18-19/2022 ACRS Public Meeting - Predecisional - Documents to Support Part 53 - DRO-ISG-2023-01 Operator Licensing Programs
Issue date: 09/29/2022
From: Office of Nuclear Material Safety and Safeguards
To: Beall, Robert
Shared Package: ML22272A034
References: 10 CFR Part 53, DRO-ISG-2023-01, NRC-2019-0062, RIN 3150-AK31



This interim staff guidance is the latest guidance that the NRC staff has publicly released to support interactions with the Advisory Committee on Reactor Safeguards (ACRS). This version is based on reviews by NRC staff and consideration of stakeholder input. The NRC staff expects to adopt further changes in the guidance.

This guidance has not been subject to complete NRC management or legal review, and its contents should not be interpreted as official agency positions. The NRC staff plans to continue working on the guidance provided in this document.

DRO-ISG-2023-01
Operator Licensing Programs
Draft Interim Staff Guidance
September 2022

10 CFR Part 53
ADAMS Accession No.: MLxxxxxxxxx
TAC: xxxxxx

[Concurrence table placeholder - OFFICE / NAME / DATE rows for: QTE; [IRAB PM]; [NRR Technical Lead/Author]; [NRR Technical Lead Branch Chief]; [Other NRR Division Directors, as appropriate]; [Other NRC Division Directors, as appropriate]; [Regional Offices, as appropriate]; OGC; [PGCB LA]; [NRR Technical Lead Division Director]]

OFFICIAL RECORD COPY

DRAFT INTERIM STAFF GUIDANCE
OPERATOR LICENSING PROGRAMS
DRO-ISG-2023-01

PURPOSE

The U.S. Nuclear Regulatory Commission (NRC) staff is providing this interim staff guidance (ISG) to facilitate staff reviews of applications for an operating license (OL) or combined license (COL) under Title 10 of the Code of Federal Regulations (10 CFR) Part 53, Risk-informed, Technology-Inclusive Regulatory Framework for Commercial Nuclear Plants. The guidance in this ISG supports the staff review of the portion of such applications related to the operator licensing examination program. This ISG provides guidance for the reviews of examination programs that are tailored to the specific role of operating personnel at commercial nuclear power plants other than research and test reactors (RTRs). Specifically, it addresses the review and approval of both initial and requalification examination programs for senior reactor operators (SROs), reactor operators (ROs), and generally licensed reactor operators (GLROs), as provided for by the proposed requirements in subpart F and subpart P of Part 53. This ISG also addresses proficiency programs for SROs and ROs under proposed Part 53. Because operator licensing programs may need to begin in advance of facility licensing, this guidance is written to address both applicants for and holders of facility OLs and COLs under proposed Part 53; these are referred to as applicants and/or licensees throughout this document for the sake of brevity.

This guidance may also facilitate the NRC staff review of non-large light water power reactor applications for an OL under 10 CFR Part 50, Domestic Licensing of Production and Utilization Facilities, or a COL under 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants, as appropriate to the design challenges posed by new or novel facility designs.

Establishing such an operator licensing examination program may require the applicant to request exemptions from certain requirements of 10 CFR Part 50, Part 52, or Part 55, Operators' Licenses, that the staff would then need to review against the applicable exemption criteria of 10 CFR 55.11. The NRC staff would use the guidance in this draft ISG to inform its consideration of requests for exemptions from Part 50, 52, and 55 requirements by Part 50 and Part 52 applicants and licensees, or of proposals by applicants and licensees under Part 50 or 52 to use methods alternative to those described in NUREG-1021 or NUREG-1478.

BACKGROUND

The 10 CFR Part 53 regulation is under development, and as such, the guidance in this ISG is subject to change based on the outcome of that rulemaking. Key documents related to the Part 53 rulemaking, including proposed rule language and stakeholder comments, can be found at Regulations.gov under Docket ID NRC-2019-0062.

RATIONALE

This ISG was developed to meet near-term guidance needs to support the NRC staff review of commercial nuclear plant applications with respect to risk-informed, technology-inclusive operator licensing examination programs. NUREG-1021, Operator Licensing Examination Standards for Power Reactors, implements operator licensing standards and regulations as specified in 10 CFR 55.40 for large light water reactor (LLWR) designs. NUREG-1021 was designed for developing operator licensing initial examinations in conjunction with the applicable NUREG-series knowledge and ability (K/A) catalogs. Similarly, NUREG-1478, Operator Licensing Examiner Standards for Research and Test Reactors, implements operator licensing standards and regulations for research and test reactors. The examination methods and procedures described in these documents are based largely on the jobs and tasks that are performed by personnel at operating LLWRs and RTRs and the concepts of operations and staffing models for these facilities.

Commercial nuclear plants that would be licensed under proposed 10 CFR Part 53 will likely use different technologies and employ different operating and staffing models as compared to LLWRs. Therefore, the jobs and tasks performed by operating personnel at these plants are expected to differ from those performed by operating personnel at LLWRs and RTRs. As such, the knowledge, skills, and abilities that operating personnel at nuclear plants that would be licensed under proposed 10 CFR Part 53 will need to perform their duties, and the examination methods needed to evaluate those knowledge, skills, and abilities, are also expected to differ from those contained in NUREG-1021 and NUREG-1478. For example, examination programs developed for nuclear power plants that would be licensed under proposed 10 CFR Part 53 are expected to be narrower in scope and different in format compared to those in NUREG-1021 or NUREG-1478, given the expected reduced reliance on operating personnel to maintain safe plant operations at these facilities.

As a result, the staff determined that the existing examination guidelines could not be applied at nuclear plants that would be licensed under proposed 10 CFR Part 53 without exemptions and substantial deviations from the guidance in NUREG-1021 and/or NUREG-1478. To ensure regulatory clarity and efficiency, the staff determined that new guidance needed to be developed and that it needed to account for the wide range of technologies, operating concepts, and staffing models expected to be employed at non-LLWR plants that would be licensed under proposed 10 CFR Part 53 or under Part 50 or 52. The staff's goals in the development of the alternative were (1) to enable applicants and licensees to leverage the results of human factors engineering design activities, such as job and task analysis, to identify the knowledge, skills, and abilities operating personnel need to have at their facility(ies) and to use those results as the basis for developing examination standards for licensing competent operating personnel, and (2) to establish reliable guidelines for examination program development at these plants that are based on the best available knowledge from research and expertise on the measurement of knowledge and abilities.

As discussed in this ISG, applicants or licensees will be able to propose an examination program for licensing personnel at their facility that is based on, or tailored to, the role of operating personnel at their facility, in lieu of using those examination methods that were developed for LLWRs. This ISG provides guidance for applicants and licensees who develop and submit examination programs for their facility for NRC review, and for the NRC staff to review the adequacy of these programs for verifying the competency of operating personnel licensed under such programs.

APPLICABILITY

This ISG would be applicable to all operator licensing examination programs and changes to those programs submitted as part of applications for OLs and COLs under 10 CFR Part 53, and to support the staff's review of exemptions to Part 55 for operator licensing examination programs for applications for OLs and COLs under Part 50 or 52.

GUIDANCE

1.0 Knowledge, Skills, and Abilities List Development

The NRC staff should ensure that applicants for or holders of an OL or COL under 10 CFR Part 53 develop a design-specific and/or site-specific knowledge, skills, and abilities (KSA) list using a systems approach to training (SAT) process as described in 10 CFR 53.725(b) and 10 CFR 55.4. The staff should ensure that the applicant or licensee includes all aspects of job performance, including but not limited to tasks important to safe plant operation and the fundamentals of plant operations, as part of the licensed operator training program job and task analysis required for an SAT-based training program. NRC guidance on KSA list development is provided in DRO-ISG-2023-04, Facility Training Programs, and NUREG-0711.

The NRC staff should ensure that the applicant or licensee screens the KSA list to identify those tasks and associated KSAs important to safe plant operation (as defined in section 1.3 of DRO-ISG-2023-04) and tasks related to the foundational theory of plant operations.

A criticality analysis (discussed in the section titled Criticality Analysis, below) may have been performed to determine which KSAs require testing as part of the licensing examinations. Other types of analyses may also be used. Alternatively, deterministic criteria may be used to determine which KSAs require testing on licensing examinations. If the applicant or licensee uses cut-off values (e.g., subject matter expert (SME) ratings) for the inclusion or exclusion of KSAs, the basis for that methodology should be documented, along with a list of excluded KSAs. See Figure 1.0 for a depiction of the expected steps in KSA list development for examination programs, including those performed as part of the separate training program.

Criticality Analysis

A criticality analysis is one process acceptable to the NRC staff for determining the essential tasks related to doing a job that require assessment for operator licensing. Essential tasks have higher priority for assessment in operator licensing. This analysis is a useful input to the examination development step of content specification determination for each examination type (discussed later in this document), which determines which KSAs will be included in a specific examination type. As part of the criticality analysis, SMEs consider factors such as how frequently each task is done, the difficulty of each task, the importance of the task, and the risk involved. An acceptable approach is to obtain SME ratings on the tasks using Likert rating scales. A Likert rating scale is a numeric scale (typically five or seven points) that allows the individual to gauge to what extent they agree or disagree with a statement. In the case of criticality analysis, raters may judge importance (e.g., low versus high importance), frequency of use (e.g., low to high frequency), and so forth. Staff should ensure that, once the data is collected, the applicant or licensee has compiled the data in a criticality matrix that displays the list of tasks and their respective ratings for each factor.
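The compilation of SME Likert ratings into a criticality matrix, and the use of a documented cut-off value to screen KSAs (as discussed above), can be illustrated with a minimal sketch. The task names, rating factors, five-point scale, and cut-off value below are hypothetical examples only and are not values endorsed by this guidance.

```python
from statistics import mean

# Hypothetical SME ratings on a 5-point Likert scale (1 = low, 5 = high)
# for three factors: importance, frequency of use, and difficulty.
# Task names, raters, and the cut-off value are illustrative only.
sme_ratings = {
    "Start standby cooling pump": {
        "importance": [5, 4, 5], "frequency": [2, 3, 2], "difficulty": [3, 3, 4]},
    "Complete shift turnover log": {
        "importance": [2, 3, 2], "frequency": [5, 5, 5], "difficulty": [1, 2, 1]},
    "Respond to loss of offsite power": {
        "importance": [5, 5, 5], "frequency": [1, 1, 2], "difficulty": [4, 5, 4]},
}

CUT_OFF = 3.0  # example cut-off on mean importance; the basis must be documented

def criticality_matrix(ratings):
    """Average each factor across SMEs to build the criticality matrix."""
    return {task: {factor: round(mean(vals), 2) for factor, vals in factors.items()}
            for task, factors in ratings.items()}

matrix = criticality_matrix(sme_ratings)

# Screen tasks/KSAs: those at or above the cut-off are candidates for testing;
# excluded items are kept on a documented exclusion list with their ratings.
included = {t: f for t, f in matrix.items() if f["importance"] >= CUT_OFF}
excluded = {t: f for t, f in matrix.items() if f["importance"] < CUT_OFF}

print("Criticality matrix:", matrix)
print("Included for examination:", sorted(included))
print("Documented exclusions:", sorted(excluded))
```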

Acceptance Criterion:

The NRC staff should ensure that, if a criticality analysis is performed, the applicant or licensee compiles the data in a criticality matrix displaying the list of tasks and respective ratings for each factor [Criterion 1.1].

Expected KSA List Coverage

The NRC staff should ensure that the KSA list that results from the task analysis performed per DRO-ISG-2023-04, at a minimum, reflects the knowledge, plant design, operator actions, administrative tasks, facility procedures, and emergency plan responsibilities that are relevant to the licensed operator function at the applicant's or licensee's facility. Items 1-3 provide further detail concerning these various areas:

1. Knowledge of both the plant design and operator actions needed to achieve plant safety should be reflected in the KSA list. The following examples in (a) through (f) below are for illustrative purposes and are not meant to represent a complete or applicable list for each reactor design or operator KSA requirement; instead, these examples provide detail on some of the KSAs that would be expected to result from the task analysis:

(a) safety-related and non-safety related, safety significant (if applicable) structures, systems, and components (SSCs) for the nuclear plant that the operator interacts with (e.g., controls, monitors, etc.), to include those SSCs required for radioactive material handling, radiation monitoring, and post-accident monitoring

(b) characteristics associated with the SSCs (e.g., connections and interdependencies, support systems, control and monitoring, behavior during normal operations and transient conditions, monitoring methods, technical specifications, etc.)

(c) site-specific procedures and policies (e.g., shift turnover, normal operations, administrative tasks, radiological control, plant startup, etc.) that pertain to operators

(d) abnormal and emergency plant events, the associated emergency and abnormal operating procedures for responding to the events, and implementation of the site emergency plan

(e) KSAs that address the SSCs necessary to meet the defense-in-depth (DID) requirements of the plant as they relate to licensing basis events, increases in the cumulative plant risk, and beyond design-basis events (This includes non-safety related, safety significant SSCs that provide DID to the safety-related SSCs of the plant, and their associated characteristics, as applicable. If necessary, new KSAs should be developed to address plant safety DID, like (a) and (b) above.)

(f) inherent, passive, or automatic safety-significant SSCs, their characteristics, and the associated operator actions, procedures, and SSCs that provide DID to the inherent, passive, or automatic safety-significant SSCs

2. Fundamental, theoretical knowledge associated with the specific design and operations of the reactor should be reflected in the KSA list. Operators should have a fundamental understanding of the technologies, materials, and processes of the reactor design, including an understanding of applicable reactor theory, thermodynamics, and chemical theory topics. Operator understanding of the physical and chemical phenomena is important to the understanding of system behavior and the diagnosis and analysis of plant events.
3. Important administrative responsibilities that are associated with the SRO and GLRO roles should be reflected in the KSA list:
  • conditions and limitations in the facility license
  • facility operating limitations in the technical specifications and their bases
  • facility licensee procedures required to obtain authority for design and operating changes in the facility
  • radiation hazards that may arise during normal and abnormal situations, including maintenance activities and various contamination conditions
  • assessment of facility conditions and selection of appropriate procedures during normal, abnormal, and emergency situations
  • procedures and limitations involved in determination of various internal and external effects on reactivity
  • fuel handling facilities and procedures

Acceptance Criteria:

The NRC staff should ensure that the coverage of the KSA list adequately addresses the following, as applicable:

  • safety-significant SSCs that the operator interacts with and the associated characteristics of those SSCs [Criterion 1.2]
  • safety-significant SSCs that provide plant safety DID and the associated characteristics of those SSCs [Criterion 1.3]
  • inherent, passive, or automatic safety-significant SSCs, their characteristics, and the associated operator actions, procedures, and SSCs that provide DID to the inherent, passive, or automatic safety-significant SSCs [Criterion 1.4]
  • site-specific normal operating procedures that pertain to operators [Criterion 1.5]
  • abnormal and emergency plant events, associated procedures, and implementation of the site emergency plan [Criterion 1.6]
  • theoretical knowledge items of reactor theory, thermodynamics, and chemical theory, as applicable, associated with the technologies, materials, and processes of the reactor design [Criterion 1.7]
  • knowledge of administrative topics, including notifications, radiological controls, maintenance controls, technical specifications, equipment control, conduct of operations, and the emergency plan [Criterion 1.8]

Necessary Documentation

The NRC staff should ensure that the examination program KSA list development process is documented. The documentation should detail the steps of the process used, other processes or documents that provide input, terms or acronyms, and the resulting list of KSAs applicable to the site operator licensing examinations. The documentation should also clearly describe the methods for determining testable and non-testable KSAs. Those KSAs screened out as non-testable and the associated bases should be captured in the documentation.

The initial KSA list will be reviewed and retained by the NRC staff and will be maintained for the operational lifetime of the facility by the licensee. Staff should ensure that the documentation related to the KSA list provides information on the process used to maintain the KSA list, to include responsibilities, frequencies, input, and output paths. Staff should ensure that the applicant or licensee has a process in place to update the KSA list as needed due to site changes (e.g., equipment, procedures, training, etc.).

The KSAs should be grouped in a logical fashion to form the KSA list to facilitate the formation of examinations through the selection of KSA types (i.e., to more easily sample KSAs that represent the breadth and scope of the KSA list, and/or to meet the examination developer's goals for question significance and rigor).

Acceptance Criterion:

The NRC staff should ensure that the KSA list documentation details the development process, the determination of KSA testability (e.g., the basis for screening a KSA as non-testable, such as lack of criticality), and KSA list maintenance, and that the KSAs are organized in a logical format to facilitate use and review [Criterion 1.9].

Figure 1.0 - Overview of KSA Development Process [Figure not reproduced. It depicts task development and task analysis performed as part of the SAT-based training program process (see the Facility Training Programs ISG), followed by the steps performed as part of the examination program development process described in this ISG: DID SSC review, DID operator action review, addition of theoretical knowledge, review for SRO/RO/GLRO KSAs, screening for testable KSAs, grouping of KSAs, and finalization of the KSA list.]

2.0 Operator Licensing Test Development

The NRC staff should ensure that a comprehensive measurement strategy that covers all the aspects of performance that the job task analysis determined to be essential is used to determine operator competency. The first step is to develop an overall operator licensing strategy or Test Plan. The Test Plan will typically include multiple assessment measures1 (as is necessary to cover all the tasks and KSAs for a job). Once the Test Plan is established, the individual tests or examinations can be developed.

Test Plan Rationale

The Test Plan rationale is a document that describes the comprehensive strategy for determining operator qualification for the job. This includes a description of how the tasks and the respective KSAs, delineated in the job task analysis, will be measured. NRC staff should ensure that documentation includes a mapping of tasks/KSAs to measurement tools (e.g., knowledge examination, simulation examination, etc.). 10 CFR Part 53 applicants or licensees should determine the appropriate assessment measures for their facility and are not required to use only the methods outlined in NUREG-1021.
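A mapping of tasks and KSAs to measurement tools of the kind described above could be documented in many ways; the sketch below is one minimal, hypothetical example of such a matrix. The task names, KSAs, and measure labels are illustrative placeholders, not an approved structure.

```python
# Hypothetical mapping of tasks (and their KSAs) to assessment measures.
# Tasks, KSAs, and measure names are illustrative placeholders only.
test_plan = {
    "Start standby cooling pump": {
        "ksas": ["pump interlocks", "flow indications"],
        "measures": ["written examination", "simulation examination"],
    },
    "Classify emergency event": {
        "ksas": ["emergency plan entry conditions"],
        "measures": ["written examination", "walk-through"],
    },
    "Perform reactivity manipulation": {
        "ksas": ["reactor theory", "control rod sequencing"],
        "measures": ["simulation examination"],
    },
}

def coverage_report(plan):
    """List each task with its assessment measures and flag any task left
    unmeasured, supporting documentation of coverage (and of uncovered KSAs)."""
    for task, entry in plan.items():
        status = ", ".join(entry["measures"]) if entry["measures"] else "NOT COVERED"
        print(f"{task}: {status}")

coverage_report(test_plan)
```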

Acceptance Criteria:

The NRC staff should ensure that:

  • The applicant or licensee provides documentation of assessment measure(s) to be used for each task and how the measures are aligned conceptually with the KSAs being measured. For example, an applicant or licensee may provide a matrix that delineates each task (and/or the KSAs making up those tasks) and pairs the task with the respective measurement strategy. Sufficient detail needs to be provided to allow the NRC staff to determine adequacy of coverage [Criterion 2.1].
  • The applicant or licensee provides an overall measurement strategy that may include a variety of knowledge and performance examinations (i.e., written, computer-based, simulator-based, walk-through, and other types of examination) [Criterion 2.2].

  • An instructional designer, industrial/organizational psychologist, or a human factors specialist has determined the relevance of the measurement approach for the particular KSAs [Criterion 2.3].
  • The applicant or licensee provides documentation on KSAs not covered by any assessment measures [Criterion 2.4].

Measurement Strategy/Tools - General Information

The NRC staff should ensure that the Test Plan documents that the measurement strategy and tools align conceptually with the inherent characteristics of the KSA being measured. While KSAs differ depending on the tasks, general categories include knowledge and understanding, cognitive skills (analysis, synthesis, evaluation, problem solving), and practical skills (observable behaviors). At the same time, many examination strategies and methods exist. The most common are written, computer-based, or performance-based (e.g., walk-through or simulation). It is essential for the chosen methods to have the correct conceptual underpinnings necessary to accurately capture the respective KSAs of interest.

1 Assessment measures are formal, consistent, and dynamic measuring tools used to assess the ability of individuals to perform in the field.

Knowledge - Knowledge is learned information. Knowledge is often categorized into four types.

Factual knowledge consists of terminology and discrete facts. Conceptual knowledge consists of categories, theories, principles, and models. Procedural knowledge includes knowledge of a technique, process, procedure, or method. Metacognitive knowledge includes knowledge of oneself and knowledge of various learning skills and techniques.

  • Testing methods: Knowledge is typically assessed with oral, written, or computer-based tests.

Skill - A skill is an acquired or learned proficiency to perform a certain task or role. Skills can be psychomotor, behavioral, cognitive, or some combination thereof. Examples of skills include physically controlling a vehicle (e.g., driving a car, riding a bicycle, landing an aircraft), athletics (wrestling, swimming, tennis, etc.), oral and written communication, solving problems, programming computers, various construction activities, and so on.

  • Testing methods: While some skills may be assessed with oral, written, and computer-based tests, other skills can only be assessed using a behavioral measure in which some aspect of the examinees behavior is observed and recorded (e.g., via a simulation or a natural setting).

Ability - Capacity to apply knowledge and skills simultaneously in order to complete a task or perform an observable behavior.

  • Testing methods: While some abilities may be assessed with oral, written, and computer-based tests, other abilities can only be assessed using a behavioral measure in which some aspect of the examinees behavior is observed and recorded (e.g., via a simulation or a natural setting).

Format Specifications - Format specifications provide the format of items (i.e., tasks or questions) and responses (responding to multiple choice questions versus performing/demonstrating skills for a complex task) and the type of scoring procedures. Format specifications describe the rationale for the chosen format that includes considerations of validity of the format (i.e., the conceptual underpinnings).

Complex item formats include performance assessments and simulations. Specifications for complex items describe the domain from which the items (e.g., tasks and questions) are sampled, the component of the domain to be assessed, and critical features of the items for replication in alternate forms.

Test takers demonstrate their abilities or skills to perform tasks in settings that closely resemble situations encountered on the job (e.g., assemble a part, write a report, start a pump, etc.) via performance assessments. Since a performance test includes only a sample of tasks, NRC staff should ensure that the specifications clearly define the critical dimensions or competencies to be measured (e.g., skills and knowledge, cognitive processes,2 context3 for performing the tasks, etc.) and how the tasks align with those competencies. When tasks are designed to elicit complex cognitive processes, specifications need to include detailed analyses of the tasks and scoring criteria to provide necessary validity evidence.

Simulations are like performance assessments and may substitute when actual task performance is not feasible or desirable (e.g., because it would be too costly or dangerous).

Specifications for simulations describe the domain of activities to be covered by the tasks, critical dimensions of performance reflected in each task, and format considerations, such as the number or duration of the tasks.

Multimethod Examinations - The central assumption of using multiple methods is that each method has its strengths and weaknesses and that a combination of diverse examinations is necessary to benefit from the relative strengths of each individual examination. Research shows that using combinations of examinations can lead to high degrees of accuracy of predicting job performance.

Generally, to optimize examinations for determining if an individual is qualified, the information needed to make that determination should first be identified and then the method that is most appropriate for obtaining that information should be selected. A particular method should be used for what it is best suited for. For example, a written examination is best suited for assessing fundamental knowledge, comprehension, and application of knowledge and a performance-based examination using the actual work environment or a simulation is best suited for assessing an individuals skill at operating a piece of equipment. It is expected that there will be differences in performance by an examinee on each method if the methods actually measure different aspects of each individuals qualifications.

Examination Content Specification

Along with the over-arching Test Plan providing the comprehensive measurement approach, the NRC staff should ensure that a detailed content specification or content framework is provided for each respective type of test/examination method (e.g., written, computer-based, simulation, etc.) to be used. The content specifications should delineate the specific content, skills, and diagnostic features of the domain or construct that the examination type covers (e.g., knowledge of valves, knowledge of emergency response, etc.). The content specifications should be guided by the job task analysis. The content specification process may use varying methods to determine KSA inclusion on an examination (e.g., ranking KSAs, applying values to KSAs, grouping KSAs according to subject matter or task criticality, safety significance, etc.).

The Test Plan might measure the most important work behaviors or KSAs, a few that are prerequisite to others, or a smaller set of KSAs used to predict critical outcomes (e.g., accidents). A sampling method or plan should be included that discusses how to select specific KSAs for use on a specific examination for administration, also referred to in this document as an examination instance.

2 Cognitive processes are those associated with human cognition. Cognition can be defined as those mental processes involved in attending, making sense, reasoning, remembering, and making decisions.

3 Context, for the purposes of this guidance, can be defined as being the circumstances that form the setting for an event and in terms of which it can be fully understood and assessed.

For each test/examination instance, the documentation should include the specific KSAs included on the test as well as the number of KSAs, the number of questions per KSA, and the results of the sampling method used to select those KSAs. If more than one question or performance assessment item per KSA is allowed, the documentation should include the basis for this allowance.
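One way a systematic, quasi-random sampling plan for a single examination instance could work is to draw a fixed number of KSAs from each content group and record the selection for the documentation described above. The sketch below is a hypothetical illustration; the group names, KSA identifiers, per-group quotas, and use of a recorded seed are assumptions, not requirements of this guidance.

```python
import random

# Hypothetical KSA list grouped by content area (the grouping discussed in
# section 1.0); names and group quotas are illustrative only.
ksa_groups = {
    "plant systems": ["KSA-SYS-01", "KSA-SYS-02", "KSA-SYS-03", "KSA-SYS-04"],
    "reactor theory": ["KSA-THY-01", "KSA-THY-02", "KSA-THY-03"],
    "administrative": ["KSA-ADM-01", "KSA-ADM-02"],
}
quota = {"plant systems": 2, "reactor theory": 2, "administrative": 1}

def sample_exam_instance(groups, quota, seed=None):
    """Quasi-random sampling: every group is represented per its quota, while
    the specific KSAs drawn within each group vary between exam instances."""
    rng = random.Random(seed)
    return {g: sorted(rng.sample(ksas, quota[g])) for g, ksas in groups.items()}

# The sampling inputs and the resulting selection would be recorded as part of
# the examination-instance documentation.
instance = sample_exam_instance(ksa_groups, quota, seed=2023)
for group, ksas in instance.items():
    print(f"{group}: {ksas}")
```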

Documentation

The NRC staff should ensure that the applicant or licensee develops documentation with all necessary test specifications (e.g., the amount of time allowed for the testing, directions for the test takers, procedures to be used for administration, materials for test takers to use, scoring and reporting procedures, description of necessary hardware and software, etc.) and the basis for these test specifications.

Acceptance Criteria:

The NRC staff should ensure that:

  • The applicant or licensee provides documentation of the procedures used to determine KSA inclusion and exclusion on the examination [Criterion 2.5].
  • For each test/examination method, the applicant or licensee provides content specifications that provide a clear description of the jobs and tasks covered in the test methods and the specific content, including:

o the sampling methods/plan for determining inclusion on a specific examination instance (for example, the test might measure the most important work behaviors or KSAs, KSAs that are prerequisite to others, or a smaller set of KSAs used to predict critical outcomes (e.g., accidents).) [Criterion 2.6]

o a decomposition of the tasks into the knowledge, skills, cognitive processes, abilities, attitudes, or behaviors that are included in the test method, and which are excluded from the test method [Criterion 2.7]

o documentation on test length (Documentation will specify the number of questions allotted to different content areas based on the job analysis.) [Criterion 2.8]

o specification of the type of personnel (examinee population) for which the test is intended (i.e., RO, SRO, or GLRO) [Criterion 2.9]

o specification of intended uses of the test method (i.e., a clear statement of the purposes and applications for which the test is intended) [Criterion 2.10]

o format specifications that provide the testing strategy/format of items (e.g., written, computer-based, performance-based, etc.) and describe the rationale for the formatting that includes considerations of validity of the format and not simply ease of use [Criterion 2.11]

o evidence/supporting documentation that the test strategy aligns appropriately depending on the distinction of knowledge vs. skills [Criterion 2.12]

  • The applicant or licensee provides documentation with all necessary test specifications and the bases for these test specifications [Criterion 2.13]

Test Development

The following guidance, in combination with the guidance in NUREG-1021, Revision 12, Appendix A, Overview of Generic Examination Concepts, is intended to assist the NRC staff in determining if applicants and licensees followed adequate test/examination development processes.

For any new tests/examinations, the NRC staff should ensure that the applicant or licensee follows an effective test development process and provides a description of the test development process. Effective test development typically includes generating items (for written tests) or tasks (for performance tests), reviewing items using statistical analysis and/or qualitative review of items, assembling the test, establishing validity of the overall test, and developing documentation/test manuals.

The NRC staff should ensure that test development processes are effective and include the following:

  • Generating more questions or tasks than are needed for one version of the test.
  • Conducting a review of the pool of items for content quality, clarity, and any construct-irrelevant issues -

o Ideally, new test items or tasks are given to people who represent (to the extent possible) the target test population. Depending on the sample size, statistical analysis can assess some psychometric properties, such as item difficulty and the ability of an item to distinguish between different levels of test taker knowledge or skill (item discrimination) (refer to the validity sections of this document for detailed information on establishing validity of the entire test).

o Qualitative review of items may occur. This may include using a structured interview or think-aloud protocol to provide evidence that the desired cognitive processes are elicited. For extended response items or performance tasks, test developers should provide example responses that represent the various scoring levels.

  • After review, the test developer assembles items into one or more test forms (i.e., versions) as per the content and length specifications; this includes documenting that the items selected meet specifications in content and psychometrics. Content reviews may entail replacing items that are too similar or that have other negative characteristics (such as providing answers to other items). Multiple test forms can be generated.

  • Establishing the validity of the overall test (refer to the validity section of this document).
  • Verifying compliance with 10 CFR 53.780(b) or 53.815(b)(3)(iv), as applicable; the NRC staff considers the following list of general categories as the minimum set of KSAs needed to safely perform operator duties. Eliminations of any of these categories are adequately justified.

o reactor theory and thermodynamic principles
o plant systems and components
o reactivity management and manipulations
o radiation control and safety
o emergency, abnormal, and normal operations
o administrative requirements and conditions of the facility license
o technical specifications

The NRC staff should ensure that Test Plan documentation includes the following:

Description of what important work behaviors, activities, and worker KSAs are included in the domain, how the content of the work domain is linked to the test development process, and why certain parts of the domain were or were not included in the testing method; this information should be based on accurate and thorough information about the work, including analysis of work behaviors and activities, responsibilities of the job incumbents, and/or the KSAs prerequisite to effective performance on the job.

Evidence that demonstrates that the test adequately samples and is linked to the important work behaviors, activities, and/or worker KSAs; the documentation of the processes used to develop the test is the primary evidence that the scores will be sufficient to predict on-the-job performance. The NRC staff should understand that the sufficiency of the match between the test and the work domain is a matter of professional judgment based on the evidence provided. However, the documentation of the process should include the criteria and the rationale for the inclusion/exclusion criteria.

A rationale for content sampling (i.e., selecting the particular content to include) that is based on the professional judgment of the testing professional and an analysis of work that details important work behaviors and activities, important components of the work context, and KSAs needed to perform the work, if sampling is required; true random sampling of the content of the work domain is usually not feasible or appropriate.

Consequently, a systematic, quasi-random sampling procedure should be implemented to ensure that appropriate content sampling and examination security are maintained.

The rationale underlying the content sampling should be documented in a Test Plan specifying which KSAs are to be measured by which assessment methods. If sampling is not performed because the content domain is small enough to be tested on every examination, the applicant or licensee should state that and explain how examination security concerns related to predictability are addressed.
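The statistical item review mentioned earlier in this subsection (item difficulty and the ability of an item to distinguish between stronger and weaker examinees) can be computed from pilot-test responses. The sketch below uses fabricated response data for illustration, and the uncorrected point-biserial style correlation is one common choice rather than a method required by this guidance.

```python
from statistics import mean, pstdev

# Fabricated pilot responses: each row is one examinee, 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 0, 0, 1],
    [1, 0, 1, 1, 1],
]

def item_difficulty(rows, item):
    """Proportion of examinees answering the item correctly (higher = easier)."""
    return mean(row[item] for row in rows)

def item_discrimination(rows, item):
    """Point-biserial style correlation between an item score and the total score;
    positive values indicate the item separates stronger from weaker examinees."""
    totals = [sum(row) for row in rows]
    items = [row[item] for row in rows]
    mi, mt = mean(items), mean(totals)
    si, st = pstdev(items), pstdev(totals)
    if si == 0 or st == 0:
        return 0.0  # an item answered identically by everyone carries no information
    cov = mean((i - mi) * (t - mt) for i, t in zip(items, totals))
    return cov / (si * st)

for item in range(len(responses[0])):
    print(f"item {item}: difficulty={item_difficulty(responses, item):.2f}, "
          f"discrimination={item_discrimination(responses, item):.2f}")
```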

Acceptance Criteria:

The NRC staff should ensure that:

  • The applicant or licensee provides a description of the test development process, to include generating items, reviewing items using statistical analysis and/or qualitative review of items, evidence of overall test validity, and the conditions, directions, procedures, and materials for taking the test [Criterion 2.14].
  • The applicant's or licensee's test development process describes how it will be ensured that, at a minimum, topics from the following list of general categories of knowledge and abilities will be sampled during the course of examinations. Elimination of any of these categories is adequately justified.

o reactor theory and thermodynamic principles
o plant systems and components
o reactivity management and manipulations
o radiation control and safety
o emergency, abnormal, and normal operations
o administrative requirements and conditions of the facility license
o technical specifications [Criterion 2.15]

  • The applicant's or licensee's Test Plan documentation includes the following:

o what important work behaviors, activities, and worker KSAs are included in the domain, how the content of the work domain is linked to the testing procedure, and why certain parts of the domain were or were not included in the testing method [Criterion 2.16]

o evidence that demonstrates that the test adequately samples and is linked to the important work behaviors, activities, and/or worker KSAs, including the criteria and the rationale for the inclusion/exclusion criteria; and [Criterion 2.17]

o a rationale for a systematic, quasi-random content sampling procedure based on the professional judgment of the testing professional and an analysis of work that details important work behaviors and activities, important components of the work context, and KSAs needed to perform the work; if sampling is not performed due to small content domain size, an explanation of how examination predictability is addressed should be included [Criterion 2.18]

3.0 Examination Validation

The NRC staff should ensure that the applicant or licensee has a plan to validate the new examination's ability to assess operator KSAs and/or task performance capability.

Rationale for Validation Plan

The NRC staff should ensure that the applicant or licensee provides a description of the type of evidence collected to support test validity. The degree and type of evidence needed for the test will depend on SME judgment, which should be based on the following factors:

  • how the test results will be interpreted
  • how the test results will be used
  • the type and degree of harm that might result if the test is not valid
  • the probability that incorrect inference can be corrected before harm occurs
  • availability of research on similar examinations taken by similar personnel for similar reasons
  • sufficiency of qualified individuals in validation samples
  • practicality in data collection
  • availability of appropriate criteria for conducting criterion-based validation studies

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee provides for the validity plan to be implemented prior to the first administration of a test developed under the examination program. As part of the examination program, applicants and licensees should monitor and collect data following the validity plan over training cycles. If significant deviation is identified from the estimated risk associated with the type/degree of harm that may result, or from the estimated probability that an incorrect inference can be corrected before that harm occurs, NRC staff should ensure that applicants and licensees have processes in place to document the deviation, refine the validity plan, and re-assess the examination [Criterion 3.1].

Validity of the Test

As part of the validity plan, the NRC staff should ensure that the applicant or licensee provides evidence that the test works and will work as intended. The validity of a test can be demonstrated through one or more kinds of validity for each test, including content validity, predictive validity, and/or concurrent validity. The validity plan should require content validity. Ideally, predictive and concurrent validity would be demonstrated as well; however, the practical constraints stemming from the small examinee population size inherent to the nuclear domain may limit the ability of applicants and licensees to provide this information (refer to the section on criterion validity for more information).

Content Validity

A test with content validity has a strong linkage between the content of the test and key work behaviors and activities. The aim is for the test to measure relevant KSAs required for the job, while not including irrelevant knowledge and skills.

A content validation study provides evidence that the testing method samples the worker KSAs and/or work behaviors and activities that are necessary for the job. SMEs are an integral part of the test development team.

The NRC staff should verify that the validity plan includes documentation about the SMEs involved in test development, including that:

The SMEs have in-depth knowledge of the work, the responsibilities of the job incumbents, and/or the KSAs necessary for effective job performance. Documentation on the qualification of each SME should include a description sufficient for the reader to infer that the SME was knowledgeable about factors such as shift, location, type of equipment used, software and hardware, etc.

SMEs have defined the work domain and identified the key work behaviors, activities, and worker KSAs. SME judgments are important for both defining the content of the test and the methods used for testing.

Documentation includes details on how SME judgments were used to determine content validity. That is, if SME judgments were ratings on the match between test content and work requirements, documentation includes the criteria and rating procedures used for each aspect. Evidence of rating standardization should also be included.
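Documentation of SME ratings on the match between test content and work requirements might also summarize how consistent the raters were. The sketch below is a hypothetical illustration only: the item identifiers, the 1-5 relevance scale, and the flagging thresholds are assumptions, and the simple spread statistic stands in for more formal agreement indices.

```python
from statistics import mean, pstdev

# Hypothetical SME relevance ratings (1 = not relevant to the job, 5 = essential)
# for each draft test item; item IDs, raters, and the scale are illustrative.
relevance_ratings = {
    "Q-001": [5, 5, 4],
    "Q-002": [2, 3, 2],
    "Q-003": [4, 5, 5],
    "Q-004": [3, 1, 2],
}

def summarize(ratings):
    """Per-item mean relevance and rating spread; a large spread may signal a need
    to revisit rating standardization or the item itself."""
    return [(item, round(mean(vals), 2), round(pstdev(vals), 2))
            for item, vals in ratings.items()]

# Flagging thresholds below are arbitrary placeholders for illustration.
for item, avg, spread in summarize(relevance_ratings):
    flag = "review" if spread > 1.0 or avg < 3.0 else "ok"
    print(f"{item}: mean relevance={avg}, spread={spread} -> {flag}")
```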

Criterion Validity

There are two general approaches to estimate the correlation between test scores and criterion scores: predictive and concurrent validation strategies. Predictive validation strategies are the most accurate but can present practical and ethical concerns. Concurrent validation strategies are more practical procedures to assess validity.

Predictive Validation Strategies

From a scientific standpoint, predictive validation is the simplest and most accurate method to determine validity. However, this approach is not practical or realistic for real world application, as it requires the population in the validity study to be similar to the population of individuals who would apply for the job.

The general process to complete a predictive validation study is:

Step 1: Obtain test scores from a group of applicants and hire all the applicants.

Step 2: At some point in time, obtain performance measures of those hired and correlate these performance measures with the prior test scores.

In other words, a predictive validation study requires random hiring, which sets up an ethically risky position of selecting some people who are very likely to fail (and potentially create costly and/or dangerous situations in the process of failing). However, this is the strategy necessary for a true predictive validity approach. Consequently, a true predictive validity approach is not suitable for demonstrating test validity because public health and safety are paramount.

Concurrent Validation Strategies

Concurrent validation involves obtaining both test scores and criterion scores (e.g., a measure of job performance) in an intact, preselected population (e.g., existing employees, students accepted into college, pilots who have completed a certain level of training, etc.) and correlating the two sets of scores (i.e., correlating the test scores with the criterion scores). The fundamental difference between this approach and the predictive validation approach is the lack of a random sample. That is, the population of workers at a plant, students accepted into college, pilots with several certificates, etc., is narrower and has much less variability than the population who initially applied for but were not selected or did not choose to continue through the hiring process. From a statistical standpoint, this generates a restriction of range in the scores which, in turn, impacts the correlation between the test scores and the criterion measures. Thus, the correlation between test scores and criterion scores in a pre-selected (not random) sample will likely not be the same as for the broader population in general. While test theory suggests that concurrent validation would underestimate this correlation between test scores and criterion scores in the broader population, studies suggest concurrent validation studies are often sufficiently similar to predictive validation studies.
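The restriction-of-range effect described above can be demonstrated with simulated data: the correlation between test scores and a criterion measure computed only on a preselected, high-scoring group is typically smaller than the correlation in the full applicant population. The score distributions, noise level, and selection rule below are arbitrary illustration values, not data from any actual study.

```python
import random
from statistics import mean, pstdev

random.seed(1)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    sx, sy = pstdev(xs), pstdev(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sx * sy)

# Simulate an applicant population: the criterion (job performance) is related
# to the test score plus noise.  All numbers here are arbitrary illustrations.
scores = [random.gauss(70, 10) for _ in range(5000)]
performance = [s + random.gauss(0, 12) for s in scores]

# Full-population correlation (what a true predictive study would estimate).
r_full = pearson(scores, performance)

# Concurrent-style sample: only the top scorers are "hired" and observed on the job.
selected = [(s, p) for s, p in zip(scores, performance) if s >= 80]
r_restricted = pearson([s for s, _ in selected], [p for _, p in selected])

print(f"correlation in full applicant pool: {r_full:.2f}")
print(f"correlation in preselected (restricted) group: {r_restricted:.2f}")
```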

If a concurrent validity study is used in a setting that is very selective (i.e., when only those with high test scores are included), coefficients will generally underestimate the validity of the test, and the underestimation may be severe. Considering the practical limitations placed on the applicant or licensee in demonstrating criterion validity at the time of the first administration of these examinations, the NRC staff acknowledges for the purposes of review that the validity plan may treat criterion validity as an ongoing goal as more data is collected on the examinations.

Interpreting Validity Coefficients

Correlations between test scores and criterion measures could range in absolute value between 0.0 and 1.0. The higher the criterion-related validity, the more accurate the estimate based on the predictor measure can be. In practice, a good, carefully chosen test is unlikely to show a correlation coefficient with an important criterion greater than 0.5, and validity coefficients greater than 0.3 are not common in applied settings. The levels of criterion validity achieved by most tests rarely exceed 0.6 or 0.7. The validity coefficient in itself does not provide a complete measure of the effects of the tests on decisions.

Evaluating a Predictive or Concurrent Validation Study

This section outlines the information needed to evaluate criterion validity. As noted, however, for the application approval process, predictive validity is very unlikely to constitute an acceptable approach for the reasons discussed previously.

A predictive or concurrent validation study includes a written description of the study method, containing:

  • a set of participants who are representative of the desired population (i.e., operators)
  • well-defined, reasonable standards of quantitative assessment
  • procedure that describes how the data is collected
  • statistical test results

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee has a program that ensures that the tests will work as intended and that the tests will demonstrate one or more kinds of validity (i.e., content validity, predictive validity, and/or concurrent validity) [Criterion 3.2].

4.0 Scoring Specifications

A key aspect of examination development is determining the scoring specification.

The scoring specification should be criterion referenced. Criterion-referenced scoring focuses on the absolute score relative to a defined level of competence or criterion, and this method is appropriate for operator licensing testing. Norm-referenced scoring, by contrast, ranks a score for an individual within a distribution of scores or compares it with the average performance of test takers in a reference population (e.g., based on age, job classification, etc.) and is not appropriate for operator licensing.

The scoring specification should describe how each test item is scored and how the item scores are combined to produce an overall score.

For procedures where ratings are based on scorer observation, scoring specifications should describe scorer qualifications, scorer training, identification and resolution of rating discrepancies, and steps to eliminate bias in judgments.
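A scoring specification that combines item scores into an overall, criterion-referenced result could take many forms; the sketch below shows one minimal arithmetic example. The item identifiers, point values, and the 80 percent passing fraction are placeholders that an applicant or licensee would define and justify in its own specification (80 percent is used here only because it is later noted as an accepted value in NUREG-1021).

```python
# Hypothetical scoring specification: each item has a maximum point value, an
# examinee earns points per item, and the overall result is compared with a
# criterion-referenced passing fraction (placeholder value).
PASSING_FRACTION = 0.80

item_max_points = {"Q-001": 1, "Q-002": 1, "Q-003": 2, "SIM-01": 4}
examinee_points = {"Q-001": 1, "Q-002": 0, "Q-003": 2, "SIM-01": 3}

def overall_score(earned, maximum):
    """Overall score as a fraction of available points; criterion referenced,
    i.e., compared against a fixed standard rather than against other examinees."""
    return sum(earned.values()) / sum(maximum.values())

score = overall_score(examinee_points, item_max_points)
result = "pass" if score >= PASSING_FRACTION else "fail"
print(f"overall score = {score:.0%} -> {result}")
```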

Cut-Off Scores

When using cut-off scores, the passing score should be defined clearly and be accompanied by a supporting rationale. Criteria for reasonable cut-off scores should include the following:

  • Reflects the minimum qualification necessary to perform the job in a safe and efficient manner.
  • Represents criteria consistent with a job analysis.
  • Includes evidence that a thoughtful approach was used in determining the cut-off score and included both SME judgments about the cut-off score and evidence of the validity of the test overall.
  • Uses a content validity approach (often referred to as criterion referenced cut-off scores in which the criterion is SME judgment):

o The three approaches for evaluating the test items are -

SMEs examined the test items and rated the likelihood that a barely qualified or barely competent person could answer that question correctly. The cut-off score is the average of the ratings (a brief sketch of this averaging appears after this list).

SMEs may have rated the relative importance / relevance of each item.

SMEs may have rated the capability of barely qualified individuals to identify the correctness of the possible responses (i.e., options in a multiple-choice test).

o Note that a lack of interrater agreement is a disadvantage of all three methods. In organizations where scores are based on the subjective judgment of an evaluator, scoring should be limited to a single evaluator or a constant group of evaluators for all candidates to reduce error among evaluators and to increase reliability and validity within the scoring process.

  • Does not use a norm-referenced approach. Norm-referenced methods to establish cut-off scores are based on an entire distribution of test scores (e.g., mean + one standard deviation). The norm-referenced approach is not acceptable for situations requiring establishment of minimum competency, subject matter mastery, licensure, and other situations.
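The first approach listed above (SMEs rate the likelihood that a barely qualified person answers each item correctly, and the cut-off is the average of the ratings) is often called the Angoff method in the standard-setting literature. The sketch below shows only the arithmetic; the item identifiers and SME probability judgments are fabricated for illustration.

```python
from statistics import mean

# Fabricated SME probability judgments: for each item, each SME estimates the
# probability that a barely qualified (minimally competent) candidate answers
# correctly.  Items, raters, and values are illustrative only.
sme_item_ratings = {
    "Q-001": [0.90, 0.85, 0.80],
    "Q-002": [0.60, 0.55, 0.70],
    "Q-003": [0.75, 0.80, 0.70],
    "Q-004": [0.50, 0.45, 0.55],
}

# Expected score of a barely qualified candidate on each item = mean SME rating;
# the examination cut-off score is the average over items (expressed here as a
# fraction of the available points).
item_expectations = {item: mean(vals) for item, vals in sme_item_ratings.items()}
cut_off = mean(item_expectations.values())

print("per-item expectations:",
      {k: round(v, 2) for k, v in item_expectations.items()})
print(f"criterion-referenced cut-off score: {cut_off:.0%} of available points")
```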

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee clearly defines appropriate cut-off scores and their bases. If a cut-off score of 80% is used, then no additional basis needs to be provided as 80% has been determined to be an acceptable cut-off score for power reactors in NUREG-1021. If a cut-off score of < 80% is used, then the basis for that cut-off score is provided [Criterion 4.1].

5.0 Reliability of the Test

Reliability is a necessary, but not sufficient, requirement for validity. If a test is reliable, there will be a low amount of measurement error in the test. If the trainees repeated the test, their results would be relatively the same. This particular type of reliability is called test-retest reliability.

Assessing individuals at two different time points, even with use of alternate forms of the test, is not always feasible or desirable. Thus, reliability is usually assessed by examining the internal consistency of the items. When examining internal consistency, the researcher assesses how the items of the test relate to one another. Split-half reliability correlates one half of the test to the other half. It is still possible, through random chance, that the test items of poorer quality could be grouped together in one half while the better items are grouped together in the other. To minimize this item-related bias, typically, Cronbach's alpha (the average of all possible split-half reliabilities) is reported. Statistical software can quickly generate all possible groupings, compute the correlation between the two halves for each grouping, and calculate the average of all these correlations. Essentially, each split-half reliability is analogous to a dart on a dart board; if all the darts strike close to the center, then the test is internally consistent or reliable.
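Cronbach's alpha can also be computed directly from item-level scores using the standard variance-based formula, which is equivalent in spirit to the split-half averaging described above. The sketch below uses fabricated response data for illustration only.

```python
from statistics import pvariance

# Fabricated item-level scores: rows = examinees, columns = items, 1/0 = correct/incorrect.
responses = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1],
    [1, 0, 1, 0, 1],
]

def cronbach_alpha(rows):
    """Cronbach's alpha via the standard variance formula:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(rows[0])
    item_vars = [pvariance([row[i] for row in rows]) for i in range(k)]
    total_var = pvariance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```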

The following information on how test reliability is determined should be included, as appropriate:

  • information on the reliability of a test (i.e., the test manual provides data adequate to permit judgment on whether scores are sufficiently dependable for the recommended use)
  • sufficient evidence to support a thorough judgment of the reliability of a test (e.g., parallel form reliability, internal consistency/inter-item relations, test-retest reliability, interrater reliability, etc.)
  • findings that are adequate to justify use of a test for operator licensing-related purposes
  • information confirming that the participants who supply the data used to compute reliability coefficients are appropriately suited to support operator licensing examination development-related work and sufficiently described in the documentation to permit judgment of whether the data derived from them should apply
  • confirmation that appropriate procedures for computing the reliability coefficients are used
  • reliability data presented in the conventional statistical form of product-moment correlation coefficients and standard errors of measurement, with standard errors presented for different levels of performance
  • for tests with alternate versions, data allowing for comparisons between the versions,
  • evidence reported for internal consistency (or inter-item correlations)
  • confirmation that the test manual provides evidence of the stability of test performance over time (i.e., test-retest reliability)

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee provides information on the reliability of tests and supports that information with a reasonable description of the methodology used to determine its adequacy [Criterion 5.1].

6.0 Test Manual

A comprehensive test manual should accompany each examination program that is submitted for programmatic approval under 10 CFR 53.780(b) or 53.815(b)(3)(iv), as applicable.

Acceptance Criteria:

The NRC staff should ensure that:

  • The applicant or licensee provides a comprehensive test manual that covers each type of test. The manual includes information on the purpose of the test, how the test was developed, and how to administer and score the test [Criterion 6.1].
  • Test manuals include the following:

o a clear statement of the purposes and applications for which the test is intended [Criterion 6.2]

o clearly defined construct(s) the test intends to measure, the group(s) for which the test is intended, the specific application of the test [Criterion 6.3]

o justification of the relevance of the test content for the construct(s) to be measured [Criterion 6.4]

o a summary of research findings for the test [Criterion 6.5]

o dates of any revisions to the manual, and any additional statistical tests for the revised version(s) [Criterion 6.6]

o information on revisions of the test instruments associated with technology changes [Criterion 6.7]

o any limitations of the test [Criterion 6.8]

o a description of the expertise required to administer and interpret the test [Criterion 6.9]

o the time needed to administer the test [Criterion 6.10]

o the time allowed for testing [Criterion 6.11]

o a description of all -

procedures to be used for administration (and allowed variations) [Criterion 6.12]

materials for test takers to use [Criterion 6.13]

scoring and reporting procedures, and description of necessary hardware and software [Criterion 6.14]

o clear and complete instructions to administer the test and interpret the results properly [Criterion 6.15]

o all necessary forms and any necessary aids to interpreting the test results [Criterion 6.16]

o case descriptions indicating how test scores can be interpreted [Criterion 6.17]

o a clearly defined domain and discussion of the procedures used for sampling from that domain [Criterion 6.18]

o discussion of any potential threats to the valid use of the test scores [Criterion 6.19]

  • For a specific test instance, documentation will include the following:

o name of the test, authors of the test (by name and position), publisher of the test and the date published, and existence of alternate forms [Criterion 6.20]

o dates of any revisions to the test and any additional statistical tests for the revised version(s) [Criterion 6.21]

o information on revisions of the test associated with technology changes [Criterion 6.22]

o a report on the evidence of the validity of the specified use of the test [Criterion 6.23]

The documentation should, however, avoid referring to correlations between items and total test score as evidence of validity [Criterion 6.24].

o a description of criterion variables, adequacy of criterion variables, and aspects of criterion performance that are not adequately reflected in the criterion measure(s) when criterion validity is reported [Criterion 6.25]

o statements expressing relationships that are presented in quantitative terms so that the reader can tell how much confidence to attach to them (e.g., "the correlation with Variable X is 0.55" rather than "the test correlates substantially with Variable X") [Criterion 6.26]

  • For computer-based testing, the test manual also includes the following:

o information on any necessary computer software installation [Criterion 6.27]

o information on operation of the software [Criterion 6.28]

o provision for technical support [Criterion 6.29]

7.0 Additional Characteristics of High-Quality Test Materials

Some characteristics of high-quality test materials apply specifically to written and to computer-based tests. These are described below.

High-quality Written Test Materials

NUREG-1021, Form 4.2-2, Question Development Checklist, can be used as a job aid during the process of developing and reviewing written examination questions. NUREG-1021, Appendix B, Examples of Written Examination Questions, provides examples that illustrate psychometric errors to avoid.

Additional characteristics include the following:

  • correctly formulated items (see NUREG-1021, Appendix B)
  • standardized test items
  • the items, test booklet, answering scales, and answer form follow user-centered4 design principles so that errors can be avoided when completing the test
  • clear and complete instructions for the test
  • a scoring system that is consistent and error resistant, including:

o an objective scoring system

o if the test is being scored by raters or observers, a clear and complete system for assessment and observation

High-quality Computer-based Test Materials

High-quality computer-based test materials include the following:

  • description of scoring (i.e., computerized or an objective scoring system)
  • documentation describing whether the test items are standardized or adaptive
  • if adaptive scoring, documentation describing the following:

o the decision rules

o how the test meets the provisions for uniformity in licensing

  • error resistant user interface
  • clear and complete instructions for the administrator and examinee
  • sufficiently secure test

8.0 Other Examination Program Considerations

Additional items that are applicable to any examination program can be found in the following sections of NUREG-1021:
  • Section 1.3, Examination Security
  • Section 2.1.G, Guidelines for Freezing Plant Procedures
  • applicable portions of Section 2.2, Applications, Medical Requirements, and Waiver and Excusal of Exam and Test Requirements for specifically licensed operators
  • Section 2.3, Reviewing and Approving Operator Licensing Initial Examinations, for specifically licensed operators
  • applicable portions of Section 5.1, Issuing Operator Licenses and Post-Examination Activities, for specifically licensed operators
  • Section 5.2, Application Denials and Requests for Informal NRC Staff Review, for specifically licensed operators
  • Section 5.3, Maintaining, Changing, and Renewing Operator Licenses, for specifically licensed operators
  • Appendix A, Overview of Generic Examination Concepts

4 User-centered refers to a design that is based on the way that people will use something and what they will do with that design.

Acceptance Criteria:

The NRC staff should ensure that the applicant or licensee documents how it will meet examination security requirements under 10 CFR Part 53.780(d) or 53.815(d) as applicable. If the facility will have specifically licensed operators, it will also have procedures for preparing and submitting applications, including any waivers and excusals, as well as procedures for ensuring medical requirements are met. Facilities with specifically licensed operators will also have procedures for maintaining, updating, and renewing operator licenses [Criterion 8.1].

The NRC staff should also ensure that the applicant or licensee documents how examinees will be briefed on the examination process prior to examination administration, similar to the guidelines provided in Section 1.2 of NUREG-1021 but tailored to the specific examination process developed for the facility [Criterion 8.2].

Preparing for Operator Licensing Initial Examinations

The NRC staff should ensure that the applicant's or licensee's examination program reflects the following, as applicable:

  • testing a representative sample of the KSAs needed to safely perform RO and SRO or GLRO duties, to include both the examination methods and criteria to be used to assess passing performance, as required by 10 CFR 53.780(b)(1) and 53.780(c)(2)(i) or 10 CFR 53.815(b)(3)(iv) and 53.815(b)(3)(v), as applicable
  • that the program will be approved by the NRC prior to its use in developing or administering initial examinations and requalification examinations for either ROs and SROs or GLROs as described under 10 CFR 53.730(g) or 53.815(b)(6), as applicable
  • that the program will be maintained according to the requirements of 10 CFR 53.1565 or 53.6065, as applicable
  • that prepared examinations for RO and SRO applicants will be made available to the NRC for review and approval in advance of their administration, as required by 10 CFR 53.780(b)(2)
  • that sufficient advance notification will be provided to the NRC to allow for a representative of the NRC to be afforded the opportunity to be present during examination administration, as required by 10 CFR 53.780(b)(3), 53.780(c)(2)(ii)(B), and 53.815(b)(3)(v), or to administer the examination, as required by 10 CFR 53.780(b)(3)

Acceptance Criteria:

The NRC staff should ensure that the applicant or licensee documents how it will ensure that the facility's examination program

  • tests a representative sampling of the KSAs to safely perform licensed duties [Criterion 8.3]

  • is approved by the NRC prior to use [Criterion 8.4]
  • will be maintained [Criterion 8.5]
  • contains provisions for how the prepared examinations for RO and SRO applicants will be made available to the NRC for review and approval prior to administration, if applicable [Criterion 8.6]
  • contains provisions for giving sufficient advance notification to the NRC to allow for an NRC representative to be present during examination administration and, as applicable, to also administer the examination [Criterion 8.7]

9.0 Simulation Facilities

Proposed 10 CFR Part 53 includes requirements that pertain to the use of simulation facilities for the purposes of licensed operator training, experience requirements, and examinations. Such requirements could apply to on-site or off-site simulation facilities, and to simulation facilities operated by the facility licensee or by a third party, depending upon the individual circumstances of a given applicant or licensee. The following supplemental information can help inform the development of an applicant's or licensee's simulation facility for use in training, meeting experience requirements, or the conduct of examinations consistent with the proposed requirements in 10 CFR 53.780(e) or 53.815(e), as applicable.

The term simulation fidelity refers to the level of realism of a simulation in terms of its physical, functional, task, and psychological-cognitive aspects. For the purposes of human-in-the-loop5 simulation in non-large light water reactor technologies:

  • Physical fidelity is the degree to which the simulator looks, feels, and is designed to replicate the actual environment.
  • Functional fidelity is how well the simulator functions and provides stimuli to reflect the actual environment.
  • Task fidelity is the replication of tasks and maneuvers executed by the user.
  • Psychological-cognitive fidelity is how well the simulator replicates psychological and cognitive factors (e.g., communication, decision making, and situational awareness).

Digital technologies, including desktop computers, augmented reality, and virtual reality, use varying levels of simulation realism to represent task scenarios that reflect the actual job situation.

The NRC staff should ensure that, when using new simulation technologies, the applicant or licensee documents and demonstrates how the simulation provides a level of fidelity sufficient to assess the intended knowledge and skills as required by 10 CFR Part 53.780(e) or 53.815(e), as applicable.

To best replicate scenarios for non-large light water reactor facilities, simulation facilities should have the same cognitive requirements as the real environment. If a simulation facility is less cognitively demanding than the real environment, it may not be as useful a tool in training or examination.

5 Human-in-the-loop simulations involve humans taking part in interactive simulations in a manner that allows for human actions to influence the outcome of the simulation.

Acceptance Criteria:

The NRC staff should ensure that:

  • If a simulation facility is used in examinations, the applicant or licensee documents how the simulation facility provides a level of fidelity and cognitive requirements sufficient to assess the intended knowledge and skills of operators [Criterion 9.1].
  • If a simulation facility is used in examinations, the applicant or licensee documents policies and procedures to demonstrate compliance with simulator fidelity requirements under 10 CFR Part 53.780(e) or 53.815(e) as applicable [Criterion 9.2].

A crucial aspect of simulation-based assessment is the structure, or linking together, of task-based content, scenario events, performance measures, and feedback. An integrated approach is essential to demonstrate the validity of a simulation-based examination. For each proposed simulation examination, the applicant or licensee will describe the purpose(s) of the examination, the definition of the construct or domain measured, the intended examinee population, the measurement tools, and the interpretations for intended uses, consistent with the methods for content determination and validation described in this document. This includes:

Identifying the Jobs and Tasks

  • Provide a clear description of the jobs and tasks covered in the examination and the specific content. The rationale for the tasks covered should be documented in an examination plan.
  • Provide a task decomposition in terms of the knowledge, skills, cognitive processes, attitudes, or behaviors that are included in the examination and that are excluded from the examination (i.e., identify what is in the scope of the examination).
  • Specify the type of personnel (examinee population) for which the examination is intended.
  • Specify the intended uses of the examination (i.e., a clear statement of the purposes and applications for which the examination is intended).
  • Provide evidence/supporting documentation that the simulation strategy is appropriate based on the distinction of knowledge versus skills.

Acceptance Criteria:

The NRC staff should ensure that:

  • If a simulation facility is used in examinations, the applicant or licensee provides content specifications that clearly describe the jobs and tasks covered in the simulation and the specific content, as follows:

o Tasks are decomposed into both those knowledge, skills, cognitive processes, attitudes, or behaviors that are included in the examination and those that are excluded from the examination [Criterion 9.3].

o The type of personnel (examinee population) for which the examination is intended is specified [Criterion 9.4].

o The intended uses of the examination (i.e., a clear statement of the purposes and applications for which the examination is intended) are specified [Criterion 9.5].

o Evidence and/or supporting documentation is provided showing that the simulation strategy is appropriate based on the distinction made between knowledge and skills [Criterion 9.6].

Identifying the Scenario Events

For any scenario that is part of a simulation facility examination, the applicant or licensee should provide a list of scenario events that link to the overarching tasks. See NUREG-1021 ES-3.3 and ES-3.4 for examples of possible types of events. The applicant or licensee should include documentation describing how the scenario events give the examinee the opportunity to demonstrate the KSAs necessary to perform the task. Additionally, the applicant or licensee should include documentation related to acceptable qualitative and quantitative scenario attributes that, if met, would ensure that the overall scenario and individual scenario events are appropriately realistic and at a satisfactory difficulty level. See subsequent sections of this ISG for considerations regarding scenario events and the examination criteria determination.

Acceptance Criterion:

The NRC staff should ensure that, if a simulation facility is used in examinations, the applicant or licensee provides policies and procedures to link scenario events to the operator tasks and to document that information for scenario-based examinations [Criterion 9.7].

Identifying the Metrics

For each scenario event, the applicant or licensee will provide event-based metrics to assess the degree to which the examinee achieved the task objectives. The precise event-based performance measures will assess how effectively the examinee is demonstrating the desired knowledge and skills. The metrics will be designed to facilitate ease of use by the examination administrator to avoid errors in grading.
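As one purely hypothetical illustration of the linkage described above (field names, identifiers, and the example task are assumptions, not terms from this guidance or from NUREG-1021), a scenario event, its parent task, the KSAs it exercises, and its event-based performance measure could be captured in a structured record such as the following sketch:

```python
# Illustrative sketch only: one possible structure for linking a scenario
# event to its parent task, the KSAs it exercises, and an event-based
# performance measure. Field names and example values are hypothetical.
from dataclasses import dataclass

@dataclass
class ScenarioEvent:
    event_id: str              # identifier within the scenario
    parent_task: str           # overarching job task the event supports
    ksas_exercised: list[str]  # KSA catalog identifiers sampled by the event
    success_criterion: str     # observable, event-based performance measure
    grading_notes: str = ""    # guidance for the examination administrator

# Hypothetical example entry for a scenario-based examination.
event = ScenarioEvent(
    event_id="E-03",
    parent_task="Respond to loss of forced primary flow",
    ksas_exercised=["KSA-112", "KSA-147"],
    success_criterion=(
        "Examinee diagnoses the flow reduction and enters the applicable "
        "abnormal operating procedure within the validated time window."
    ),
    grading_notes="Record time of diagnosis and procedure entry; note any examiner cues.",
)
print(event.event_id, event.parent_task)
```

Whatever form the applicant or licensee chooses, the essential point is that each scenario event carries an explicit link to a task, to the KSAs being assessed, and to an observable, gradable performance measure.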

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee provides policies and procedures to develop event-based metrics for scenario-based examinations [Criterion 9.8].

Feedback

The applicant or licensee should describe the process by which the examiner will integrate the performance results and provide feedback to the examinee on their performance in terms of scenario events that link to the job tasks.

10.0 Administering Operating Tests

The NRC staff should ensure that applicants or licensees have examination procedures similar to those found in Section 3.5 of NUREG-1021. Qualified facility staff (e.g., certified instructors or licensed personnel) may administer the examinations and should follow procedures similar to those followed by NRC examiners. For operator licensure, the applicant or licensee should propose a schedule that will protect examination security while providing an efficient license examination process.

The applicant or licensee should inform the NRC of examination schedules. The applicant or licensee should support the administration of the examination by providing the following as needed:

  • personnel to operate simulation facility equipment
  • surrogate operators
  • monitors
  • approved examination materials
  • examination administrators

The NRC staff should ensure that the applicant or licensee has measures in place to ensure that examiners behave in accordance with NRC examination and integrity codes of conduct. These measures should be documented as part of the facility examination program to ensure examination integrity, as required by 10 CFR 53.780(d) or 53.815(d), as applicable.

Applicants and licensees must retain examination administration records in accordance with the relevant requirements of 10 CFR 53.780 or 53.815, as applicable, and NRC staff should ensure that this is reflected in the operator examination plan.

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee documents how to administer the operating test portion of the examination while maintaining examination security and examiner codes of conduct to ensure that test integrity is maintained [Criterion 10.1].

11.0 Examination Program Change Management Process

The NRC staff should ensure that the examination program documents the change management process and specifies both which changes to approved examination programs can be made without prior NRC approval and which changes require prior NRC approval. Changes to the approved initial and requalification examination programs for ROs and SROs, and to the approved requalification examination program for GLROs, must be consistent with the regulatory requirements of either 10 CFR 53.1565 or 53.6065, as applicable. Generally, the applicant or licensee should specify that changes require prior NRC approval if they:

  • require an exemption from a regulation
  • require a change to technical specifications
  • negatively impact examination security/integrity
  • negatively impact consistent examination administration
  • negatively impact consistent examination evaluation

Generally, the applicant or licensee should specify that the following changes do not require prior NRC approval:
  • minor editorial changes (such as to correct typos or to provide clarification)
  • administrative changes associated with test proctoring
  • changes or deletions to validity determination (with the exception of content validity)
  • refinements to the validity plan based on test program results
  • updating SME lists
  • documenting a new validation or reliability study using the same process as that previously approved, provided that it does not result in substantive changes to the examination program
  • test manual updates in the areas of research findings, reporting/scoring procedures (such as hardware or software), forms, validity report, software for computer-based training, or technical support information
  • changes associated with Section 7.0, Additional Characteristics of High-Quality Test Materials, of this guidance
  • changes to guidance associated with freezing procedures
  • changes related to simulation facilities that are not used for the training, examination, or meeting of experience requirements for ROs and SROs
  • changes to simulation facility policies and procedures (provided that fidelity is maintained)
  • updating evidence/documentation that the simulation strategy aligns with the distinction between knowledge and skills, provided that no other test changes are recommended
  • changes to the feedback mechanism addressed in Section 9.0, Simulation Facilities, of this guidance

Changes other than those described above should generally be identified as requiring NRC approval prior to implementation to ensure that there is no impact on the consistent, valid administration of the examination program.

As an example of the proper implementation of an examination program's change management process, although adding an item to a KSA list may appear to be a conservative change that should not require prior NRC approval, a closer analysis may indicate that such an addition could affect the sample plan and Test Plan with respect to the number of questions being asked. If such a KSA were added without evaluating these impacts, it could skew the sample plan and cause more important KSAs to be tested at a reduced frequency, thereby diluting the testing pool with a KSA of lesser importance. If a significant number of such KSAs were added, a revision to the number of items on the test might be necessary to ensure that appropriate sampling is maintained.
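To make the arithmetic behind this example concrete, the sketch below uses purely hypothetical numbers (the test length, KSA pool sizes, and the assumption of a uniform sample plan are all illustrative, not values from this guidance) to show how adding lower-importance KSAs to a fixed-length test dilutes per-KSA coverage:

```python
# Hypothetical illustration of sample-plan dilution; all numbers are assumptions.
test_length = 50        # items on each written examination
ksa_pool_before = 200   # KSAs eligible for sampling before the addition
ksa_pool_after = 220    # after adding 20 lesser-importance KSAs

# Under a simple uniform sample plan, the chance that any given KSA appears
# on a particular examination drops when the pool grows.
prob_before = test_length / ksa_pool_before   # 0.250
prob_after = test_length / ksa_pool_after     # ~0.227

# Number of items needed to restore the original per-KSA coverage.
items_to_preserve_coverage = test_length * ksa_pool_after / ksa_pool_before  # 55

print(f"per-KSA sampling probability: {prob_before:.3f} -> {prob_after:.3f}")
print(f"items needed to preserve coverage: {items_to_preserve_coverage:.0f}")
```

Under these assumed numbers, preserving the original per-KSA coverage would require increasing the test from 50 to 55 items; this is the kind of impact evaluation the change management process should capture before concluding that a change may be made without prior NRC approval.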

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee documents the change management process for the examination program and specifies which changes to approved programs can be made without and which require prior NRC approval. This process is consistent with the requirements of either 10 CFR 53.1565 or 53.6065, as applicable, and should also conform to the guidance of this section [Criterion 11.1].

12.0 Static Computer-Based Testing

Static computer-based testing is a type of examination that has no interactive features and allows only multiple-choice responses, on-screen selections, or text entry. It does not include computer-adaptive tests or any form of simulator. Although static computer-based testing may be permissible for examinations, this document does not directly address this testing method. To use such an approach, the applicant or licensee would likely need to develop examination development guidance equivalent to the guidance in this document.

13.0 Additional Guidance for Requalification Programs

Applicants and licensees are required under 10 CFR Part 53 to have requalification programs for the ROs and SROs or GLROs at their facilities. While requalification programs are generally subject to the same guidance specified for initial examination programs in this document, there are additional elements of requalification programs that the reviewer should consider.

For all requalification programs, the program must have a provision to ensure that any requalification examination failures are properly remediated and retested, and that a retake examination is passed prior to allowing the licensed operator to return to licensed duties.

The NRC staff should ensure that, for applicants and licensees with ROs and SROs (i.e., facilities subject to the requirements of 10 CFR 53.760 to 53.795), the applicant or licensee establishes programs containing the following programmatic aspects:

  • Applicants and licensees are required by 10 CFR 53.780(c)(2)(i) and 53.780(c)(2)(ii)(C) to administer a requalification examination to all ROs and SROs during a period that is not to exceed 24 months between examinations. This time frame is limited by the duration of the requalification training cycle required under 10 CFR 53.780(c)(1)(i). Thus, the program must include sufficient provisions to facilitate this required examination process on a recurring basis.
  • The program must provide for making prepared requalification examinations available for NRC review, as required by 10 CFR 53.780(c)(2)(ii)(A).
  • The program must ensure that the NRC is notified of upcoming requalification examinations and is afforded the opportunity to be present during administration, as required by 10 CFR 53.780(c)(2)(ii)(B).
  • The program must provide for ensuring that a summary of examination results for each RO and SRO is promptly forwarded to the NRC following the completion of the requalification examination, as required by 10 CFR 53.780(c)(2)(ii)(D).

The NRC staff should ensure that, for applicants and licensees with GLROs (i.e., facilities subject to the requirements of 10 CFR 53.800 to 53.830), the applicant or licensee establishes programs containing the following requirements of 10 CFR 53.815(b)(3)(v):

  • A requalification examination must be administered to each GLRO within a recurring periodicity that is defined within the program.
  • If the requalification examination periodicity is less than or equal to 24 months, the NRC staff does not need to perform further review. If the applicant or licensee proposes a periodicity exceeding 24 months, the NRC staff should ensure that the applicant or licensee provides bases for that periodicity, using the following insights:

o the SAT process

o operator performance trends

o industry operating experience

o changes in the aggregate experience level and turnover of GLRO staffing

o significant changes to the design or operation of the facility

  • The program must ensure that the NRC is notified of upcoming requalification examinations and is afforded the opportunity to be present during administration.

Acceptance Criteria:

The NRC staff should ensure that:

  • the requalification examination program has a provision to ensure that any requalification examination failures are properly remediated and retested, and that a retake examination is passed prior to allowing the licensed operator to return to licensed duties [Criterion 13.1].
  • the requalification examination program designates an appropriate periodicity for the administration of requalification examinations. For ROs and SROs, this periodicity may not exceed 24 months. For GLROs, this periodicity may exceed 24 months with adequate provisions for informing the periodicity [Criterion 13.2].
  • the requalification examination program contains sufficient provisions to satisfy the additional requirements of 10 CFR 53.780(c)(2) or 53.815(b)(3)(v), as appropriate, that are discussed in this section [Criterion 13.3].

14.0 Proficiency Programs for Specifically Licensed Operators and Senior Operators

Specifically licensed operators and senior operators are required under 10 CFR Part 53 to be subject to a Commission-approved proficiency program. The applicant or licensee must include a program that is adequate to meet the requirements of 10 CFR 53.780(g). A proficiency program must provide for the following:

  • Ensure that ROs and SROs will actively perform the functions of an RO or SRO, as appropriate.
  • Maintain proficiency regarding shift functions.
  • Maintain familiarity with plant status.
  • Include those steps that will be taken to re-establish proficiency when it cannot be maintained.

Proficiency programs should clearly define the specific duties, locations, and amounts of time associated with each of the items above. The specifics of each should be informed by considering the following:

  • the facility-specific concept of operation
  • the facility-specific staffing plan
  • human performance considerations
  • the full scope of duties assigned to the individual while on-shift (e.g., collateral functions for maintenance, radiation protection, etc.)

Changes that reduce the scope or requirements of an approved proficiency program must be approved by the NRC staff prior to implementation per 10 CFR 53.1565 or 53.6065, as applicable.

While GLROs are required by 10 CFR 53.805(a)(5) and 53.815(g) to be subject to a proficiency program that is administered by the facility licensee, GLRO proficiency programs do not require prior approval by the NRC and, therefore, are not within the scope of this ISG.

Acceptance Criteria:

The NRC staff should ensure that:

  • The applicant or licensee provides a proficiency program for ROs and SROs that addresses the requirements of 10 CFR 53.780(g) and contains appropriate justification for the adequacy of its provisions. Proficiency programs that are equivalent to, or more conservative than, the requirements of 10 CFR 55.53(e) and (f) may be considered to be acceptable without additional justification [Criterion 14.1].

  • The applicant's or licensee's programmatic provisions for change control specifically disallow making changes to the approved proficiency program without obtaining prior NRC approval as required by 10 CFR Part 53.1565 or 53.6065, as applicable [Criterion 14.2].

15.0 Waivers for Generally Licensed Reactor Operators

Per 10 CFR Part 53.815(f), the requirements for a test or examination for generally licensed reactor operators may be waived in accordance with the facility's approved examination program. Therefore, the applicant or licensee should include appropriate criteria that may be used to waive the requirements for a test or examination. If the applicant or licensee is using requirements similar to those listed in 10 CFR 55.47, no further NRC staff review is required.

Otherwise, the applicant or licensee should also provide a basis for these criteria describing how the criteria will ensure that the individuals are able to safely and competently operate the facility as required by 10 CFR 53.815(b).

Acceptance Criterion:

The NRC staff should ensure that the applicant or licensee includes criteria for waiving the requirement for a test or examination for generally licensed operators, including a basis for those criteria as needed [Criterion 15.1].

16.0 Acceptance Criteria Summary

In order to approve an application for an OL or COL under 10 CFR Part 53, the NRC staff must determine, among other things, that the requirements in Part 53 subpart F or subpart P, as applicable, are met for the design and technology under review. This determination should be based on whether the information provided in the application is sufficient to conclude that the following acceptance criteria discussed in this guidance are met:

Section 1.0

1.1 If a criticality analysis is performed, the applicant or licensee compiles the data in a criticality matrix displaying the list of tasks and respective ratings for each factor.

The coverage of the KSA list should adequately address the following:

1.2 safety-significant SSCs that the operator interacts with and the associated characteristics of those SSCs.

1.3 safety-significant SSCs that provide plant safety DID and the associated characteristics of those SSCs.

1.4 inherent, passive, or automatic safety-significant SSCs, their characteristics, and the associated operator actions, procedures, and SSCs that provide DID to the inherent, passive, or automatic safety-significant SSCs.

1.5 ...site-specific normal operating procedures that pertain to operators.

1.6 abnormal and emergency plant events, associated procedures, and implementation of the site emergency plan.

1.7 theoretical knowledge items of reactor theory, thermodynamics, and chemical theory, as applicable, associated with the technologies, materials, and processes of the reactor design.

1.8 knowledge of administrative topics, including notifications, radiological controls, maintenance controls, technical specifications, equipment control, conduct of operations, and the emergency plan.

1.9 The KSA list details the development process, the determination for KSA testability (e.g., for non-testable KSAs, lack of criticality or untestable), and KSA list maintenance, and that the KSAs are organized in a logical format to facilitate use and review.

Section 2.0

2.1 The applicant or licensee provides documentation of assessment measure(s) to be used for each task and how the measures are aligned conceptually with the KSAs being measured. Sufficient detail needs to be provided to allow the NRC staff to determine adequacy of coverage.

2.2 The applicant or licensee provides an overall measurement strategy that may include a variety of knowledge and performance examinations.

2.3 An instructional designer, industrial/organizational psychologist, or a human factors specialist has determined the relevance of the measurement approach for the particular KSAs.

2.4 The applicant or licensee provides documentation on KSAs not covered by any assessment measures.

2.5 The applicant or licensee provides documentation of the procedures used to determine KSA inclusion and exclusion on the examination.

For each test/examination method, the applicant or licensee provides content specifications that provide a clear description of the jobs and tasks covered in the test methods and the specific content, including the following:

2.6 the sampling methods/plan for determining inclusion on a specific examination instance.

2.7 a decomposition of the tasks into the knowledge, skills, cognitive processes, abilities, attitudes, or behaviors that are included in the test method, and which are excluded from the test method.

2.8 documentation on test length.

2.9 specification of the type of personnel for which the test is intended.

2.10 specification of intended uses of the test method.

2.11 format specifications that provide the testing strategy/format of items and describe the rationale for the formatting that include considerations of validity of the format and not simply ease of use.

2.12 evidence/supporting documentation that the test strategy aligns appropriately depending on the distinction of knowledge vs. skills.

2.13 The applicant or licensee provides documentation with all necessary test specifications and the bases for these test specifications.

2.14 The applicant or licensee provides a description of the test development process, to include generating items, reviewing items using statistical analysis and/or qualitative review of items, evidence of overall test validity, and the conditions, directions, procedures, and materials for taking the test.

2.15 The applicant's or licensee's test development process describes how it will be ensured that, at a minimum, topics from the following list of general categories of knowledge and abilities will be sampled during the course of examinations. Elimination of any of these categories is adequately justified:

  • reactor theory and thermodynamic principles
  • plant systems and components
  • reactivity management and manipulations
  • radiation control and safety
  • emergency, abnormal, and normal operations
  • administrative requirements and conditions of the facility license
  • technical specifications.

The applicant's or licensee's Test Plan documentation includes the following:

2.16 what important work behaviors, activities, and worker KSAs are included in the domain, how the content of the work domain is linked to the testing procedure, and why certain parts of the domain were or were not included in the testing method.

2.17 evidence that demonstrates that the test adequately samples and is linked to the important work behaviors, activities, and/or worker KSAs, including the criteria and the rationale for the inclusion/exclusion criteria.

2.18 a rationale for a systematic, quasi-random content sampling procedure based on the professional judgment of the testing professional and an analysis of work that details important work behaviors and activities, important components of the work context, and KSAs needed to perform the work; if sampling is not performed due to small content domain size, an explanation of how examination predictability is addressed should be included.

Section 3.0

3.1 The applicant or licensee provides that the validity plan will be implemented prior to the first administration of a test developed under the examination program. As part of the examination program, applicants and licensees should monitor and collect data following the validity plan over training cycles. If a significant deviation is identified from the estimated risk associated with the type/degree of harm that may result, or from the estimated probability that an incorrect inference can be corrected before that harm occurs, applicants and licensees document the deviation, refine the validity plan, and re-assess the examination.

3.2 The applicant or licensee has a program that ensures that the tests will work as intended and that the tests will demonstrate one or more kinds of validity.

Section 4.0

4.1 The applicant or licensee clearly defines appropriate cut-off scores and their bases. If a cut-off score of 80% is used, then no additional basis needs to be provided as 80% has been determined to be an acceptable cut-off score for power reactors in NUREG-1021. If a cut-off score of < 80% is used, then the basis for that cut-off score is provided.

Section 5.0

5.1 The applicant or licensee provides information on the reliability of tests and supports that information with a reasonable description of the methodology used to determine its adequacy.

Section 6.0

6.1 The applicant or licensee provides a comprehensive test manual that covers each type of test. The manual includes information on the purpose of the test, how the test was developed, and how to administer and score the test.

Test manuals include the following:

6.2 a clear statement of the purposes and applications for which the test is intended.

6.3 clearly defined construct(s) the test intends to measure, the group(s) for which the test is intended, and the specific application of the test.

6.4 justification of the relevance of the test content for the construct(s) to be measured.

6.5 a summary of research findings for the test.

6.6 dates of any revisions to the manual, and any additional statistical tests for the revised version(s).

6.7 information on revisions of the test instruments associated with technology changes.

6.8 any limitations of the test.

6.9 a description of the expertise required to administer and interpret the test.

6.10 the time needed to administer the test.

6.11 the time allowed for testing.

6.12 a description of all procedures to be used for administration (and allowed variations).

6.13 a description of all materials for test takers to use.

6.14 a description of all scoring and reporting procedures, and description of necessary hardware and software.

6.15 clear and complete instructions to administer the test and interpret the results properly.

6.16 all necessary forms and any necessary aids to interpreting the test results.

6.17 case descriptions indicating how test scores can be interpreted.

6.18 a clearly defined domain and discussion of the procedures used for sampling from that domain.

6.19 discussion of any potential threats to the valid use of the test scores.

For a specific test instance, documentation will include the following:

6.20 name of the test, authors of the test (by name and position), publisher of the test and the date published, and existence of alternate forms.

6.21 dates of any revisions to the test and any additional statistical tests for the revised version(s).

6.22 information on revisions of the test associated with technology changes.

6.23 a report on the evidence of the validity of the specified use of the test.

6.24 The documentation avoids referring to correlations between items and total test score as evidence of validity.

6.25 a description of criterion variables, adequacy of criterion variables, and aspects of criterion performance that are not adequately reflected in the criterion measure(s) when criterion validity is reported.

6.26 statements expressing relationships that are presented in quantitative terms so that the reader can tell how much confidence to attach to them.

For computer-based testing, the test manual also includes the following:

6.27 information on any necessary computer software installation.

6.28 information on operation of the software.

6.29 provision for technical support.

Section 8.0

8.1 The applicant or licensee documents how it will meet examination security requirements. If the facility will have specifically licensed operators, it will also have procedures for preparing and submitting applications, including any waivers and excusals, as well as procedures for ensuring medical requirements are met. Facilities with specifically licensed operators will also have procedures for maintaining, updating, and renewing operator licenses.

8.2 The applicant or licensee documents how examinees will be briefed on the examination process prior to examination administration, similar to the guidelines provided in Section 1.2 of NUREG-1021 but tailored to the specific examination process developed for the facility.

The applicant or licensee documents how it will ensure that the facility's examination program:

8.3 tests a representative sampling of the KSAs to safely perform licensed duties.

8.4 is approved by the NRC prior to use.

8.5 will be maintained.

8.6 contains provisions for how the prepared examinations for RO and SRO applicants will be made available to the NRC for review and approval prior to administration, if applicable.

8.7 contains provisions for giving sufficient advance notification to the NRC to allow for an NRC representative to be present during examination administration and, as applicable, to also administer the examination.

Section 9.0

9.1 If a simulation facility is used in examinations, the applicant or licensee documents how the simulation facility provides a level of fidelity and cognitive requirements sufficient to assess the intended knowledge and skills of operators.

9.2 If a simulation facility is used in examinations, the applicant or licensee documents policies and procedures to ensure simulator fidelity.

If a simulation facility is used in examinations, the applicant or licensee provides content specifications that provide a clear description of the jobs and tasks covered in the simulation and the specific content, as follows:

9.3 tasks are decomposed into both those knowledge, skills, cognitive processes, attitudes, or behaviors that are included in the examination and those which are excluded from the examination.

9.4 the type of personnel for which the examination is intended is specified.

9.5 the intended uses of the examination are specified.

9.6 evidence and/or supporting documentation is provided showing that the simulation strategy is appropriate based on the distinction made between knowledge and skills.

9.7 If a simulation facility is used in examinations, the applicant or licensee provides policies and procedures to link scenario events to the operator tasks, and to document that information for scenario-based examinations.

9.8 The applicant or licensee provides policies and procedures to develop event-based metrics for scenario-based examinations.

Section 10.0

10.1 The applicant or licensee documents how to administer the operating test portion of the examination while maintaining examination security and examiner codes of conduct to ensure that test integrity is maintained.

Section 11.0

11.1 The applicant or licensee documents the change management process for the examination program and specifies which changes to approved programs can be made without and which require prior NRC approval.

This process is consistent with the requirements of either 10 CFR 53.1565 or 53.6065, as applicable, and should also conform to the guidance of Section 11.0.

Section 13.0

13.1 The requalification examination program has a provision to ensure that any requalification examination failures are properly remediated and retested, and that a retake examination is passed prior to allowing the licensed operator to return to licensed duties.

13.2 The requalification examination program designates an appropriate periodicity for the administration of requalification examinations. For ROs and SROs, this periodicity may not exceed 24 months. For GLROs, this periodicity may exceed 24 months with adequate provisions for informing the periodicity.

13.3 The requalification examination program contains sufficient provisions to satisfy the additional requirements of 10 CFR 53.780(c)(2) or 53.815(b)(3)(v), as appropriate, that are discussed in Section 13.0.

Section 14.0

14.1 The applicant or licensee provides a proficiency program for ROs and SROs that addresses the requirements of 10 CFR 53.780(g) and contains appropriate justification for the adequacy of its provisions.

Proficiency programs that are equivalent to, or more conservative than, the requirements of 10 CFR 55.53(e) and (f) may be considered to be acceptable without additional justification.

14.2 The applicant's or licensee's programmatic provisions for change control specifically disallow making changes to the approved proficiency program without obtaining prior NRC approval.

Section 15.0

15.1 The applicant or licensee includes criteria for waiving the requirement for a test or examination for generally licensed operators, including a basis for those criteria as needed.

With the satisfaction of the above acceptance criteria, the NRC staff can conclude that the examination program complies with the requirements of 10 CFR Part 53 subpart F or subpart P, as applicable. Thus, there is reasonable assurance that the examination program meets the standards for examination in a manner that is commensurate with the design-specific safety role of the associated ROs and SROs or GLROs.

IMPLEMENTATION

The NRC staff will use the information in this ISG in reviewing operator licensing examination programs submitted as part of applications made for OLs and COLs under 10 CFR Part 53.

Additionally, this guidance may be used to inform staff consideration of requests for exemptions from Parts 50, 52, and 55 requirements for Part 50 and Part 52 applicants and licensees, or for proposals for using alternative methods to those described in NUREG-1021 or NUREG-1478 by applicants and licensees under Part 50 or 52. The NRC intends to incorporate feedback obtained during the public comment period for the 10 CFR Part 53 proposed rule and associated guidance into a final version of this ISG, which would be issued along with the issuance of the final rule for 10 CFR Part 53.

BACKFITTING AND ISSUE FINALITY DISCUSSION

DRO-ISG-2023-01, if finalized, would not constitute backfitting as defined under proposed 10 CFR 53.1590 or 53.6090, Backfitting, and as described in MD 8.4; constitute forward fitting as that term is defined and described in MD 8.4; or affect the issue finality of any approval issued under proposed 10 CFR Part 53, Risk-Informed, Technology-Inclusive Regulatory Framework for Commercial Nuclear Plants. The guidance would not apply to any current licensees or applicants or existing or requested approvals under proposed 10 CFR Part 53, and therefore its issuance cannot be a backfit or forward fit or affect issue finality. Further, applicants and licensees would not be required to comply with the positions set forth in this ISG.

CONGRESSIONAL REVIEW ACT

Discussion to be provided in the final ISG.

FINAL RESOLUTION

The NRC staff will transition the information and guidance in this ISG into the RG or NUREG series, as appropriate. Following the transition of all pertinent information and guidance in this document into the RG or NUREG series, or other appropriate guidance, this ISG will be closed.

ACRONYMS

CFR    Code of Federal Regulations
COL    combined license
DID    defense-in-depth
GLRO   generally licensed reactor operator
ISG    interim staff guidance
K/A    knowledge and ability
KSA    knowledge, skills, and abilities
LLWR   large light-water reactor
NEIMA  Nuclear Energy Innovation and Modernization Act
NRC    U.S. Nuclear Regulatory Commission
OL     operating license
RO     reactor operator
SAT    systems approach to training
SME    subject matter expert
SRO    senior reactor operator
SSCs   structures, systems, and components


Appendix A Currently Approved Examination Methods

The following methods have been determined to be adequate testing methods for use on operator licensing examinations as documented in NUREG-1021, Operator Licensing Examination Standards for Power Reactors, and the use of these methods in an applicant or licensee examination program will not require additional review by the NRC.

Written Examinations

An applicant or licensee examination program that involves written examinations developed, administered, and graded in accordance with the guidance in examination standard (ES)-4, Initial Written Examinations, of NUREG-1021 and meeting the guidance of Appendices A, Overview of Generic Examination Concepts, and B, Examples of Written Examination Questions, of that NUREG does not require further review with respect to written examinations prior to NRC approval. Because the sample plans of NUREG-1021 do not necessarily accommodate non-large light water power reactor designs, the applicant or licensee should provide the basis for the sampling plan to be used and explain how it meets the guidance of Appendix A of NUREG-1021 to ensure examination validity. The applicant or licensee should also provide a basis for the use of the written examination test format to test the specific knowledge and ability items.

Any difference from the guidance in NUREG-1021 should be documented and a basis for that difference provided. Examples of differences include, but are not limited to, the number of questions on the written examination, the passing score, the sampling method (random, systematic, or another method), and differences in administration (such as using multiple locations to test simultaneously or using computers to provide proctoring services).
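For instance, one way an applicant or licensee might document a sampling method is a stratified, quasi-random draw from the facility K/A catalog with a recorded seed so that the draw is reproducible and auditable. The sketch below is illustrative only; the category names, item counts, items per category, and seed are hypothetical assumptions rather than values from NUREG-1021 or this guidance:

```python
# Illustrative sketch of a stratified, quasi-random sampling plan for a
# written examination: draw a fixed number of items from each K/A category.
# Category names, item counts, and the seed are hypothetical.
import random

ka_catalog = {
    "plant systems and components": [f"SYS-{i:03d}" for i in range(1, 61)],
    "reactivity management": [f"RXM-{i:03d}" for i in range(1, 21)],
    "radiation control and safety": [f"RAD-{i:03d}" for i in range(1, 16)],
    "administrative requirements": [f"ADM-{i:03d}" for i in range(1, 26)],
}
items_per_category = {
    "plant systems and components": 10,
    "reactivity management": 5,
    "radiation control and safety": 3,
    "administrative requirements": 4,
}

rng = random.Random(2023)  # recorded seed so the draw is reproducible and auditable
sample_plan = {category: sorted(rng.sample(pool, items_per_category[category]))
               for category, pool in ka_catalog.items()}

for category, selected in sample_plan.items():
    print(f"{category}: {selected}")
```

A documented plan of this kind would still need the basis called for above, that is, an explanation of how the category weights and item counts preserve examination validity for the specific design.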

Operating Test - Job Performance Measures

An applicant or licensee examination program that involves operating tests using job performance measures developed in accordance with the methodology specified in ES-3.1 and ES-3.2 of NUREG-1021 does not require further review with respect to the use of the job performance measure as a test instrument prior to NRC approval. The applicant or licensee should provide a basis for the use of the job performance measure test format to test the specific knowledge and ability items as well as establish a basis for the number of job performance measures used on the test and what constitutes passing performance. The format of the job performance measure that follows the methodology of NUREG-1021 does not require further review prior to NRC approval.

Job performance measures developed per the NUREG-1021 methodology should also be graded per the methodology in ES-3.6 of NUREG-1021. Any differences should be documented and explained.


Operating Test - Dynamic Simulator Scenarios

An applicant or licensee examination program that involves operating tests using dynamic simulator scenarios developed in accordance with the methodology specified in ES-3.1, ES-3.3, and ES-3.4 of NUREG-1021 does not require further review with respect to the use of the dynamic simulator scenario as a test instrument prior to NRC approval. The applicant or licensee should provide a basis for the use of the dynamic simulator scenario format to test the specific knowledge and ability items.

Different considerations should be applied to commercial nuclear plants with designs that significantly differ from existing large light-water reactor plants in terms of the degree of automation, human-system interaction, or interaction between people. Modifying earlier scenarios to avoid predictability, as called for under the guidance of NUREG-1021, might not be necessary to the same extent, depending on the scope of the knowledge, skills, and abilities (KSA) list. For example, a commercial nuclear plant design with inherent safety systems and automation might have only a few tasks important to safe plant operations that are screened into the operator licensing examination. In this instance, it might be appropriate to include those tasks in every task-based examination to ensure that the operator can complete those tasks satisfactorily. The operating test will need to be validated to determine whether it assesses the target KSAs using the selected operator tasks necessary for maintaining efficient and safe control of the plant under normal and abnormal situations. The applicant or licensee should describe the basis for any portion of the dynamic simulator scenario development process that differs from that described in NUREG-1021.

An applicant or licensee should grade dynamic simulator scenarios developed per the guidance of NUREG-1021 using the methodology in ES-3.6 of NUREG-1021. Any differences in grading methodology should be documented and explained.


APPENDIX B Resolution of Public Comments

[Note: as this ISG is intended to accompany the 10 CFR Part 53 rulemaking package, stakeholder comments, as well as the resolution of those comments, will be included in the comment resolution document associated with the Part 53 rule]


APPENDIX C References

Part 1 - Regulatory References

ANSI/ANS 3.1-2014, Selection, Qualification, and Training of Personnel for Nuclear Power Plants.

Nuclear Energy Innovation and Modernization Act (NEIMA; Public Law 115-439).

Nuclear Energy Institute (NEI), Template for an Industry Training Program Description, NEI 06-13A, Revision 2.

U.S. Code of Federal Regulations, Domestic Licensing of Production and Utilization Facilities, Part 50, Chapter 1, Title 10, Energy.

U.S. Code of Federal Regulations, Licenses, certifications, and approvals for nuclear power plants, Part 52, Chapter 1, Title 10, Energy.

U.S. Code of Federal Regulations, Operators licenses, Part 55, Chapter 1, Title 10, Energy.

U.S. Nuclear Regulatory Commission, Risk-Informed, Technology-Inclusive Regulatory Framework for Commercial Nuclear Plants, Draft Rule, May 2022 (ML22125A000).

U.S. Nuclear Regulatory Commission, Human Factors Engineering Program Review Model, NUREG-0711, Revision 3.

U.S. Nuclear Regulatory Commission, Operator Licensing Examination Standards for Power Reactors, NUREG-1021, Revision 12.

U.S. Nuclear Regulatory Commission, Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Pressurized Water Reactors, NUREG-1122, Revision 3.

U.S. Nuclear Regulatory Commission, Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Boiling Water Reactors, NUREG-1123, Revision 3.

U.S. Nuclear Regulatory Commission, Qualification and Training of Personnel for Nuclear Power Plants, Regulatory Guide 1.8, Revision 4.

U.S. Nuclear Regulatory Commission, Facility Training Programs, DANU [XX]-ISG-[YYYY-##].


Part 2 - Psychometric References

Adams, N.E. (2015). Bloom's taxonomy of cognitive learning objectives. Journal of the Medical Library Association, 103(3), 152-153. http://dx.doi.org/10.3163/1536-5050.103.3.010

Aggarwal, R., Mytton, O., Derbrew, M., Hananel, D., Heydenburg, M., & Reznick, R. (2010). Training and simulation for patient safety. Quality and Safety in Health Care, 19(2), 33-43.

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (Eds.). (2014). Standards for educational and psychological testing. American Educational Research Association. https://www.testingstandards.net/uploads/7/6/6/4/76643089/standards_2014edition.pdf

Carey, J. M., & Rossler, K. (2021). The how when why of high fidelity simulation. National Library of Medicine, StatPearls Publishing. https://www.ncbi.nlm.nih.gov/books/NBK559313/

Cascio, W.F., Alexander, R. A., and Barrett, G.V. (1988). Setting cut-off scores: Legal, psychometric and professional issues and guidelines. Personnel Psychology, 41(1).

Chiang, I.A., Jhangiani, R.S., & Price, P.C. (2019). Understanding psychological measurement. Research Methods in Psychology (2nd ed.). https://opentextbc.ca/researchmethods/chapter/understanding-psychological-measurement/

Echtelt, R.V. (2020). What's the difference between knowledge and skill? AG5. https://www.ag5.com/whats-the-difference-between-knowledge-and-skills/

Evers, A., Sijtsma, K., Lucassen, W., & Meijer, R.R. (2010). The Dutch review process for evaluating the quality of psychological tests: History, procedure, and results. International Journal of Testing, 10(4), 295-317. DOI: 10.1080/15305058.2010.518325

Finan, E., Bismilla, Z., Whyte, H. E., LeBlanc, V., & McNamara, P. J. (2012). High-fidelity simulator technology may not be superior to traditional low-fidelity equipment for neonatal resuscitation training. Journal of Perinatology, 32, 287-292. https://www.nature.com/articles/jp201196

Goldstein, I. L., Zedeck, S., & Schneider, B. (1993). An exploration of the job-analysis-content validity process. In N. Schmitt & W. Borman (Eds.), Personnel selection in organizations (pp. 3-34). San Francisco, CA: Jossey-Bass.

Kardong-Edgren, S., Anderson, M., & Michaels, J. (2009). Does simulation fidelity improve student test scores? Clinical Simulation in Nursing, 3(1), 21-24. https://doi.org/10.1016/j.ecns.2009.05.035

Lievens, F., & Patterson, F. (2011). The validity and incremental validity of knowledge tests, low-fidelity simulations, and high-fidelity simulations for predicting job performance in advanced-level high-stakes selection. Journal of Applied Psychology, 96(5), 927-940. DOI: 10.1037/a0023496

Martin, S.L. and Raju, N.S. (1992). Determining cut-off scores that optimize utility: A recognition of recruiting costs. Journal of Applied Psychology, 77(1), 15-23.

Massoth, C., Roder, H., Ohlenburg, H., Hessler, M., Zarbock, A., Popping, D. M., & Wenk, M. (2019). High-fidelity is not superior to low-fidelity simulation but leads to overconfidence in medical students. BMC Medical Education, 19.


Motowidlo, S. J., Dunnette, M. D., & Carter, G. W. (1990). An alternative selection procedure: The low-fidelity simulation. Journal of Applied Psychology, 75(6), 640-647.

Murphy, K. R. and Davidshofer, C. O. (2005). Psychological testing: Principles and Applications (6th ed.). Pearson/Prentice Hall. ISBN 0-13-189172-3.

Norman, G., Dore, K., & Grierson, L. (2012). The minimal relationship between simulation fidelity and transfer of learning. Medical Education, 46(7), 636-647. https://doi.org/10.1111/j.1365-2923.2012.04243.x

Society for Industrial and Organizational Psychology. (2018). Principles for the validation and use of personnel selection procedures. American Psychological Association. https://www.apa.org/ed/accreditation/about/policies/personnel-selection-procedures.pdf

Salas, E., & Burke, C. S. (2002). Simulation for training is effective when... Quality and Safety in Health Care, 11, 119-120.

Salas, E., Paige, J. T., & Rosen, M. A. (2013). Creating new realities in healthcare: the status of simulation-based training as a patient safety improvement strategy. BMJ Quality and Safety, 22, 449-452. doi:10.1136/bmjqs-2013-002112

Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124, 262-274.

Thorndike, R.M. (2005). Measurement and evaluation in psychology and education (7th ed.). Pearson. ISBN 9780130199980