| ML23017A130 | |
|---|---|
| Issue date: | 04/20/2023 |
| From: | Office of Nuclear Reactor Regulation |
| References: | DRO-ISG-2023-04 |
DRO-ISG-2023-04
Advanced Reactor Content of Application
Facility Training Programs
Draft Interim Staff Guidance

This draft staff white paper has been prepared and is being released to support ongoing public discussions. This draft white paper uses an interim staff guidance (ISG) format because the staff is considering using this format to provide staff guidance in the near future to support the review of advanced reactor applications.
This paper has not been subject to NRC management and legal reviews and approvals, and its contents are subject to change and should not be interpreted as official agency positions.
DRAFT INTERIM STAFF GUIDANCE
ADVANCED REACTOR CONTENT OF APPLICATION
Facility Training Programs
DRO-ISG-2023-04

I. PURPOSE

The U.S. Nuclear Regulatory Commission (NRC or Commission) staff is providing this interim staff guidance (ISG) for two reasons. First, this ISG provides guidance on the contents of applications to an applicant for a construction permit (CP) or operating license (OL) under Title 10 of the Code of Federal Regulations (10 CFR) Part 50, Domestic Licensing of Production and Utilization Facilities (Ref. 1), or for a combined license (COL) under 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants (Ref. 2) (e.g., for a non-light-water reactor (non-LWR), stationary microreactor, or small modular light-water reactor (LWR)). The guidance in this ISG may also be applicable to an applicant for a design certification (DC) or standard design approval (SDA) who has provided topical reports for NRC staff review with these applications.
Additionally, the NRC is currently developing a set of performance-based, technology-inclusive regulations for licensing nuclear power reactors, to be designated as 10 CFR Part 53, Licensing and Regulation of Advanced Nuclear Reactors (RIN 3150-AK31). The guidance in this ISG can also be used to support the review of applications submitted under 10 CFR Part 53. The NRC anticipates that the agency will update its guidance for the contents of applications and the NRC staff review of advanced nuclear reactor license and permit applications depending on the content of any regulations that might result from the Part 53 effort.
The guidance found in this ISG supports the NRC staff review of the portion of an application associated with the training program for plant personnel, including licensed operator requalification programs and general licensed operator continuing training programs. This guidance may also be used to support additional training program inspection guidance as currently specified in NUREG-1220. This ISG will serve as the advanced reactor application guidance for facility training programs. This ISG provides both applicant and staff review guidance.
II. BACKGROUND

This ISG is based on the advanced reactor content of application project (ARCAP), whose purpose is to develop technology-inclusive, risk-informed, and performance-based application guidance. The ARCAP is broader than, and encompasses, the industry-led technology-inclusive content of application project (TICAP). The guidance in this ISG supplements the guidance found in Division of Advanced Reactors and Non-power Production and Utilization Facilities (DANU)-ISG-2022-01, Review of Risk-Informed, Technology-Inclusive Advanced Reactor Applications - Roadmap, issued in [September] 2022 (Ref. 9), which provides a roadmap for developing all portions of an application.
Because the NRC is still developing 10 CFR Part 53, the guidance in this document is subject to change based on the outcome of this rulemaking. Following approval of the 10 CFR Part 53 final rule, this ISG will be supplemented, as necessary, to reflect any differences between the current requirements in 10 CFR Parts 50 and 52 and the new requirements in Part 53. The Part 53 rulemaking would revise the NRC's regulations by adding a risk-informed, technology-inclusive regulatory framework for advanced nuclear reactors, in response to the related requirements of the Nuclear Energy Innovation and Modernization Act (NEIMA; Public Law 115-439), as amended by the Energy Act of 2020. Key documents related to the Part 53 rulemaking, including preliminary proposed rule language and stakeholder comments, can be found at Regulations.gov under Docket ID NRC-2019-0062.
III. RATIONALE

The guidance found in this ISG is supplemental to the guidance found in NUREG-1220, Training Review Criteria and Procedures, Revision 1 (Agencywide Documents Access and Management System (ADAMS) Accession No. ML102571869). Because NUREG-1220 was intended to be used primarily for conducting for-cause inspections of training programs at operating reactors (e.g., following an event at a power plant when training program weaknesses contributed to the event), it was necessary for the NRC staff to prepare additional guidance for reviewing descriptions of training programs and/or training procedures submitted for Commission approval. This ISG also includes guidance for performing periodic NRC inspections of the implementation of systems approach to training (SAT) programs at facilities following initial approval.
The guidance in this ISG is consistent with the practices and methods that are in use by operating power reactor licensees for training all categories of plant personnel listed in 10 CFR 50.120 as well as licensed operators. Furthermore, the guidance in this ISG is consistent with the practices and methods used by other industries that implement the SAT process to produce qualified personnel, including the Department of Defense (Ref. 7) and the Department of Energy (Ref. 8). The guidance in this ISG provides one acceptable way for an applicant or facility licensee to implement a SAT process and demonstrate compliance with the relevant requirements, and the NRC staff will review training programs using the guidance in this ISG to evaluate whether the program uses a systems approach to training. If an applicant or facility licensee omits or deviates from the guidance in this ISG, a justification should be provided (e.g., an explanation for why the activity or method is not needed to implement a systems approach to training process).
IV. APPLICABILITY

This ISG is applicable to applicants for a CP or OL under 10 CFR Part 50 or for a COL, DC, or SDA under 10 CFR Part 52. When 10 CFR Part 53 is available, this ISG will also be applicable to all holders of and applicants for a power reactor CP and OL under 10 CFR Part 53.
V. GUIDANCE

A. Application of the Systems Approach to Training Process

Depending on the specific regulatory basis that is applicable to an advanced reactor facility applicant or licensee, training programs that are based on the SAT process may be required by one or more of the following regulations:
50.120, Training and qualification of nuclear power plant personnel
52.79, Contents of applications; technical information in final safety analysis report
53.785(a), Initial training program
53.785(c), Requalification program
53.815(a), Initial training program
53.815(c), Continuing training program
53.840, Training and qualification requirements
53.4250, Training Program
53.4282, Training and qualification requirements
55.59(c), Requalification program requirements
The following guidance regarding the application of the SAT process is applicable to all of the required training programs as defined by the regulations above.
The SAT process has five elements, which are defined under 10 CFR 55.4 and proposed 10 CFR 53.725(b). Compliance with requirements for the use of the SAT process requires a facility applicant or licensee to apply the five elements of the SAT process to training programs for plant personnel. An additional description of the SAT process is located in ANSI/ANS-3.1-2014, Selection, Qualification, and Training of Personnel for Nuclear Power Plants section 6.2.1, as endorsed by RG 1.8, Rev. 4, Qualification and Training of Personnel for Nuclear Power Plants.
The following guidance is supplemental to NUREG-1220, Training Review Criteria and Procedures, and NUREG-0800, Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition, Chapter 13, Conduct of Operations, and will be used in the review of training program submittals and inspections.
1. Analysis Phase

The goal of the SAT process is to ensure that plant personnel are adequately trained and qualified to perform their jobs. With this focus, SAT-based training programs are established and maintained by determining training needs through a training needs analysis, developing a valid task list through a job analysis, and conducting task analysis.
1.1. Conducting Training Needs Analysis

Entry into the analysis phase begins with determining the need for training by conducting a training needs analysis. A training need is any initiator that has the potential to impact the training process.
1.1.1. Needs Analysis is a Process that Includes Training and Line Personnel.
Training needs analysis is a process-based, systematic review of a training request to determine if training is necessary.
Training needs analyses are conducted by qualified instructors and utilize line subject matter experts and line management who are knowledgeable of the job requirements and standards of performance.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include clear guidance meeting the guidance of section 1.1.1 of this ISG.
Training Program Inspection:
The staff should verify that the licensee's procedure for documenting the systematic performance of the needs analysis process is the same as or conforms to the one approved initially and that any changes made since the last review, such as the deletion of sections, are appropriate.
1.1.2. The Needs Analysis Process is Used to Analyze Internal and External Factors.
The training needs analysis process provides guidance to systematically review internal and external changes that could impact the training program. The following items are examples of some of the possible events that can result in entry into the needs analysis process:
o Plant design changes/modifications
o Changes in plant operating or administrative procedures
o Changes in job or task roles or responsibilities
o Regulatory requirement changes
o Plant performance issues
o Trainee/line feedback
o Periodic program review/assessment
o Lessons learned from operating experience

1.1.2.1. Initial Training Programs

For initial training program development, the training needs analysis process will initiate a performance-based, systematic job and task analysis in the analysis phase using sections 1.2 - 1.3.
Additional training needs are initially identified by reviewing existing requirements from the licensee, the state, or the federal government. Required training from a regulatory source (e.g., the Occupational Safety and Health Administration (OSHA)) does not require further documented analysis justifying the training, since the needs analysis has already been done by the regulating source. These requirements are included and addressed in the design and development phases of the program.
1.1.2.2. Existing Training Programs

For existing training programs, needs analysis is conducted on an as-needed basis to maintain the training program. The needs analysis process is entered as a result of the evaluation phase, such as from line feedback following training, or through an event that could impact the training program as identified in section 1.1.2.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.1.2 of this ISG for consistent performance of analyzing internal and external factors. Validate that there is a procedural statement directing the licensee to initiate a training needs analysis for an event that could impact the training program.
See sections 1.2 and 1.3 for job and task analysis results in determining initial training program compliance.
Training Program Inspection:
The staff should perform a review of the training needs analyses conducted by the licensee per the approved training program. The inspector should attempt to collect a sampling of training needs analyses from the possible events in section 1.1.2.2. The sampling of section 1.1.2.2 can be credited through the review of needs analyses in section 1.1.3.
1.1.2.3. Needs Analysis Process Utilizes Job and Task Analysis Process When Applicable.

The training needs analysis process includes reviewing the information requiring analysis against the current training program to determine what, if any, changes are required. The training need is focused on specific tasks or knowledge, skills, and abilities (KSAs), which are then compared to the current training program to determine whether the current program adequately covers the task or specific KSA requested and, if so, at what retraining periodicity.
For example, if line personnel request training on a topic, the training needs analysis should identify the reason for the training request and what specific task, knowledge, skill, or ability the line personnel are asking for. The specificity of the detailed information allows the training staff to review the request details against the approved training program curriculum to determine whether the requested material is currently trained and whether training on the request would be beneficial to the organization.
If the training request has the potential to impact an incumbent's job (i.e., the job scope, equipment, or responsibilities changed) or tasks (i.e., one or more tasks may be impacted because of procedure, standards, or facility changes), then entry into the job analysis and/or task analysis sections is necessary while conducting a training needs analysis.
For example, if a design change modifies a piece of installed equipment whose operation is defined as a task in the operator training program, then the design change must be compared against the approved task list to determine whether the change creates any new tasks, and any tasks impacted by the change must be reexamined through task analysis to determine whether the design change impacted any task characteristic (conditions, standards, elements, or KSAs).
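To illustrate the comparison described above, the following sketch shows one hypothetical way a design change could be screened against an approved task list to flag tasks needing reexamination through task analysis. The data model and names (Task, DesignChange, screen_change) are illustrative assumptions, not part of this ISG or any required implementation.

```python
# Illustrative sketch only: a hypothetical screen of a design change against an
# approved task list during a training needs analysis.
from dataclasses import dataclass, field


@dataclass
class Task:
    task_id: str
    statement: str                                        # e.g., a task statement tied to a procedure
    equipment: set[str] = field(default_factory=set)      # equipment the task touches
    ksas: list[str] = field(default_factory=list)         # linked knowledge/skill/ability statements


@dataclass
class DesignChange:
    change_id: str
    affected_equipment: set[str]


def screen_change(change: DesignChange, approved_task_list: list[Task]) -> list[Task]:
    """Return approved tasks whose equipment overlaps the design change.

    Each returned task would then be re-examined through task analysis to see
    whether its conditions, standards, elements, or KSAs were affected.
    """
    return [t for t in approved_task_list if t.equipment & change.affected_equipment]
```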
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.1.2.3 of this ISG for consistent performance of utilizing the job and task analysis processes when applicable.
See sections 1.2 and 1.3 for job and task analysis results in determining initial training program compliance.
Training Program Inspection:
The staff should perform a review of training needs analyses that show analysis of a change requiring entry into job and/or task analysis. If possible (e.g., if changes were made during the inspection period), a sampling of analyses that impacted both the job and task analysis should be included in the review. Review the initiating change (design change, procedure change, etc.) against the training needs analysis to determine the following per the approved training program:
Did the analysis identify the change against the approved task list and associated KSAs?
For job analysis impact, was the task re-reviewed in its screening for selection for training? Was the approved difficulty, importance, and frequency (DIF) analysis updated as necessary?
For task analysis impact, were the task characteristics identified in 1.3.2 updated appropriately?
Were the resulting recommended actions from the training needs analysis appropriate given the changes identified (recommended for/against training, updating training materials)?
Perform a review of the original job and/or task analysis against the final analysis.
Do the changes to the analysis (task list, KSAs) reflect the decisions of the needs analysis?
1.1.3. Changes to the Task List and Associated KSAs

Job and task analysis for initial training programs provides the foundation of the SAT program. Accordingly, when a training needs analysis results in modification to the approved task list and associated program KSAs, the following apply:
1.1.3.1. Changes to Non-Commission Approved Training Programs

Referring to training programs for plant personnel in 10 CFR 50.120, the licensee is allowed to make changes to its SAT-based training programs' task lists and associated KSAs as necessary to maintain plant performance, provided the changes are analyzed through the needs analysis process. Such changes shall be documented and approved, and are subject to periodic training program review by the NRC staff during routine inspections to verify compliance with the approved SAT procedure.
1.1.3.2. Changes to Commission Approved Operations Training Programs

The licensee is required to obtain prior NRC approval to make changes to the Commission approved training program under the following conditions:
(1) The change results in the deletion of task(s) and/or associated KSAs related to systems or equipment important to safety, required by technical specifications, or NRC regulation.
(2) The change results in the deletion of task(s) and/or associated KSAs that are currently utilized in the Operator Examination process.
For example, if a licensee performs a training needs/job analysis and determines that a task as defined in (1) is not required in the task list and should be deleted, NRC approval is required prior to implementing those changes. However, if a plant design modification is installing new plant equipment, the licensee is permitted to conduct needs/job/task analysis and take action as analyzed to incorporate the new plant equipment into the training program as determined by the results of the analysis.
Note that task deletion is not the same as task deselection from training. In task deselection, a task in the approved training program curriculum that was selected for training (as defined in section 1.2.4) is no longer required to be trained and is therefore deselected from the trained task list. The task still resides in the approved job analysis along with the DIF data and all associated attributes of the task analysis, including KSAs. The task will remain unselected for training until such time that program evaluation (periodic task list review, line feedback, line performance) determines that task training is beneficial for sustained performance. For Commission approved training programs, the task and applicable KSAs are still considered testable per the examination standard regardless of the task being deselected from the licensee's training curriculum.
In task deletion, if the licensee conducts needs/job analysis and determines that a task is no longer required in the training program and should be deleted, then the task, DIF data, and all associated KSAs are removed from the training program and the task will not be further considered for training under the periodic task list review.
For Commission approved training programs, the deletion of a task removes the task and applicable KSAs from the program and subsequently from the operator examination process; therefore, NRC approval is required. Task deletion would occur through program change resulting in modification of the job scope, such as equipment changes eliminating a user interface and rendering the task description obsolete.
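The distinction between deselection and deletion described above can be illustrated with a hypothetical record layout. The sketch below is an illustration only; field and function names are invented and do not represent any required data structure.

```python
# Illustrative sketch only: why task deselection and task deletion are handled
# differently under the SAT process described above.
from dataclasses import dataclass, field


@dataclass
class AnalyzedTask:
    task_id: str
    dif: tuple[float, float, float]                 # difficulty, importance, frequency ratings
    ksas: list[str] = field(default_factory=list)
    selected_for_training: bool = True


def deselect(task: AnalyzedTask) -> None:
    # Deselection: the task, its DIF data, and its KSAs stay in the approved
    # job/task analysis; only the training flag changes. For Commission-approved
    # programs the KSAs remain testable on the operator examination.
    task.selected_for_training = False


def delete(task_list: list[AnalyzedTask], task: AnalyzedTask) -> None:
    # Deletion: the task, DIF data, and KSAs leave the program entirely and the
    # task is no longer considered in the periodic task list review. For
    # Commission-approved programs this requires prior NRC approval.
    task_list.remove(task)
```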
1.1.3.3. Changes to the Objectives and Lesson Plan Material Do Not Always Require Changes to the Task and KSA List

The requirements in section 1.1.3 above do not prevent the licensee from making changes to the approved training program's design and development materials, so long as the required tasks and applicable KSAs are covered.
If a plant change or feedback on a training program is analyzed and requires additional detail or changes to the content of a lesson plan associated with technical specification-related equipment, the change is permitted without prior NRC approval, provided the associated KSAs are still met.
For example, a task selected for training has a knowledge component that states "knowledge of the __ system controls" and is tied to the objective "From memory, describe the controls of the __ system in accordance with (station procedures/documentation)." If a training analysis results in modification of the lesson plan material covering the system controls, or changes to the wording of the objective, then those changes would be permissible under the SAT process provided the requirements of the KSA are still met by the objective and associated content (i.e., the objective and associated content still adequately train the linked knowledge component - knowledge of system controls).
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the requirements of section 1.1.3.1 of this ISG for consistent performance of how to make changes to the task and KSA list for non-Commission approved training programs.
Clear guidance meeting the requirements of section 1.1.3.2 of this ISG for consistent performance of how to make changes to the task and KSA list for Commission approved training programs.
Clear guidance on the delineation between task deletion and deselection as defined in section 1.1.3.2 of this ISG.
Training Program Inspection:
The staff should perform a review of training needs analyses that show analysis of a program change. Review the change against the following criteria per the approved training program:
Were changes made to the approved task list in compliance with the guidance of section 1.1.3 of this ISG regarding obtaining pre-approval for necessary changes?
If changes were made to training program design and development materials for the Commission approved program, did the changes impact the design or developed materials' coverage of the tasks or associated KSAs?
1.1.4. Needs Analysis Process Maintains the Initial and Continuing Training Programs

When changes are determined necessary by the training needs analysis, the needs analysis process should consider impacts to the initial training program's design and developed material as well as to continuing training.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.1.4 of this ISG for consistent performance of maintaining the initial and continuing training programs.
See sections 1.2 and 1.3 for job and task analysis results in determining initial training program compliance.
Training Program Inspection:
The staff should perform a review of training needs analyses that show analysis of a training request. Review the change against the following criteria per the approved training program:
Did the change identify a task or associated KSA that required retraining in the continuing training program?
Did the change impact a task or associated KSA that required updating initial training material?
Was the change made, or is there an approved tracking mechanism in place to ensure it will be updated prior to being taught? Were objectives and developed content modified according to the analysis and appropriate to the changes analyzed?
1.1.5. Needs Analysis Process Includes Analyzing Performance Gaps

When conducting a needs analysis due to substandard job performance, the analysis reviews the cause(s) of the issue to determine if training is the appropriate solution. As specified in evaluation phase section 5.3.2, the analysis should review the conditions surrounding the event to determine if there was a true gap in knowledge or skill vice an external factor that contributed to the performance issue (faulty equipment, inadequate procedures, workforce attitudes, for example).
If training is determined necessary to close the performance gap, as part of the analysis the licensee is expected to identify quantifiable, measurable metrics for monitoring training effectiveness. For example, if training is identified to aid in reduction of equipment out of service time, an effectiveness measure could be a xx percent reduction in equipment out of service time, as measured over the next xx months. This metric will be used to evaluate the effectiveness of the training to close the performance gap in the evaluation phase, section 5.3.2.
The staff should verify that the licensee provides guidance specifying the need for training to close performance gaps when a gap in knowledge or skill exists.
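As an illustration of the kind of quantifiable, measurable effectiveness metric discussed above, the following sketch shows one hypothetical way such a metric could be recorded for later use in the evaluation phase. The structure and example values are invented placeholders, not values or formats specified by this ISG.

```python
# Illustrative sketch only: a hypothetical record of the effectiveness metric a
# needs analysis attaches to training intended to close a performance gap.
from dataclasses import dataclass


@dataclass
class EffectivenessMetric:
    description: str          # what is measured (e.g., equipment out-of-service time)
    target_change_pct: float  # targeted improvement, in percent
    window_months: int        # measurement window for the evaluation phase


# Example record; the percentage and window are placeholders, not ISG values.
metric = EffectivenessMetric(
    description="equipment out-of-service time",
    target_change_pct=10.0,
    window_months=6,
)
```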
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.1.5 of this ISG for consistent performance of analyzing performance gaps.
Performance gaps are not part of the initial program review; therefore, no analyses resulting from performance gaps are reviewed during training program approval.
Training Program Inspection:
The staff should perform a review of training needs analyses that resulted from performance gaps. Review the performance gap and corresponding analysis against the following per the approved training program:
Did the analysis correctly identify that the performance gap was the result of a training deficiency? OR Did the analysis correctly identify that the performance gap was the result of some other gap (process or managerial) and not recommend training?
If training was recommended as a result of the analysis, did the analysis for training include a metric for measuring training effectiveness in the evaluation phase?
Did the corresponding training adequately target training for the correct population for the applicable KSA(s)?
1.1.6. Training Exemptions are Analyzed and Documented

When determining the need for training, the training analysis process provides a method to exempt personnel from training when adequate justification exists for the exemption (note: the term exemption in this context means that the individual does not need to meet one or more training program requirements). Such exemptions are objective-based, with justification provided.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.1.6 of this ISG for consistent performance of analyzing and documenting training exemptions.
The staff should perform a review of any training exemptions filed by the licensee. Did the exemption correctly identify the trainee's objective knowledge and/or experience and perform an accurate comparison against the exempted portion of the curriculum?
Training Program Inspection:
The staff should perform a review of training exemptions filed by the licensee. Did the exemption correctly identify the trainee's objective knowledge and/or experience and perform an accurate comparison against the exempted portion of the curriculum per the approved training program?
1.1.7. Needs Analysis Documentation

Needs analysis actions shall be documented to include actions taken and decisions made, including the rationale supporting those decisions.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the needs analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.1.7 of this ISG for consistent performance of documenting needs analysis.
See sections 1.2 and 1.3 for job and task analysis results in determining initial training program compliance.
Training Program Inspection:
The staff should perform a review of approved training needs analyses. Verify that the analyses include the following per the approved training program:
Clearly document the training request
Clearly document the decision made
Clearly document the reason/basis for the decision
Clearly document the actions taken
The action(s) taken resolves the training need

1.2. Conducting Job Analysis

A job analysis is conducted to produce a detailed list of duty areas, where applicable, and tasks for a specific job or position. Job analysis involves finding out exactly what the individual does on the job (performance and cognitive based tasks) rather than what the individual must know to perform the job.
In the current nuclear industry, traditional job analysis utilizes a performance-based approach to performing tasks. These requirements are defined in section 1.2.1 and remain applicable to training programs in development for new reactor designs.
However, with the increasing automation provided by advanced technology, the NRC staff recognizes the limitation of traditional job analysis in task identification, since it is reasonable that many plant processes and operations may be completed through computer initiation. In traditional task analysis, the operation of a single push button or mouse click on a computer screen may actuate a multistep sequence of actions that initiates a process versus a specific task. The operators' awareness of their roles, functions, and responsibilities when performing automation actuations is expected to be analyzed for performance tasking as well as cognitive-based tasking.
In such cases, the NRC staff recognizes the need for cognitive tasks as integral in the job analysis process. Cognitive tasks account for the performer's cognitive awareness of the system's automation and processes despite the potentially limited discrete steps taken by the performer. In the job analysis process, cognitive tasks will play an important role in the identification of the knowledge, skills, and abilities required to achieve optimal performance of the job duty functions.
1.2.1. Job Analysis is a Process That Includes Training and Line Personnel

Job analysis is a systematic, procedure-based process involving job incumbents/subject matter experts and qualified training staff. Job incumbents and subject matter experts are incorporated into the analysis process with qualified instructors in reviewing the job requirements and providing input into the job analysis process as defined in section 1.2 of this ISG. It is generally not sufficient, particularly for facilities where plant personnel jobs are relatively more complex, for one person (even someone who is expert in the job) to perform the analysis without input from others. Training and line supervision are incorporated into the job analysis process through review and approval of the job analysis results.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the job analysis process. The procedure should include:
Clear guidance meeting the requirements of section 1.2.1 of this ISG for consistent performance of job analysis activities that include training and line personnel.
Guidance includes directions for qualified instructors and job incumbents/subject matter experts on how to conduct job analysis.
The licensee provided documentation on who conducted the job analysis, to include participants other than the job analyst. The following information should be included for each person involved in the job analysis: current position and/or title, years of experience in the field, years of experience in current position, and a brief description of their areas of expertise.
Line and training supervision are included in the review and approval process of the job analysis process.
Training Program Inspection:
The staff should verify that the licensee has maintained the analysis procedure documenting the systematic performance of the job analysis process. Review the changes made to the procedure since the last inspection for changes that would omit required sections from the procedure.
1.2.2. Job Analysis Process Groups Tasks into Position/Role/Duty Areas
In development of the job scope, tasks are grouped together into positions, roles, and duty areas.
The job analysis is highly congruent with the task analysis element of the human factors engineering (HFE) program and is used to understand the positions, roles, and duties of the operators that control the plant. This analysis can influence the appropriate positions needed and the number of personnel required to perform specific actions important to safe plant operation. The HFE design verification process is used to verify that human-system interfaces (HSIs) support the tasks from the HFE task analysis. The job analysis process incorporates that information and breaks down the job requirements into discrete tasks that can be further analyzed for consideration in the training program.
Traditional job analysis positions, roles, and duty areas have been established for current reactor designs and training programs. Each licensee may vary the organization of its training program grouping, but the core concepts have been the same. For example, current fleet light water reactor designs have utilized control switch manipulation and plant monitoring of systems in the control room for operators (job - Control Room Operator). The job analysis includes the positions of Operator at the Controls (OAC) and Balance of Plant (BOP), both of which have multiple roles, such as performing operator duties requiring a license (operating control switches/equipment that directly affect reactivity) and performing duties that do not require a license (operating non-reactivity related equipment). The resulting job analysis produced a task list extensive enough to require all positions in the control room to be licensed operator positions because both operator roles (OAC and BOP) included the function of performing operator duties that require a license. As a result, the job, Control Room Operator, has been largely considered Licensed Control Room Operator. The duty areas are then broken down into groups of tasks that constitute a major subdivision of a job. They integrate specific knowledge, skills, and abilities required by the job.
In performing position/role/duty area analysis for advanced reactor designs, it is expected that the large light water reactor model of job, position, and role may be different based on the tasks performed related to that specific design. The job analysis should take into consideration the role each operator position is expected to perform and group corresponding duty areas accordingly. For example, due to reactor design and automation, the number of reactivity-related tasks might be limited such that all tasks required to be performed by a licensed operator could easily be performed by a single dedicated position (the equivalent of the OAC) in addition to the assigned functions of performing duties that do not require an operator license. In such a case, the role of other designated control operator positions (the equivalent of the BOP) would not necessarily include performing duties that require an operator license as in the current LWR fleet training programs, and such positions would instead be considered non-licensed, SAT-qualified operator positions.
Reference Appendix 1 for an example breakdown of the above job task analysis as delineated between current light water reactors and advanced reactors.
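As an illustration of the position/role/duty area grouping discussed above, the following sketch shows one hypothetical data structure a licensee might use to organize job analysis results. The structure and names are invented for illustration and are not prescribed by this ISG; Appendix 1 remains the reference example.

```python
# Illustrative sketch only: a hypothetical grouping of a job into positions,
# roles, and duty areas as produced by job analysis.
from dataclasses import dataclass, field


@dataclass
class DutyArea:
    name: str
    tasks: list[str] = field(default_factory=list)    # task statements grouped under the duty area


@dataclass
class Role:
    name: str
    requires_license: bool                             # e.g., duties requiring an operator license
    duty_areas: list[DutyArea] = field(default_factory=list)


@dataclass
class Position:
    name: str                                          # e.g., the equivalent of the OAC or BOP
    roles: list[Role] = field(default_factory=list)


@dataclass
class Job:
    name: str                                          # e.g., Control Room Operator
    positions: list[Position] = field(default_factory=list)
```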
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the job analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.2.2 of this ISG for consistent performance of analyzing job tasks into position, role, and duty areas.
The staff should perform a review of the job analysis results to confirm the following:
Verify that the licensee has a procedure documenting the systematic performance of the job analysis process. The procedure should conform to section 1.2.2 of this ISG for consistent analysis of plant personnel jobs.
Perform a review of the job analysis results to confirm that the roles, responsibilities, and duty areas, if applicable, of major job categories are defined.
The position, role and duty areas are congruent with HFE analysis and are appropriate for the job.
Training Program Inspection:
The staff review of this section is credited from activities performed in section 1.1.3.
1.2.3. Job Analysis Process Produces a Task List

Job analysis should be conducted using a methodological process that results in consistent performance on different occasions by different people. Job analysis consists of considering performance-based actions the performer will be required to do. The job analysis process produces a task list of actionable, definable tasks, each of which clearly states the action required, indicates the object of the action to be performed, and includes the process/procedure when the task utilizes a standard (procedure).
The task list derived from job analysis should be generated by training and incumbent personnel familiar with the job analysis process. Task statements resulting from job analysis should meet the following criteria:
Consist of a logically ordered set of steps
Be observable and measurable or produce an observable and measurable result
Have one action verb and one object
Have a specific beginning and end
Occur over a short period of time
Can be executed with consistent results on different occasions by different people
Require a record of review and/or approval for historical record, and
Result in a consistently formatted, quality product.
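As an illustration of the task statement criteria listed above, the following sketch shows one hypothetical way the criteria could be captured as a review checklist. The field names are invented; an actual review relies on qualified instructors and subject matter experts applying the licensee's procedure.

```python
# Illustrative sketch only: a hypothetical checklist for screening task
# statements against the criteria listed above.
from dataclasses import dataclass


@dataclass
class TaskStatementReview:
    statement: str
    logically_ordered_steps: bool
    observable_and_measurable: bool
    one_action_verb_and_object: bool
    defined_beginning_and_end: bool
    short_duration: bool
    repeatable_by_different_people: bool
    reviewed_and_approved: bool

    def acceptable(self) -> bool:
        # A task statement is acceptable only if every criterion is satisfied.
        return all((
            self.logically_ordered_steps,
            self.observable_and_measurable,
            self.one_action_verb_and_object,
            self.defined_beginning_and_end,
            self.short_duration,
            self.repeatable_by_different_people,
            self.reviewed_and_approved,
        ))
```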
1.2.3.1. Initial Training Job Analysis

Creation of a new training program requires full utilization of the job analysis process to identify the tasks associated with the job. There are several different methods that can be used to conduct job analysis, such as traditional analysis, verification analysis, document analysis, and tabletop analysis. Any method can be utilized by the licensee provided the standards of job analysis are met.
Job analysis requires review of the job scope as defined by existing job information, such as procedures, technical specifications, design basis/UFSAR documents, human factors engineering, regulatory requirements, and other programs that credit human interaction.
1.2.3.1.1. Initial Job Analysis Considerations

Since the development of job tasks is expected to be separate and unique to the design of the plant, the applicant is encouraged to consider development of the initial training job analysis during human factors task analysis. Any cross-referencing/utilization of existing job task analyses from previous reactor designs, such as light water reactor designs, as a method to justify the adequacy of a job analysis is not recommended due to potential dilution of the core task list and subsequent KSA development.

Cognitive Job Analysis

With advanced reactor designs incorporating increased usage of task automation, the traditional definition of a task may produce a limited set of job tasks that does not accurately reflect the full knowledge, skills, and abilities necessary to perform job duties. With increasing automation, work activities are becoming increasingly cognitive, specifically, shifting away from easily observable physical actions to the domain of supervisory control. As nuclear power plant tasks continue to move towards greater use of supervisory control via increasing automation, the use of the next generation of cognitive methodologies will aid in achieving the full scope of training program job analysis.
Accordingly, initial training program job analysis should incorporate human factors engineering elements and, as defined in NUREG-0711, cognitive task analysis (CTA) in the development of the initial task list, which will allow further knowledge, skill, and ability assessment in the task analysis section of the analysis phase of SAT.
For operations training program job analysis, consideration of task list development should include plant systems important to safety and operator interaction. Systems with operator interaction will result in, at a minimum, operator tasks corresponding to the manual actions required. Systems important to safe operation with no manual operator interaction will not produce a task list from operator action as traditionally defined by a task (discrete, observable steps); however, operators must monitor the systems for proper operation. For example, job analysis on an automated or passive system with no operator interaction could result in the task "monitor the operation of the __ system" or "operate the __ system in manual."
Foundational Theory of Plant Operations

Job analysis for operations training programs shall also include the foundational theory of plant operations in the task list. These include the following areas:
o Reactor theory, thermodynamic principles and chemical theory associated with the technologies, materials, and processes of their reactor design
o Plant systems and components
o Reactivity management and manipulations
o Radiation control and safety
o Emergency, abnormal, and normal operations
o Administrative requirements and conditions of the facility license
o Technical specifications
Foundational theory of plant operations as described above and systems important to safe plant operations are required to be included in the resulting task list and KA development for training program design. Examples of systems important to safety include fire protection and offsite power.
Operations training programs shall document the basis for the scope of the job analysis. All items in Appendix 3 must be considered as part of the operations training program, through task and associated KSA development, as relevant to the design.
1.2.3.2. Existing Training Job Analysis

For continuing training programs, at least every 6 years, a periodic review of the initial task list is conducted to validate the task list against job performance requirements. This is not a review of only the tasks selected for training, as identified in section 1.2.4; the review of the initial task list includes tasks created from the job analysis as defined in section 1.2.3, including those tasks which, through DIF analysis, exist in the task list but are not systematically selected for training. Potential gaps identified in the task list are documented and analyzed through the analysis process. The periodic review, conducted by line incumbents and qualified instructors, shall be documented and approved.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the job analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.2.3 of this ISG for consistent performance of conducting job analysis.
Clear guidance to conduct the task list review on the full task list, not just those tasks selected for training per section 1.2.4.
The staff should perform a comprehensive review of the job analysis conducted by the licensee. Review results against the design basis and human factors task list criteria provided in Section 18 of the SRP. Validate the following for each accredited program:
Written instructions on the purpose and function of the job analysis process were provided to the personnel providing input into the job analysis.
The task list contains a comprehensive list of tasks per plant design.
The task list creation was in compliance with the procedures established in section 1.2.1.
Each task created meets the criteria specified in Section 1.2.3.
All critical safety related tasks as defined by the human factors, design basis, or PRA are included in the task list.
Cognitive tasks are included in the task list sufficient to produce a master task list that comprehensively describes the job being analyzed and to allow full knowledge, skill, and ability development in the task analysis process.
Cognitive tasks covering the foundational theory of plant operations are included in the task list, including at a minimum:
o Reactor theory, thermodynamic principles and chemical theory associated with the technologies, materials, and processes of their reactor design
o Plant systems and components
o Reactivity management and manipulations
o Radiation control and safety
o Emergency, abnormal, and normal operations
o Administrative requirements and conditions of the facility license
o Technical specifications
Operations training programs shall document the basis for the scope of the job analysis for Commission approval. All items in Appendix 3 must be considered as part of the operations training program, through task and associated KSA development, as relevant to the design.
Training Program Inspection:
The staff should perform a review of the periodic initial training task list review conducted by the licensee, if performed since the last inspection. Validate the following per the approved training program:
The task list review was performed by line incumbents and qualified instructors.
The task list review included all tasks in the program, including tasks selected for training and the tasks not selected for training.
Any changes made from the task list review were documented and analyzed through the training needs analysis process.
Any gaps identified by the results of the task list review were documented and analyzed through the training needs analysis process.
1.2.4. Tasks are Systematically Selected for Training

The task list derived from job analysis is reviewed to determine which tasks require training. A systematic process is required to differentiate those tasks that are simple and easy to perform, and therefore require no additional training, from those tasks that require training to prevent performance errors or are of high consequence.
There are several techniques for selecting tasks for training. The traditional technique for selecting tasks involves determining the difficulty, importance (i.e., impact on safety and reliability), and frequency of each task performance. Reference Appendix 2 for a task selection for training process example. The job scope of the task should be reviewed against the design basis documents, the SER, FSAR, and human factors analysis for determining difficulty, importance, and frequency. Other criteria can be used in the decision making of the analysis, provided there is objective rationale and a consistent methodology used to select tasks for training.
The task list selected for training derived from job analysis should be generated by training and incumbent personnel familiar with the job analysis process. The task list should produce a recommendation for each task. Categories for training include:
Tasks that do not require formal training (not limiting the option of informal training)
Tasks which require initial formal training (e.g., classroom, self-paced structured learning, lab, simulator, structured on-the-job training).
Tasks which require both initial training and continuing training.
Tasks that are considered specialty tasks and designated to be taught at a later or specific time. Specialty tasks may still require initial and continuing training, but their organization into the training curriculum might be different than routine tasks performed by the individual.
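As an illustration of DIF-based task selection, the following sketch shows one hypothetical way difficulty, importance, and frequency ratings could be turned into a training recommendation. The rating scale, thresholds, and categories are invented placeholders; an actual program would define its own objective rationale and methodology (see Appendix 2 for an example process).

```python
# Illustrative sketch only: a hypothetical mapping from DIF ratings to one of
# the training categories described above. Thresholds are placeholders.
from dataclasses import dataclass


@dataclass
class DIFRating:
    difficulty: int   # e.g., 1 (low) to 5 (high)
    importance: int   # impact on safety and reliability
    frequency: int    # how often the task is performed


def recommend_training(dif: DIFRating) -> str:
    if dif.importance >= 4 or dif.difficulty >= 4:
        # High-consequence or hard-to-perform tasks warrant initial and continuing training.
        return "initial and continuing training"
    if dif.frequency <= 2 and dif.difficulty >= 2:
        # Infrequently performed, non-trivial tasks at least warrant initial formal training.
        return "initial formal training"
    # Simple, easy, frequently performed tasks may not need formal training.
    return "no formal training (informal training still an option)"
```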
1.2.4.1. Licensed Operator Training Includes Items Important to Safe Plant Operation

For licensed operator training programs, tasks for foundational theory of plant operations and systems important to safety are required to be included in the resulting task list and KA development for training program design.
Cognitive tasks included in the KSA development include at a minimum the following subject areas:
o Reactor theory, thermodynamic principles and chemical theory associated with the technologies, materials, and processes of their reactor design
o Plant systems and components
o Reactivity management and manipulations
o Radiation control and safety
o Emergency, abnormal, and normal operations
o Administrative requirements and conditions of the facility license
o Technical specifications

1.2.4.2. Licensed Operator Retrain Periodicity

Licensed operator retraining programs must maintain a retrain periodicity of 2 years as specified in 10 CFR Part 53.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the job analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.2.4 of this ISG for consistent performance of systematically selecting tasks for training.
The staff should perform a comprehensive review of the tasks selected for training conducted by the licensee. Review results against the design basis and human factors task list criteria provided in Section 18 of the SRP. Validate the following for each accredited program:
The training task list contains a list of all the tasks selected to train.
The training task list creation complies with the procedures established in section 1.2.4.
Tasks selected for training meet the criteria specified in section 1.2.4.
All critical safety related tasks as defined by the human factors, design basis, or PRA are included in the training task list.
Licensed operations training includes tasks for systems important to safety and the items listed in 1.2.4.1, as applicable to the design of the plant.
Licensed operator retrain periodicity is set at 2 years.
Training Program Inspection:
The staff should perform a review of the training program continuing training schedule.
Conduct a review of the program by reviewing the following:
Review the training program continuing training schedule. Given a training item, review the tasks linked to the training and validate their DIF data supports the retrain periodicity the licensee has selected.
Review the task list for the program. Given a task in the program, review the associated DIF data and, for tasks selected for retraining, validate that the initial and continuing training program curriculum includes those tasks.
Review the operations training program curriculum; validate that systems important to safety and the items listed in section 1.2.4.1 are included as per the original (or subsequent) analysis, as allowed by section 1.1.3.
Validate that the licensed operator retrain periodicity of 2 years is being met.
1.2.5. Job Analysis Documentation

The results of job analysis require review and/or approval, and documentation must be maintained for historical record.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the job analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.2.5 of this ISG for consistent performance of documenting the job analysis process.
The staff should perform a review of the following:
The training procedures established in section 1.2.1 include retention requirements.
The documents supporting items 1.2.1 - 1.2.4 are available for review.
Training Program Inspection:
The staff should perform a review of recent job analyses conducted by the licensee.
Validate that the requirements of section 1.2.5 were met.
1.3. Conducting Task Analysis

Tasks selected for training are analyzed to determine the scope of the activity. This analysis produces a defined list of the job attributes required for satisfactory accomplishment of the task.
1.3.1. Task Analysis is an Iterative Process that Includes Training and Line Personnel

Task analysis is a systematic, procedure-based process involving job incumbents/subject matter experts (SMEs) and qualified training staff. Task analysis should use an iterative process to synthesize and combine the information into an exhaustive list of the tasks that are part of a job as well as the respective KSAs inherent to the tasks. An effective task analytic process includes revisions, additions, and elimination of task information.
Thus, it is an iterative process. Determining the appropriate depth of analysis can be difficult. Reviews by SMEs should reveal if the right content to the right degree is present. Missing content can be difficult for SMEs to detect. Thus, it is better for the analysis to include the details that can later be eliminated rather than be excessively brief from the beginning.
Job incumbents and subject matter experts are incorporated into the analysis process with qualified instructors in reviewing the job requirements and providing input into the task analysis process as defined in section 1.3 of this ISG. It is generally not sufficient, particularly for facilities where plant personnel jobs are relatively more complex, for one person (even someone who is expert in the job) to perform the task analysis without input from others. Training and line management are incorporated into the task analysis process through review and approval of the task analysis results.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the task analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.3.1 of this ISG for consistent performance of task analysis activities utilizing line and training personnel.
Guidance includes directions for qualified instructors and job incumbents/subject matter experts on how to conduct task analysis.
The licensee provided evidence that an iterative revision process was used in completing the task analysis after data collection.
The licensee provided documentation on who conducted the task analysis, to include participants other than the task analyst. The following information should be included for each person involved in the task analysis: current position and/or title, years of experience in the field, years of experience in current position, and a brief description of their areas of expertise.
Line and training management are included in the review and approval process of the task analysis process.
Training Program Inspection:
The staff should verify that the licensee has maintained the analysis procedure documenting the systematic performance of the task analysis process. Review the changes made to the procedure since the last inspection for changes that would omit required sections from the procedure.
1.3.2. Task Analysis Produces Task Characteristics for Further Training Development

The task analysis process includes reviewing the task statements in the context of their plant application to identify the characteristics of the task. These characteristics are then used to design and develop the training program for incumbent qualification.
The following characteristics are considered during task analysis:
Initial conditions (prerequisites) required for task performance.
Standards (criteria) for acceptable task performance (i.e., limits, ranges, time requirements).
Critical elements (steps) that must be performed to accomplish the task properly.
Tools, equipment, and safety concerns related to task performance.
Associated knowledge, skill, and ability statements required to perform particular elements of the task or the overall task.
Branch steps/alternate paths that result in additional actions or knowledge, skill, and ability to accomplish satisfactory performance.
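As an illustration of the task characteristics listed above, the following sketch shows one hypothetical record a licensee might use to capture task analysis results. The field names are invented for illustration and are not prescribed by this ISG.

```python
# Illustrative sketch only: a hypothetical record of the task characteristics
# captured during task analysis, matching the list above.
from dataclasses import dataclass, field


@dataclass
class TaskAnalysisRecord:
    task_id: str
    initial_conditions: list[str] = field(default_factory=list)        # prerequisites for performance
    standards: list[str] = field(default_factory=list)                 # limits, ranges, time requirements
    critical_elements: list[str] = field(default_factory=list)         # steps that must be performed
    tools_equipment_safety: list[str] = field(default_factory=list)    # tools, equipment, safety concerns
    ksas: list[str] = field(default_factory=list)                      # knowledge, skill, and ability statements
    branch_paths: list[str] = field(default_factory=list)              # alternate paths needing added actions or KSAs
```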
1.3.2.1. Operator Licensing Programs Produce a Comprehensive KSA List for Commission Approval

Current LWR designs utilize a KA catalog (NUREG-1122, NUREG-1123, or NUREG-2103) that lists the knowledge required for operators, which the NRC staff uses in the operator license examination process. The NRC staff recognizes that advanced reactor designs will be significantly different than current large light water reactor designs, with varying systems and features important to safe plant operation. Based on this, the license examination process is expected to be design specific and tailored to the results of the job and task analysis per the SAT process as specific to the advanced reactor design.
Using the original KA catalog NUREGs as a starting point for KSA list development indicates that the job and task analysis of the advanced reactor design required by sections 1.2 and 1.3 of this ISG has been bypassed. Similarly, using a gap analysis that compares a proposed KSA list to the existing fleet NUREG KA catalog as a means of justifying or approving the proposed KSA list does not demonstrate that the job and task analysis per sections 1.2 and 1.3 of this ISG was completed as necessary for the specific advanced reactor design being proposed.
The licensed operator training program job and task analysis must include all aspects of job performance, including but not limited to tasks important to safe plant operation and tasks derived from section 1.2.3.1.1 related to the foundational theory of plant operations.
The resulting KSA list is considered the comprehensive list of KSAs that licensed operators are expected to master for safe plant operations (not just those pertaining to tasks selected for training) and will be input into the NRC operator examination process.
Following production of the KSA list, the licensee must develop a process to screen the KSA list to identify those tasks and associated KSAs important to safe plant operation and tasks derived from section 1.2.3.1.1 related to the foundational theory of plant operations. While the KSA list derived from this section shall be input directly into the approved training program SAT process, the screening process described here shall provide KSA input into the operator examination process. Further NRC guidance regarding the KSA ranking process described in this paragraph is found in DRO-ISG-2023-01, Operator Licensing Programs.
Reference Appendix 4 for an example of operations training program to operator examination guidance alignment.
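As a purely illustrative aid, the short Python sketch below shows one hypothetical way a licensee might screen the comprehensive KSA list so that KSAs tied to tasks important to safe plant operation, or to the foundational theory tasks of section 1.2.3.1.1, feed the examination process while the full list continues to feed the SAT process; the data layout and names are assumptions, not guidance.

from typing import Dict, List, Set

def screen_ksas_for_examination(
    ksa_to_tasks: Dict[str, Set[str]],
    tasks_important_to_safety: Set[str],
    foundational_theory_tasks: Set[str],
) -> List[str]:
    """Return the KSAs tied to at least one task that is either important to
    safe plant operation or derived from foundational theory. The full
    ksa_to_tasks mapping still feeds the training program SAT process."""
    selected_tasks = tasks_important_to_safety | foundational_theory_tasks
    return sorted(ksa for ksa, tasks in ksa_to_tasks.items()
                  if tasks & selected_tasks)

# Hypothetical usage
ksa_map = {
    "Explain decay heat removal flow paths": {"Respond to loss of decay heat removal"},
    "State administrative log-keeping requirements": {"Maintain the control room log"},
}
exam_ksas = screen_ksas_for_examination(
    ksa_map,
    tasks_important_to_safety={"Respond to loss of decay heat removal"},
    foundational_theory_tasks=set(),
)
# exam_ksas -> ["Explain decay heat removal flow paths"]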
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the task analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.3.2 of this ISG for consistent performance of producing task characteristics.
The staff should perform a review of the task analysis results against the following:
Each task selected for training has the following:
o Clearly identified conditions and standards that define the prerequisites and acceptance criteria for task performance.
o Critical elements that must be performed to accomplish the task properly.
o Tools, equipment, and safety concerns related to task performance.
o The required knowledge, skills, and abilities necessary to achieve task performance are identified.
o Branch steps/alternate paths that result in additional actions or knowledge, skills, and abilities are identified and listed.
Training Program Inspection:
The staff should perform a review of any new tasks selected for training since the last inspection. Validate the new tasks meet the training program approval criteria above per the approved training program.
1.3.3. Task Analysis Documentation
The results of the task analysis require review and/or approval and are maintained as a historical record.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the task analysis process. The procedure should include:
Clear guidance meeting the guidance of section 1.3.3 of this ISG for consistent performance of task analysis documentation, review, approval, and record keeping.
The staff should perform a review of the following:
The training procedures include retention requirements.
The documents supporting items 1.3.1 - 1.3.3 are available for review.
Training Program Inspection:
The staff review of this section is credited from activities performed in section 1.1.3.
Design Phase
2. Design
Learning design includes defining the target student population and the creation of terminal objectives, enabling objectives, evaluation instruments, and the instructional setting. The individual aspects of the design phase all center around the outcome of the analysis phase; therefore, it is critical that the analysis was performed correctly to identify the training needs resulting from the needs, job, and task analyses. For initial program design, the job and task analysis resulted in tasks, possibly grouped together by function; those selected for formal training were analyzed to identify what knowledge, skills, and abilities a trainee must learn to achieve mastery of the task. The design phase utilizes these analysis results in the development of learning objectives and evaluation items.
2.1. Define Target Student Population
The first process of the design phase is to define the target student population. The training program design is focused on the results of the analysis and the requisite knowledge, skills, and/or abilities (KSAs) students are expected to bring to a course of instruction. The target student population criteria are clearly written in the program description as prerequisites for candidate selection. This allows learning objectives and associated content to focus on knowledge transfer to the trainees at the appropriate level of comprehension.
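The sketch below is a minimal, hypothetical illustration of checking candidates against documented entry prerequisites; the function and KSA names are assumptions and do not represent a required implementation.

from typing import Set

def meets_entry_prerequisites(candidate_ksas: Set[str], required_ksas: Set[str]) -> bool:
    """True only if the candidate already holds every requisite KSA listed
    in the program description for the target student population."""
    return required_ksas.issubset(candidate_ksas)

def missing_prerequisites(candidate_ksas: Set[str], required_ksas: Set[str]) -> Set[str]:
    """Requisite KSAs the candidate still lacks (useful during candidate selection)."""
    return required_ksas - candidate_ksas

# Hypothetical usage
required = {"Basic heat transfer", "Plant radiological fundamentals"}
candidate = {"Basic heat transfer"}
assert not meets_entry_prerequisites(candidate, required)
assert missing_prerequisites(candidate, required) == {"Plant radiological fundamentals"}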
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.1 of this ISG for consistent performance of defining the target student population.
The staff should perform a review of the following:
Each training program includes a description of the target student population, listing the requisite knowledge (KSAs) trainee candidates are expected to have prior to program entry.
The training program target student population documentation is written in the program description.
Validate that the requisite knowledge requirements established by the licensee are appropriate compared to the KSA list identified in 1.3.2.
Training Program Inspection:
The staff should perform a review of the program applicants against the target student population requirements. Validate that the trainees entering the program met the requisite knowledge (KSA) requirements per the approved training program.
If the licensee changes the program entry criteria, validate the following per the approved training program:
The changes were analyzed through a training needs analysis process as defined in section 1.1.
The changes were either conservative (more requisite KSAs required by the candidate) or the requisite KSAs required to be known by the applicant are appropriate to the training program curriculum.
2.2. Develop Learning Objectives
Learning objectives are created from the job/task analysis results from the analysis phase.
2.2.1. Learning Objectives Contain Conditions, an Action, and Standards
Learning objectives, both terminal and enabling, have three parts: a condition, an action, and a standard. The learning objective should be clearly written to distinguish all three parts of the objective.
2.2.1.1. Learning Objective Action Statement
Developing a learning objective first requires determining the action statement. The action statement consists of an action verb and a direct object. The action verb should identify trainee behavior that is observable and measurable. For example, in the action statement "start the safety injection system," the action verb (start) and the direct object (safety injection system) are both observable and measurable.
Reference Appendix 6 for a list of action verb examples.
2.2.1.2. Learning Objective Conditions
A properly developed learning objective should clearly state the conditions that will exist at the time of trainee performance. Conditions of performance define the facility situation, environmental aspects, and resources available to aid trainee performance.
Typical conditions may include the following:
Facility operating mode (The unit is operating at 100% power)
Safety considerations or hazards
System and equipment status
Tools or materials to be used
References available (or from memory)
Environmental conditions
Problem situation or contingencies (abnormal or emergency).
Learning objective conditions are derived from various job conditions identified during analysis. When developing learning objective conditions, adjustments may be necessary to reflect the degree of fidelity that can be achieved in the training setting.
For example, job conditions can be simulated with high fidelity during OJT and simulator training because those settings mirror the actual job conditions. When classroom or self-paced training is used, the learning objective conditions are limited by the constraints of the classroom or self-paced environment.
For example, in the "start the safety injection system" example above, the objective conditions could be "with the plant operating at 100% power, and upon initiation of a loss of cooling accident." Note that this condition and standard would require high-fidelity training, such as a simulator, to satisfy the learning objective requirements.
If an implied condition is used, it should be easily understood by all who read the objective. Reference Appendix 6 for examples of objective settings and their tie to learning objective conditions.
2.2.1.3. Learning Objective Standards
A well-prepared learning objective includes a standard for evaluating student performance. The trainee's action should result in an output, and the required quantity or quality of that output is the standard of performance. Standards can prescribe step-by-step processes that do not permit deviation (e.g., in accordance with procedure steps ___). Others may prescribe the product of performance and the factors for judging that product (e.g., calculate the heatup rate within +/- 2F).
Standards are derived from job standards identified during analysis. Similar to the development process for conditions, learning objective standards should be adjusted to reflect fidelity to job standards. In some cases, a standard may be implied rather than explicitly stated in the objective. For example, an implied standard of "without error" may be assumed for a procedural step. If an implied standard is used, it should be easily understood by all who read the objective. If an action is required to be performed within a specified period of time, the standard must include the time requirement.
For example, in the "with the plant operating at 100% power, and upon initiation of a loss of cooling accident, start the safety injection system" example above, the objective standard could be "in accordance with emergency procedure __ and within __ minutes." The procedure provides the performance standard (operating in compliance with the steps given in the procedure) and, if a time-critical analysis applies to the emergency action as identified in the plant safety analysis, the maximum allowable time provided to the trainee to satisfactorily complete the objective requirements.
Reference Appendix 6 for a list of characteristics for standards, a description of what the characteristics specify, and an example of a learning objective including these standards.
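To make the three-part structure concrete, the following minimal sketch assembles a learning objective from a condition, an action (verb plus direct object), and a standard, using the safety injection wording from the example above; the class and field names are hypothetical.

from dataclasses import dataclass

@dataclass
class LearningObjective:
    """Three-part learning objective: condition, action (verb + object), standard."""
    condition: str
    action_verb: str
    action_object: str
    standard: str

    def statement(self) -> str:
        # Render the objective as a single sentence for lesson plans and exams.
        return (f"{self.condition}, {self.action_verb} the {self.action_object} "
                f"{self.standard}.")

objective = LearningObjective(
    condition="With the plant operating at 100% power, and upon initiation of a "
              "loss of cooling accident",
    action_verb="start",
    action_object="safety injection system",
    standard="in accordance with emergency procedure __ and within __ minutes",
)
print(objective.statement())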
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.1 of this ISG for consistent performance of creating learning objectives with conditions, actions, and standards.
The staff should perform a review of the program objectives against the following criteria:
Each objective includes a condition, action, and standard.
The objectives created are directly tied to task KSAs that were selected for training in section 1.2.4, and there are no objectives for tasks that are not tied to a KSA.
Validate the list of KSAs from tasks selected for training are tied to the list of objectives created, and there are no KSAs for tasks selected for training that are not directly tied to an objective.
The objectives tied to KSAs cover the KSA topic.
The staff should perform a review of objective conditions against the following criteria:
Conditions should specify where the action occurs (control room, local), if there is more than one location that applies.
Conditions should specify whether the action is from memory or using references.
Conditions specify the appropriate details regarding objective performance to clearly denote how the trainee is expected to achieve mastery of the objective, as described in section 2.2.1.2 of this ISG.
The staff should perform a review of objective actions against the following criteria:
Objective actions should utilize an action verb such as those identified in Appendix 6.
Objective action statements should clearly state what learning is expected to occur from the objective.
The staff should perform a review of objective standards against the following criteria:
The objective standards clearly identify time limits associated with the task.
The objective standards identify the standard of performance, such as procedures, processes, or tolerances.
Training Program Inspection:
The staff should perform a review of initial and continuing training program objectives against the criteria in the Initial Training Approval section per the approved training program.
2.2.2. Learning Objectives Focus on Desired Results the Trainee is Expected to Achieve
SAT is performance-based training; therefore, learning objective construction should focus on the desired results of the trainee following completion of the training under actual plant circumstances. Learning objectives describe what is to be learned in terms of the expected trainee performance under specified conditions to accepted standards.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.2 of this ISG for consistent performance of creating learning objectives focusing on the desired results the trainee is expected to achieve.
The staff should perform a review of program materials to verify that learning objectives are written to describe the actual plant performance expected of the trainee.
Training Program Inspection:
The staff should perform a review of program materials to verify that learning objectives are written to describe the actual plant performance expected of the trainee per the approved training program.
2.2.3. Program Design Includes Terminal Objectives
In training program design, a terminal objective should be written for each major topic.
The terminal objective focuses on the overall results of the training curricula.
2.2.3.1. Terminal Objectives Written at the Task Level
One method of developing terminal objectives is to write terminal learning objectives for each task selected for training as identified in the task analysis. The enabling objectives are then written at the KSA level to effectively train the requisite knowledge and skills necessary for trainee mastery of the material.
For example, if a training program includes the task "respond to steam generator level controller failures," with the task conditions "plant operating at 100% power" and "a control input failure" and the standard "prevent a reactor trip," a terminal objective action statement would be for the trainee to respond to a lowering steam generator level. The terminal objective condition and standard could be "given a plant operating at 100% power and a control input failure" and "to prevent a reactor trip."
2.2.3.2. Terminal Objectives Written at the System Level
Another method of developing terminal objectives is to establish a design structure that facilitates instruction of all system-related KSAs in one (or more) lessons. In this case, the terminal objective can be written to address the KSAs needed across the range of task performance under all conditions. This is typically performed through system-based design, where a lesson plan can cover all the required KSAs for the analyzed tasks associated with the system.
For example, if multiple tasks selected for training pertain to the safety injection system, a single lesson plan series that covers all the KSAs for the corresponding safety injection tasks may be developed. In this case, a terminal objective "operate (or monitor the operation of) the __ system" (with an appropriate condition and standard) may be used as a system-level terminal objective that covers all the applicable KSAs. Another example would be the calibration of several different types of instruments: these might be individual tasks in the analysis phase but be grouped together in the design phase to train and evaluate the material effectively and efficiently.
For example, for the task "respond to steam generator level control failures," the corresponding terminal objective could be "given a plant operating at 100% power and a control input failure, respond to a transient steam generator level to prevent a reactor trip." The terminal objective would have supporting enabling objectives training the KSAs. The final evaluation of the terminal objective (a performance objective) could be psychomotor based, trained and evaluated in the simulator.
If there were other steam generator level control related tasks selected for training, such as "control steam generator level during startup," it could be more efficient to train the KSAs under one common lesson plan that teaches the steam generator level control system. Therefore, a system terminal objective could be "under all plant conditions, operate (or monitor the operation of) the steam generator level control system in accordance with (station procedures)." This common system terminal objective would be used to train all the KSAs for the associated tasks selected for training and can be tied to a common cognitive task, "operate the steam generator level control system." The original task, "respond to steam generator level control failures," would still be tied to a performance objective that could be psychomotor based, trained and evaluated in the simulator.
The use of system level design is a method to consolidate the KSAs of a system that are analyzed to be trained as defined in the job and task analysis. Therefore, the ties to the required analysis KSAs and the objectives shall be clear. The use of system level design does not alleviate the responsibility of the licensee to clearly identify the requisite KSAs and design their training program around those specific required KSAs.
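The hypothetical sketch below illustrates how system-level design can consolidate the KSAs of several analyzed tasks under one terminal objective while preserving the required tie from each KSA back to its originating task; names and wording are illustrative only.

from typing import Dict, List, Set

def system_terminal_objective(system: str,
                              task_ksas: Dict[str, Set[str]]) -> Dict[str, object]:
    """Group the KSAs of all tasks associated with one system under a single
    system-level terminal objective, keeping the KSA-to-task ties explicit."""
    all_ksas: Set[str] = set()
    ksa_to_tasks: Dict[str, List[str]] = {}
    for task, ksas in task_ksas.items():
        for ksa in ksas:
            all_ksas.add(ksa)
            ksa_to_tasks.setdefault(ksa, []).append(task)
    return {
        "terminal_objective": f"Operate (or monitor the operation of) the {system} "
                              f"in accordance with station procedures",
        "covered_ksas": sorted(all_ksas),
        "ksa_to_tasks": ksa_to_tasks,   # traceability back to the analysis
    }

# Hypothetical usage with the steam generator level control example
design = system_terminal_objective(
    "steam generator level control system",
    {
        "Respond to steam generator level control failures":
            {"Understand level control inputs", "Control level controller in manual"},
        "Control steam generator level during startup":
            {"Understand level control inputs", "Operate the bypass controller"},
    },
)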
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.3 of this ISG for consistent performance of creating terminal learning objectives.
The staff should perform a review of program design materials to verify the following:
Terminal learning objectives are written for each task selected for training.
Training Program Inspection:
The staff should perform a review of program materials to verify that terminal learning objectives are incorporated into the design structure of the program as described in the training program approval section per the approved training program.
2.2.4. Lesson Plans Include Enabling Objectives to Support the Terminal Objective Goal.
Each lesson will also include enabling objectives that comprise the materials that a trainee must master in order to be able to successfully complete the terminal objective.
Enabling objective creation is focused around the knowledge, skills and abilities identified in the job and task analysis.
For example, the task "respond to heat removal system flow control failures" might include the knowledge items "understand the heat removal flow control system inputs" and "understand flow controller operation," and a skill item might include "control the flow controller in manual." These could correspond to the following enabling objectives:
From memory, describe the inputs to the heat removal system flow control system in accordance with (system design specifications),
From memory, explain the operation of the heat removal system flow controller in accordance with (station procedures), and
Given a plant in a shutdown configuration, control heat removal system flow in manual within the flow control band in accordance with (station procedures).
For the system terminal objective example, "operate the heat removal system flow control system," the enabling objectives would include all KSAs that the common system terminal objective is intended to train. Common KSAs between different tasks can be grouped together under one common objective, so long as the tie between the KSAs and objectives is clear.
For example, the task "control heat removal system flow during system startup" might have the knowledge items "understand the heat removal system flow control system inputs" and "understand flow controller operation in bypass," and a skill item might include "control the bypass flow controller in manual." These knowledge items are similar to the KSAs for the task "respond to heat removal flow control failures." In a common system lesson design structure, these could correspond to the following enabling objectives:
Classroom:
From memory, describe the inputs to the heat removal flow control system in normal and bypass mode in accordance with (system design specifications), and
From memory, explain the operation of the heat removal flow controller in normal and bypass mode in accordance with (station procedures).
Simulator:
Given a plant in a shutdown configuration, control heat removal system flow in manual within the flow control band in accordance with (station procedures), and
Given a plant in a shutdown configuration, control heat removal system flow in manual using the bypass controller within the flow control band in accordance with (station procedures).
Note that the first two enabling objectives are similar in design but now cover multiple KSAs. These knowledge items would be linked to the objective to clearly define the required knowledge that the objective is required to cover. The skill items required for each task are still independent; however, a simulator training scenario could incorporate the performance enabling objectives.
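A simple two-way coverage check, sketched below with hypothetical names, illustrates the kind of verification the following criteria describe: every enabling objective ties to at least one KSA selected for training, and every selected KSA ties to at least one objective.

from typing import Dict, Set, Tuple

def coverage_gaps(objective_to_ksas: Dict[str, Set[str]],
                  selected_ksas: Set[str]) -> Tuple[Set[str], Set[str]]:
    """Return (objectives tied to no selected KSA, selected KSAs tied to no objective)."""
    orphan_objectives = {obj for obj, ksas in objective_to_ksas.items()
                         if not (ksas & selected_ksas)}
    covered = set().union(*objective_to_ksas.values()) if objective_to_ksas else set()
    uncovered_ksas = selected_ksas - covered
    return orphan_objectives, uncovered_ksas

# Hypothetical usage
objectives = {
    "Describe heat removal flow control inputs": {"K1"},
    "Explain turbine history": set(),            # not tied to any selected KSA
}
orphans, uncovered = coverage_gaps(objectives, selected_ksas={"K1", "S1"})
# orphans -> {"Explain turbine history"}; uncovered -> {"S1"}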
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.4 of this ISG for consistent performance of creating enabling objectives that support the terminal objective.
Perform a review of program materials to verify the following:
Enabling objectives are written to support the targeted goal of the corresponding terminal objective and are related to the topic.
Enabling objectives are directly tied to an associated KSA.
Enabling objectives adequately cover the KSA topic.
There are no objectives not tied to a KSA.
There are no KSAs not tied to an objective.
Training Program Inspection:
The staff should perform a review of program materials to verify the following per the approved training program:
Enabling objectives meet the criteria of the training program approval section of this ISG.
Enabling objectives stemming from a training needs analysis that resulted in flexible training directly support the training need.
2.2.5. Enabling Objectives are Organized to Facilitate Student Learning
Enabling objectives are organized in a lesson plan to facilitate student learning and mastery of the terminal objective; therefore, enabling objectives are sequenced in the lesson plan in the order that best conveys conceptual learning.
For example, a trainee would not understand lesson material on steam generator level controller response to input errors without first learning what systems input into the controller. Likewise, having the student manually operate the steam generator level controller in a simulator without understanding the inputs or operation would diminish trainee learning and mastery of the concepts.
Tasks that are grouped together by function are often grouped by plant system; therefore, tasks may utilize a common system lesson plan to train the knowledge and abilities of all the associated tasks, as identified by the job and task analysis. For example, a training program analysis might include tasks such as "start the residual heat removal pump," "operate the residual heat removal pump flow controller," and "respond to a residual heat removal pump trip."
Each of these tasks would include common knowledge items, such as "state the purpose of the residual heat removal system" and "describe the residual heat removal pump operation." A common lesson plan can be utilized to capture all the requisite knowledge items for the corresponding tasks. The system lesson terminal objective would then be written at the knowledge comprehension level to delineate the understanding of plant operations related to the associated system. The enabling objectives would be structured on the individual knowledge items captured in the job and task analysis for each task being consolidated into the common system lesson.
Initial training program design requires defining a methodology to organize and sequence the tasks selected for training into a learning objective hierarchy. Learning objectives are sequenced into an organized hierarchy that delineates the order in which they are taught. The order sequence can be chronological, simple to complex, or something different so long as it is focused on trainee mastery of the material. The methodology must result in a comprehensive list of objectives that adequately covers the tasks selected for training and the associated KSAs identified during task analysis.
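One notional way (among many) to produce the required hierarchy is to order objectives by prerequisite relationships, simple to complex; the dependency data in this sketch is hypothetical.

from graphlib import TopologicalSorter   # Python 3.9+
from typing import Dict, List, Set

def sequence_objectives(prerequisites: Dict[str, Set[str]]) -> List[str]:
    """Order objectives so that every prerequisite objective is taught first."""
    return list(TopologicalSorter(prerequisites).static_order())

# Hypothetical usage for the steam generator level controller example
order = sequence_objectives({
    "Describe level controller inputs": set(),
    "Explain controller response to input failures": {"Describe level controller inputs"},
    "Manually control level in the simulator": {"Explain controller response to input failures"},
})
# order -> inputs first, then controller response, then simulator performance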
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.5 of this ISG for consistent performance of organizing learning objectives to facilitate student learning.
The staff should perform a review of program materials to verify the following:
The initial training program design is organized into a hierarchy that delineates the order in which learning objectives are taught.
The order of instruction established by the licensee is structured in a way that facilitates student learning.
Training Program Inspection:
The staff should perform a review of program materials to verify the following per the approved training program:
Enabling objectives meet the criteria of the training program approval section of this ISG.
2.2.6. Performance Objectives Maximize the Use of Performance Opportunity
Performance objectives should be written so that the objective is actually performed whenever possible. Simulating task performance is acceptable for tasks that cannot be performed without undue risk to operating plant status or safety.
The training program should include a process for screening how performance objectives are trained and evaluated. The program should also include a process for downgrading performance tasks through an approval process that limits deviations from the approved performance method.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.6 of this ISG for consistent performance of maximizing the use of performance opportunities for performance objectives.
Clear guidance to include a process for screening how performance objectives are trained and evaluated, including guidance on downgrading performance tasks through an approval process.
The staff should perform a review of program materials to verify the following:
Task performance objectives are performed in a realistic setting when safe and possible without inserting undue risk to the plant.
When actual performance is not possible, another means of performance objective training and evaluation (simulation) is selected to convey task mastery.
Training Program Inspection:
The staff should perform a review of program materials to verify the following per the approved training program:
Enabling objectives meet the criteria of the training program approval section of this ISG.
2.2.7. Learning Objectives are Reviewed and Approved by Training and Line Supervision
Learning objectives are reviewed and approved by line and training management.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.2.7 of this ISG for consistent performance of reviewing and approving learning objectives.
The staff should perform a review of learning objectives; confirm they are reviewed and approved by line and training management.
Training Program Inspection:
The staff should perform a review of program learning objectives to verify the following per the approved training program:
Enabling objectives meet the criteria of the training program approval section of this ISG.
2.3. Develop Evaluation Items
Once the terminal and enabling objectives are created, the evaluation items for each objective are created. This aspect of the design phase focuses instructor resources on identifying what key aspects of the task and the associated knowledge, skills, and abilities need to be evaluated to ensure trainee mastery. If the learning objective informs the student of what they should be able to know or do at the outcome of the training, the evaluation items give the instructor confidence that the trainee did master the concepts.
2.3.1. Evaluation Items Evaluate the Topic of the Objective
Test items should evaluate the subject or topic of the objective.
2.3.1.1. Cognitive Evaluation Examples
For cognitive exams, an objective that says "describe the hazards of radiation" should test a trainee's knowledge of radiation hazards (e.g., cancer, illness, death). It should not ask the trainee to describe the source of the radiation hazard (the radioactive isotopes that cause the hazards).
2.3.1.2. Performance Evaluation Examples
For performance exams, test items should evaluate the topic of the objective task statement. For example, a performance objective that states "manually control feedwater bypass flow" would require an evaluation of a trainee's ability to control the bypass flow, not just set up, initiate, or monitor flow.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.1 of this ISG for consistent performance of evaluation items evaluating the topic of the objective.
The staff should perform a review of program evaluation items to verify the following:
Evaluation items are linked to an enabling objective.
Evaluation items evaluate the topic of the objective task statement.
Evaluation item(s) have been created for each objective in the training program curriculum.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify the following per the approved training program:
Evaluation items are linked to an enabling objective.
Evaluation items evaluate the topic of the objective task statement.
Evaluation item(s) have been created for each objective in the training program curriculum.
2.3.2. Evaluation Items are Leveled to the Objective.
Evaluation items must be leveled to the attributes of the learning objective's condition(s), action, and standard(s). The action verb of the objective delineates the level of trainee mastery that is required. Action verbs used for skills or tasks should use performance-based evaluations, while objectives describing cognitive knowledge should utilize an evaluation instrument that validates trainee comprehension.
2.3.2.1. Cognitive Evaluation Item Leveling
Test item difficulty is leveled to the learning objective action. For example, an objective that states "describe the pump interlocks" should test the trainee's knowledge of the pump interlocks. It should not ask the trainee to evaluate integrated plant operations following scenario-based events pertaining to the pump interlocks (which would be "evaluate plant conditions").
For example, an objective that states describe the xxx pump interlocks could include a question describe the impact of a lowering level in the xx tank on the xx system operation. This requires the trainee to understand the interlock and be able to describe its impact on the pump operation.
An exam question that would be improperly leveled would be:
The plant was operating at 100% power when a loss of offsite power occurred. All emergency diesel generators started as designed. Ten minutes later, a loss of coolant accident occurred. All equipment responded as designed. Thirty minutes after the loss of coolant accident, assuming no operator action, what is the operating state of the xx pump?
This question could be related to the fact that the scenario results in multiple integrated system automatic actions that may or may not drain down the xx tank, which could result in automatic shutoff of the pump depending on system design. However, the trainee was not trained on integrated plant operations based on the objective. This question would be better aligned to the objective (from memory or using xx procedure), analyze xx system response to plant abnormal actions (IAW system design specifications).
Objective leveling should include a basis for leveling standards. Industry standards, such as Bloom's Taxonomy (see Appendix 5), may be referenced to establish effective objective leveling standards for consistent evaluation.
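As a notional illustration of leveling, the sketch below maps action verbs to cognitive levels and flags test items that demand a higher level than the objective trains; the verb-to-level mapping is hypothetical and would come from the licensee's chosen taxonomy (e.g., the one referenced in Appendix 5).

# Hypothetical verb-to-cognitive-level mapping (a licensee would derive its own
# from its chosen taxonomy, such as the one referenced in Appendix 5).
VERB_LEVEL = {
    "state": 1, "describe": 2, "explain": 2,
    "analyze": 4, "evaluate": 5,
}

def item_is_properly_leveled(objective_verb: str, item_verb: str) -> bool:
    """A test item should not demand a higher cognitive level than the objective trains."""
    return VERB_LEVEL[item_verb] <= VERB_LEVEL[objective_verb]

# The improperly leveled example above: a "describe" objective tested with an item
# that effectively requires the trainee to "evaluate" integrated plant response.
assert not item_is_properly_leveled("describe", "evaluate")
assert item_is_properly_leveled("describe", "state")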
2.3.2.2. Performance Evaluation Item Leveling
For performance-based learning objectives, the evaluation item must be matched to the objective. Appropriate methods of performance-based evaluation include in-plant task performance evaluation (TPE), lab, or simulator. Other methods may be used so long as the method utilized adequately evaluates trainee mastery of the objective.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.2 of this ISG for consistent performance of leveling evaluation items to the objective.
The staff should perform a review of program evaluation items to verify the following:
Evaluation items are leveled to the tied enabling objectives action verb.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify the following per the approved training program:
Evaluation items are leveled to the tied enabling objectives action verb.
2.3.3. Test Item Conditions and Standards Match the Learning Objective's Conditions and Standards
Test item conditions and standards match the learning objective's conditions and standards.
2.3.3.1. Cognitive Examples
For cognitive test items, if the "describe the pump interlocks" objective had the condition "from memory," the trainee is expected to produce the answer to the test item without the use of aids (procedures, drawings, etc.). If the "describe the pump interlocks" objective had the standard "in accordance with [pump design documentation]," the cognitive standard evaluated against is "without error," which is an implied standard for all objectives.
2.3.3.2. Performance Examples
For performance evaluation items, the performance objective "given a plant operating at 100% power and a control input failure, respond to a lowering steam generator level to prevent a reactor trip" would require a simulator scenario with the plant operating at 100% power and the trainee responding to a control input failure. The standard for successful test item completion would be to stabilize steam generator water level (SGWL) prior to a reactor trip.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.3 of this ISG for consistent performance of matching test item conditions and standards to the learning objective's conditions and standards.
The staff should perform a review of program evaluation items to verify the following:
Evaluation items match the objective's conditions and standards for trainee testing.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify the following per the approved training program:
Evaluation items match the objective's conditions and standards for trainee testing.
2.3.4. Test Item Construction is an Appropriate Method of Evaluation for the Objective
Test item construction must utilize an appropriate method of evaluation. Cognitive objectives can utilize different test methods (short answer, fill in the blank, essay, oral boards, multiple choice).
2.3.4.1. Multiple Choice is an Already Approved Standard
The use of multiple choice questions with 4-part distractors is considered an industry standard, accepted practice that may be used for cognitive based questions provided the other attributes are appropriately constructed. If multiple choice questions are used, distractors must be plausible enough to allow effective discrimination of trainee comprehension of the topic. Other methods are allowed provided the licensee provides a level of analysis/psychometrics to distinguish between trainees who master the objective and those who do not.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.4 of this ISG for consistent performance of constructing test items that appropriately evaluate the objective.
The staff should perform a review of program evaluation items to verify the following:
Evaluation item construction utilizes an appropriate method of evaluation. The use of multiple choice, 4 part distractors is acceptable; if other methods are utilized, the licensee shall submit additional analysis to justify the method of evaluation selected.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify the following per the approved training program:
Evaluation item construction utilizes an appropriate method of evaluation. The use of multiple choice, 4 part distractors is acceptable; if other methods are utilized, the licensee shall submit additional analysis to justify the method of evaluation selected.
2.3.5. Pass/Fail Criteria
Test items should include pass/fail criteria for effective, consistent trainee evaluation.
Pass/fail criteria should be written for objective measurement: for example, "successfully completes steps 1 - 3 of procedure xx to align system bypass flow," or "establishes flow controlled at a flow rate of xx gpm."
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.5 of this ISG for consistent performance of creating evaluation item pass/fail criteria.
The staff should perform a review of program evaluation items to verify each evaluation item includes pass/fail criteria for the objective.
For multiple choice exams, this can be clear identification of the correct answer.
For other methods of evaluating cognitive evaluation items, the answer must be clearly defined such that the instructor grading the exam item can provide consistent grading. This can be an answer key denoting point breakdown of key concepts for essay questions, drawing grading criteria (points per component) etc.
For performance evaluation items, the actions/steps required to achieve successful performance of the evaluation item must be clearly defined.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify each evaluation item includes pass/fail criteria for the objective per the approved training program:
For multiple choice exams, this can be clear identification of the correct answer.
For other methods of evaluating cognitive evaluation items, the answer must be clearly defined such that the instructor grading the exam item can provide consistent grading. This can be an answer key denoting point breakdown of key concepts for essay questions, drawing grading criteria (points per component) etc.
For performance evaluation items, the actions/steps required to achieve successful performance of the evaluation item must be clearly defined.
2.3.6. Evaluation Items Must be Plausible.
Exam items must be created in a way that effectively determines student mastery of the objective content. Exam items must be credible to the objective, and any alternatives presented to the trainee must be written so that they serve as distractors that may be chosen by a trainee who has not achieved mastery of the objective but are rejected by students who have mastered the objective content.
Reference Appendix 7 for examples of exam item plausibility.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.6 of this ISG for consistent performance of creating plausible evaluation items.
The staff should perform a review of program evaluation items to verify evaluation items are plausible.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify evaluation items are plausible per the approved training program.
2.3.7. Performance Evaluations Written for Individual Trainee Evaluation
Test items should be written for individual trainee evaluation. For example, lab activity performance objectives cannot have an evaluation item that states, "perform the activity as a group." The evaluation item must include specificity to allow for individual performance of the skill or task being evaluated.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.7 of this ISG for consistent performance of creating performance evaluations for individual trainee evaluation.
The staff should perform a review of program evaluation items to verify performance evaluation items are written to evaluate individual trainee performance.
Training Program Inspection:
The staff should perform a review of program evaluation items to verify performance evaluation items are written to evaluate individual trainee performance per the approved training program.
2.3.8. Test Item Creation Includes Review and Approval by Training and Line Supervision
Test item creation includes review and approval by training and line supervision.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the design process. The procedure should include:
Clear guidance meeting the guidance of section 2.3.8 of this ISG for consistent performance of review and approval of test item creation by training and line supervision.
The staff should perform a review of exam items; confirm they are reviewed and approved by line and training management.
Training Program Inspection:
The staff should perform a review of exam items; confirm they are reviewed and approved by line and training management per the approved training program.
Development
3. Development
The development phase of the Systems Approach to Training focuses on development of the training materials necessary to present the material to the trainees. Materials include classroom lesson plans, simulator training guides, laboratory guides, self-led training materials, and associated evaluation items and exams. Key aspects in the development process include standards and/or procedures for developing training materials and evaluation materials, including examination materials.
3.1. Training Material
3.1.1. Training Material Development Standards:
3.1.1.1. Training Material Development is Rooted in the Plant's SAT Analysis
Training materials are developed based on the plant design, as specified by the original job and task analysis and the corresponding KSAs, as analyzed and selected for training.
3.1.1.2. Training Material Content and Consistency
Training materials need to have the amount of detail and content necessary for trainees to successfully master the learning objective(s). Training materials contain enough detail to provide consistent delivery of the information needed to achieve trainee mastery of the learning objectives.
For example, if an objective states, "given that a pressurizer relief valve is leaking, and using a Mollier diagram, determine the relief valve discharge piping thermodynamic steam properties," the lesson plan must include the concepts relating to use of the Mollier diagram and examples sufficient to provide instructors clarity on the concepts to cover and examples of how to use the Mollier diagram to determine the status of the relief piping steam (superheated, saturated, etc.). Simply including a reference to the Mollier diagram with a statement such as "review situations with trainees" would be unacceptable.
3.1.1.3. The Method of Delivery Ensures Effective Objective Mastery
The method of delivering instructional content (e.g., classroom, lab, simulator, self-study) needs to be appropriate to ensure effective knowledge transfer.
For example, self-study of written guidance for tasks that will be evaluated in the lab or the plant, with no guided practice in the lab or the plant, is not an appropriate delivery method. However, use of self-study through augmented, simulated, or virtual reality training that allows interactive demonstration of the skill might be acceptable based on the difficulty and importance of the topic being trained.
The method of delivery is appropriate for mastery of the objectives at the cognitive level of the objective. For example, an objective requiring analysis of a system or circuit fault cannot be supported with material that only discusses the purpose and general function of the system or circuit; the content must incorporate learning at the higher cognitive level of the objective (i.e., how to analyze the system or circuit).
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.1.1 of this ISG for consistent performance of development standards.
The staff should perform a sample review of lesson plans to include the following:
Lesson plans developed are focused on the objectives from tasks selected for training.
The program KSAs selected for training, as defined in the task analysis, that are linked to each objective are covered in the lesson plan content.
The content in the lesson plans covers the lesson plan objectives and aligns with each objective's condition, action, and standard.
The lesson plan content adequately conveys the information required to learn the objective and covers the exam items that are tied to each objective in such a way to ensure a trainee can successfully pass the exam item if they master the objective.
The content of the lesson plan is sufficient to allow repeatable, consistent delivery of the training material from one instructor to another.
The method of delivery (e.g., classroom, lab, simulator, self-study, etc.) is appropriate for the trainee to master the content at the cognitive level of the objective.
Training Program Inspection:
The staff should perform a review of lesson plans to verify the following per the approved training program:
Revised or created lesson plans originated from the analysis phase.
The lesson plan content covers the lesson plan objectives at a level that matches each objective's condition, action, and standard.
The lesson plan content adequately conveys the information required to learn the objective and covers the exam items that are tied to each objective in such a way to ensure a trainee can successfully pass the exam item if they master the objective.
The content of the lesson plan is sufficient to allow repeatable, consistent delivery of the training material from one instructor to another.
The method of delivery (classroom, lab, simulator, self-study, etc.) is appropriate for the trainee to master the content.
The method of delivery for a lesson plan is developed at the cognitive level of the objective, with supporting information sufficient to adequately convey the objective at the cognitive level.
3.1.2. Training Material Content
Key aspects of training material development include the following:
3.1.2.1. Lesson Plan Content
Lesson plans include the following:
Learning objectives
Estimated duration
Content
Learning activities
References
Method of evaluation
Training equipment
Materials needed for training and guidance for their use
Revision and approval history
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.1.2 of this ISG for consistent performance of training material content requirements.
If available, validate that lesson plans include the items listed in 3.1.2.1.
Training Program Inspection:
The staff should validate that lesson plans include the items listed in 3.1.2.1 per the approved training program.
3.1.3. Training Material Review, Approval, and Accuracy:
Training materials are reviewed for accuracy and adequacy by line management and training management, as applicable, prior to instruction.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.1.3 of this ISG for consistent performance of reviewing and approving accurate training material.
Training Program Inspection:
The staff should review a sampling of training materials and validate that the material complies with section 3.1.3 of this ISG.
3.1.4. Curriculum Organization
3.1.4.1. Delivery Timeframe
The lesson plan development process includes establishing an estimated delivery timeframe for each course or training item (e.g., as defined in a task-to-training matrix in the design phase).
3.1.4.2. Curriculum Sequencing
The lesson plans within the curriculum should be grouped and sequenced according to topic and difficulty to facilitate an exam structure that allows adequate evaluation of the objectives, as defined in the exam development standards.
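The brief sketch below shows one hypothetical way to record the delivery timeframe and sequencing information for each lesson plan, for example as rows of a task-to-training matrix; none of the field names are prescribed.

from dataclasses import dataclass

@dataclass
class LessonPlanEntry:
    """One row of a notional task-to-training matrix."""
    lesson_id: str
    topic: str
    estimated_hours: float      # estimated delivery timeframe
    difficulty: int             # used with topic to group and sequence lessons
    exam_block: str             # exam on which the lesson's objectives are evaluated

curriculum = [
    LessonPlanEntry("SYS-101", "Heat removal system fundamentals", 4.0, 1, "Exam 1"),
    LessonPlanEntry("SYS-201", "Heat removal flow control failures", 6.0, 3, "Exam 2"),
]
# Sequence by difficulty within each exam block to support the exam structure.
curriculum.sort(key=lambda e: (e.exam_block, e.difficulty))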
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.1.4 of this ISG for consistent performance of curriculum organization.
The staff should perform a review of the following:
Lesson plans include an approximate delivery time for each course or training item.
Lesson plans are grouped and sequenced according to topic and difficulty to facilitate an exam structure that allows adequate evaluation of the objectives.
Training Program Inspection:
The staff should review newly developed or revised lesson plans for the following per the approved training program:
Lesson plans include an approximate delivery time for each course or training item.
Lesson plans are grouped and sequenced according to topic and difficulty to facilitate an exam structure that allows adequate evaluation of the objectives.
3.2. Exam Development Standards
Trainee examination of course objectives is conducted through evaluations utilizing the exam items created in the design phase. Exams include cognitive evaluations and performance evaluations. Refer to the Operator Licensing Program ISG for additional guidance for operator licensing examinations.
3.2.1. Cognitive Evaluations
Key aspects of cognitive exam development include:
3.2.1.1. Objective Sampling
Cognitive evaluations need to adequately sample the course objectives to ensure trainee mastery of the course content requisite knowledge. Lesson plan sequencing and course curriculum, the number of objectives, and objective difficulty should be considered when developing exams. A sufficient number of test items from each lesson plan's learning objectives must be included on the exam to adequately assess student comprehension and mastery of the content.
For objectives that cover multiple KSAs, the exam should sufficiently sample the KSAs tied to the objective. The number of KSAs tied to the objective and each KSA's importance to risk and safety should be considered when choosing KSAs for exam development.
For exams covering only a few lessons, effort should be made to include most, if not all, learning objectives in the evaluation. For exams that cover multiple lessons, it can be appropriate to include learning objectives that will adequately cover the content and confirm overall mastery of the material.
Consideration should be taken to include higher cognitive learning objectives that build on the understanding of lower cognitive learning objectives. For example, if an exam includes a question on system operation and interlocks, it may be reasonable to omit an exam question on a lower cognitive objective such as "state the purpose of the system."
3.2.1.2. Multiple Exams are Created With >40% Differing Questions.
If different versions of the same exam are given within a short period, the questions on each version of the written exam should differ by at least 40%. For example, if continuing training sessions are evaluating trainees in groups over several weeks, then at least two different versions of a written exam should be created and administered to the trainees, with >40% different questions on each exam.
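A simple, hypothetical check of the >40 percent criterion between two exam versions is sketched below; it compares question identifiers verbatim, and a licensee's actual process may differ.

from typing import List

def fraction_differing(version_a: List[str], version_b: List[str]) -> float:
    """Fraction of version B's questions that do not appear on version A."""
    repeats = sum(1 for q in version_b if q in set(version_a))
    return 1.0 - repeats / len(version_b)

def versions_acceptable(version_a: List[str], version_b: List[str]) -> bool:
    # Acceptable when more than 40% of the questions differ between versions.
    return fraction_differing(version_a, version_b) > 0.40

# Hypothetical usage: 3 of 5 questions differ (60%), so the versions are acceptable.
exam_a = ["Q1", "Q2", "Q3", "Q4", "Q5"]
exam_b = ["Q1", "Q2", "Q6", "Q7", "Q8"]
assert versions_acceptable(exam_a, exam_b)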
3.2.1.3. An Exam Question Selection Process Exists.
An exam question selection process exists. Exams that include manually selected questions (i.e., questions individually selected for the exam by the training staff and not randomly selected by an exam computer program) should be reviewed to ensure exam questions are not duplicated.
Exam questions are reviewed for independence from each other, to ensure one test item does not aid in answering another.
3.2.1.4. Clear Pass/Fail Standards Exist.
For written exams, the use of 80% as a minimum passing score is considered an industry standard, accepted practice. A passing score below 80% may be allowed if sufficient justification and analysis are provided for the basis of the chosen pass rate.
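The pass/fail arithmetic is straightforward; the sketch below simply applies the 80 percent minimum as the default cutoff (any lower cutoff would require the justification and analysis described above).

def exam_passed(points_earned: float, points_possible: float,
                minimum_passing_fraction: float = 0.80) -> bool:
    """Apply the written-exam passing criterion (default 80% minimum score)."""
    return (points_earned / points_possible) >= minimum_passing_fraction

# Hypothetical usage
assert exam_passed(41, 50)          # 82% passes
assert not exam_passed(39, 50)      # 78% fails at the 80% standard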
3.2.1.5. Clear Grading Methods Exist.
For most test items, like multiple choice exams, the correct answers are clearly identified during test item creation in the design phase. Other cognitive evaluation items, such as oral boards or walkthroughs, require a grading methodology sufficient to ensure consistent, defendable, and quantifiable criteria exist.
For example, oral boards should include the range of questions available for student evaluation, with guidance for consistent selection of the questions, and guidance to grade trainee mastery of the objective using a clear grading scale. Whenever possible, oral boards should be administered by a panel of graders instead of just a single grader, to help ensure accurate grading.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.2.1 of this ISG for consistent performance of standards for cognitive evaluations.
The staff should perform a review of developed exams against the following criteria:
Objective sampling is adequate.
Repeat versions of an exam differ by at least 40% of their questions.
A process exists for exam question selection.
Developed exam questions are independent from each other, so that no one test item aids in answering another.
A passing score is provided. The use of 80% as a minimum passing score is considered acceptable without further analysis. Other passing score criteria may be considered; however, the use of a minimum passing score less than 80% requires sufficient justification and analysis for the basis of the chosen pass rate.
Exams include grading criteria.
o For multiple choice exams, this requires an answer key denoting the correct answer;
o Other cognitive evaluation items require a grading methodology described in the answer key with enough clarity to ensure consistent, defendable, and quantifiable grading criteria exist.
Training Program Inspection:
The staff should perform a review of recently developed exams against the training program approval criteria per the approved training program.
3.2.2. Performance Evaluations Performance evaluations include laboratory, simulator, job performance measures (JPM) or task performance evaluations (TPE) requiring hands-on or simulated performance by the trainee to demonstrate mastery. Performance objectives derived from tasks or skills selected for training are evaluated with performance evaluations as identified in the design phase. Performance evaluations may be conducted by instructor staff or line personnel who are qualified to conduct the evaluation method selected.
Key aspects to performance evaluations include:
3.2.2.1.
Individual Performance and Evaluation of Performance Objectives The evaluation process must include individual performance and evaluation of performance objectives as defined by the job and task analysis. For example, if a job and task analysis identified the task respond to steam generator level controller failures, and that task was selected for initial training, then each trainee in the program should demonstrate performance of the task using the evaluation instrument selected for the task, as defined in the design phase. Note that this does not preclude the ability to perform crew scenarios or crew evaluations for licensed operator training of integrated plant operations.
3.2.2.2.
Clear Pass/Fail Standards to Allow Consistent Evaluation Clear examination pass/fail standards exist to allow consistent evaluation by different instructors. The criteria for passing need to include the trainee being able to successfully perform the task in accordance with the conditions, standards, and actions in the associated learning objective(s).
3.2.2.3.
Guidance Provides Reproducible Consistency Between Evaluators Evaluation materials include detailed guidance sufficient to ensure consistency between evaluators (e.g., initiating conditions, failures, cues, and the equipment or resources provided are specified).
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.2.2 of this ISG for consistent performance of standards for performance evaluations.
The staff should perform a review of performance evaluations against the following criteria:
The performance evaluation is written for evaluation of the individual trainee.
There are clear pass/fail standards written into the performance evaluation to allow consistent evaluation by different instructors.
Enough guidance exists in the performance evaluation to provide reproducible consistency between evaluators, such as initiating conditions, failures, cues, equipment, or resources provided.
Training Program Inspection:
The staff should perform a review of recently developed performance evaluations against the training program approval criteria per the approved training program.
3.2.3. Evaluation Item Review, Validation, and Approval Both cognitive and performance evaluation instruments should be reviewed, validated, and approved prior to implementation. Approvals should be documented and retained. The use of subject matter experts ensures that the evaluation item is consistent with job performance requirements and that the materials are technically accurate.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the development process. The procedure should include:
Clear guidance meeting the guidance of section 3.2.3 of this ISG for consistent performance of standards for evaluation item review, validation, and approval.
The staff should perform a review of the following:
The cognitive and performance evaluation items are reviewed, validated, and approved prior to implementation.
A process exists to document and record reviews, validation, and approvals for historical records.
Training Program Inspection:
The staff should perform a review of recently developed cognitive and performance evaluations against the training program approval criteria per the approved training program.
IMPLEMENTATION
- 4. Implementation
The implementation phase of the Systems Approach to Training process involves delivery of the training and evaluation materials created in the development phase. Successful performance of the implementation phase requires the training instructors to promote trainee mastery of the learning objectives and to ensure a transfer of student knowledge and skills from the instructional setting to the job. Implementation of training involves preparation and scheduling, delivery of training, exam administration and remediation, and collection of post-training feedback.
4.1. Preparation and Scheduling During Implementation, training materials created in the development phase are ready to be presented to the target audience, as defined in the analysis phase.
4.1.1. Fixed vs Flexible Training Scheduling The training material can be content created for the initial training of new personnel entering the program, continuing training on tasks selected through the difficulty, importance, and frequency (DIF) analysis, or new continuing training material created in response to a training need identified through a training needs analysis in an ongoing training program, such as the introduction of new equipment via plant design changes.
4.1.2. Schedules Approved by Training and Line For all training to be conducted, the course offering must be scheduled and prepared for by the training organization. Schedules for initial and continuing training are approved by line and training management. The content of the scheduling shall match the purpose of the original analysis of the training need, as identified in the analysis phase.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting section 4.1.1 of this ISG for consistent performance of preparation and scheduling of training.
The staff should perform a review of the initial training program schedule for the following:
Initial training schedules are developed around the approved training curriculum from the analysis and design phases.
All tasks selected for initial training are trained in the training schedule.
The training schedule is approved by line and training management.
The staff should perform a review of the continuing training program schedule for the following:
Training schedules are developed around the approved training curriculum from the analysis and design phases.
All tasks selected for continuing training are trained in the training schedule.
Flexible training components in the training schedule are supported with approved training needs analysis that document the basis for the training.
The training schedule is approved by line and training management.
Training Program Inspection:
The staff should perform a review of training program scheduling per the approved training program:
Validate the procedure complies with the guidance of section 4.1.1 of this ISG.
The staff should perform a review of the initial training program schedule for the following:
Initial training schedules are developed around the approved training curriculum from the analysis and design phases.
All tasks selected for initial training are trained in the training schedule.
The training schedule is approved by line and training management.
The staff should perform a review of the continuing training program schedule for the following:
Training schedules are developed around the approved training curriculum from the analysis and design phases.
All tasks selected for continuing training are on the training schedule.
Flexible training components in the training schedule are supported with approved training needs analysis that document the basis for the training.
The training schedule is approved by line and training management.
4.2. Delivery of Training 4.2.1. Instructors are Trained and Qualified Instructors who deliver formal training are expected to be proficient in the methods and techniques for successful presentation in the particular training setting they are using.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.2.1 of this ISG for consistent guidance on instructor training, qualifications, and experience.
The staff should verify that the licensee has a method, such as an instructor training qualification program, to ensure that instructors are proficient in the methods and techniques for successful presentation in the setting they are using.
Training Program Inspection:
The staff should perform a review of training program delivery per the approved training program:
Validate the procedure complies with the guidance of section 4.2.1 of this ISG.
The staff should perform a review of training records and validate that the instructors who taught the class were qualified to teach in the setting they were in.
4.2.2. Deliver Effective Training
Instructors should be prepared, as evidenced by their performance in the classroom using questioning skills, coaching skills, and learning techniques for optimal trainee mastery of the material. Effective delivery of training includes consistent adherence to the approved lesson plan material and reinforcing line standards where appropriate.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.2.2 of this ISG for consistent performance of delivering effective training.
Training Program Inspection:
The staff should perform a review of training effectiveness per the approved training program:
Validate the procedure complies with the guidance of section 4.2.2 of this ISG.
Perform observations of training delivery; validate that the instructor delivers effective training per the approved training program.
4.3. Exam Administration and Remediation 4.3.1. Exam Administration Standards - All Formal Training Requires Evaluation The examination process is used to verify trainee comprehension of the topics and mastery of the learning objectives. Examinations can be cognitive or performance based, as defined by the objective and the test items created in the design phase and collected into an examination in the development phase.
All formal training, derived from the analysis, design, and development phases, requires trainee evaluation. Successful execution of training examinations includes exam security standards, examination administration and grading, and remediation.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.3.1 of this ISG for consistent performance of exam administration standards.
The staff should review the approved training schedule produced from section 4.1.2.
Validate that the topics being trained have trainee evaluations after the training session, as appropriate.
Training Program Inspection:
The staff should perform a review of exam administration standards per the approved training program:
Validate the procedure complies with the guidance of section 4.3.1 of this ISG.
Review the approved training schedule produced from section 4.1.2. Validate that the topics being trained have trainee evaluations after the training session, as appropriate.
4.3.2. Exam Security Standards Examination administration requires establishing and implementing a standard that ensures test integrity. The examination standard shall include provisions sufficient to ensure that exams are not compromised during exam development, exam administration, or post-exam activities. NUREG-1021 provides one acceptable method for exam security.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.3.2 of this ISG for consistent performance of exam security standards.
Training Program Inspection:
The staff should perform a review of exam security standards per the approved training program:
Validate the procedure complies with the guidance of section 4.3.2 of this ISG.
The staff should perform observations (if available) of exam security practices conducted by the licensee for training activities currently in session. Validate the practices comply with the approved procedure.
4.3.3. Exam Administration During performance of an examination, the proctor is required to follow the exam security standards and administer the exam in accordance with the exam instructions.
Cognitive and performance-based examinations are to be conducted in a manner that ensures the trainee is effectively evaluated against the minimum standards of the exam. If references are allowed during an exam, as identified in the design phase through review of the objective condition (e.g., using a calculator, from memory, or using procedure XYZ), only the appropriate references, as identified by the objective condition, are made available to the trainee. References provided to the trainee need to be free of any markings or notes that deviate from the original version of the reference, unless clearly specified in the objective condition per the design phase.
For example, if an objective trains on a procedure starting midway through the steps, then the design of the objective condition might be stated as, using xxx procedure starting at xxx step. In such a case, it would be appropriate for the procedure to be marked up to the starting point of the objective for the training and subsequent exam.
However, the procedure marking should not include notes or cues written on the procedure during the training session that would not be provided in the plant if the trainee was performing those steps with a new copy of the procedure.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.3.3 of this ISG for consistent performance of exam administration standards.
The staff should review an approved training exam for the following:
The exam package includes instructor and student instructions on how to administer the exam.
The exam includes the appropriate references according to the objectives being examined.
Training Program Inspection:
The staff should perform a review of exam administration practices per the approved training program:
Validate the procedure complies with the guidance of section 4.3.3 of this ISG.
The staff should perform observations (if available) of exam administration practices conducted by the licensee for training activities currently in session. Validate the practices comply with the approved procedural guidance related to section 4.3.3 of this ISG.
4.3.4. Exam Process to Include Test Review with Trainees Once trainees complete the exam, the exam is graded in accordance with the exam key, and an exam review is performed to review missed questions and close knowledge gaps. Key concepts can be reviewed as necessary to ensure trainee comprehension.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.3.4 of this ISG for consistent performance of exam review with trainees.
Training Program Inspection:
The staff should perform a review of exam review practices per the approved training program:
Validate the procedure complies with the guidance of section 4.3.4 of this ISG.
The staff should perform observations (if available) of exam administration practices conducted by the licensee for training activities currently in session. Validate the test review practices comply with the approved procedural guidance related to section 4.3.4 of this ISG.
4.3.5. Exam Remediation Process
Trainees who achieve a score less than the minimum passing criteria need to be remediated if they will remain in the training program. The implementation procedure needs to include a process for consistent trainee remediation. Line personnel attending continuing training who fail an exam covering tasks they are qualified to perform should be removed from operational duties until successfully remediated.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the Implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.3.5 of this ISG for consistent performance of exam remediation process.
Training Program Inspection:
The staff should perform a review of exam remediation practices per the approved training program:
Validate the procedure complies with the guidance of section 4.3.5 of this ISG.
The staff should review recent exam remediation paperwork conducted by the licensee for the following:
o The individual's affected qualifications were removed.
o The individual did not perform unqualified work.
4.3.6. Exam Remediation Standards Remediation includes, at a minimum, reviewing the missed content of the exam for knowledge gaps, studying, and reattempting a new examination. Additional review or practice of the objective content might be necessary based on the trainee's grasp of the objective concepts. Once the instructor and line agree that the trainee is ready for reexamination, another exam is administered to validate trainee mastery of the missed learning objectives. At a minimum, the remediation exam must retest the concepts missed by the trainee on the original exam through new exam items covering those objectives. Additional learning objectives may be re-evaluated on the remediation exam; however, a maximum of 30% of the test items may overlap with the original exam.
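As an illustration of the remediation exam standards above (a hypothetical sketch, not part of this guidance), the 30% overlap limit and coverage of missed objectives through new items could be checked as follows; all exam item and objective identifiers are assumed.

```python
# Illustrative sketch only: verifying that a remediation exam retests the missed
# objectives with new items and that no more than 30% of its items overlap the
# original exam. Identifiers are hypothetical.

def check_remediation_exam(original_items: set[str], remediation_items: set[str],
                           missed_objectives: set[str],
                           objectives_covered: dict[str, str]) -> None:
    """objectives_covered maps each remediation item to the objective it tests."""
    overlap = len(original_items & remediation_items) / len(remediation_items)
    assert overlap <= 0.30, f"Item overlap {overlap:.0%} exceeds the 30% maximum"

    retested = {objectives_covered[item] for item in remediation_items - original_items}
    assert missed_objectives <= retested, "Missed objectives must be retested with new items"

check_remediation_exam(
    original_items={"Q1", "Q2", "Q3", "Q4"},
    remediation_items={"Q2", "Q5", "Q6", "Q7"},
    missed_objectives={"LO-3"},
    objectives_covered={"Q2": "LO-1", "Q5": "LO-3", "Q6": "LO-3", "Q7": "LO-4"},
)
print("Remediation exam meets overlap and coverage checks")
```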
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.3.6 of this ISG for consistent performance of exam remediation standards.
Training Program Inspection:
The staff should perform a review of exam remediation standards per the approved training program:
Validate the procedure complies with the guidance of section 4.3.6 of this ISG.
Review recent exam remediation paperwork conducted by the licensee for the following:
o The individual's remediation package covered gaps identified on the first examination.
o The remediation exam covered the missed learning objectives through new examination items.
o The overlap of test items between the failed exam and the remediation exam did not exceed 30%.
o Line and training personnel engaged in the remediation process.
4.4. Post Training Activities Following examination activities, the final process of the implementation phase includes collecting feedback on the training and documenting the training occurrence.
4.4.1. Student Feedback Solicited Post Training for Evaluation Feedback on the training should be obtained from trainees to gauge the immediate effectiveness of the training. The material collected is for course evaluation in the evaluation phase of the SAT process.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.4.1 of this ISG for consistent performance of soliciting feedback after training sessions.
Training Program Inspection:
The staff should perform a review of the trainee feedback process per the approved training program:
Validate the procedure complies with the guidance of section 4.4.1 of this ISG.
Validate that training feedback was collected on course offerings and retained for evaluation.
4.4.2. Document the Training Occurrence 4.4.2.1.
Update Training Records and Qualifications The final activity in the Implementation phase is to document completion of training. The documentation should include, as applicable, the lesson plans or curriculum completed, the date training was completed, and names of trainees who completed the training.
Qualification records should be updated, as applicable. This information is retained.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the implementation process. The procedure should include:
Clear guidance meeting the guidance of section 4.4.2 of this ISG for consistent performance of documenting the training occurrence.
Training Program Inspection:
The staff should perform a review of training documentation process per the approved training program:
Validate the procedure complies with the guidance of section 4.4.2 of this ISG.
Validate that, for a training session, the training records show complete and accurate documentation of the course offering, the lesson plans or curriculum completed, the date training was completed, the names of trainees who completed the training, and the status of qualification records following training.
Evaluation Phase
- 5. Evaluation The purpose of the evaluation phase of the Systems Approach to Training is to determine the effectiveness and efficiency of an instructional program. Evaluation is the feedback component of the performance-based training model. The evaluation phase activities are evaluation intake, information assessment, and initiate corrective actions. The outcome of the evaluation phase is the initiation of necessary actions to improve gaps identified in the training program.
5.1. Evaluation Intake During the evaluation phase, training and line personnel should be aware of conditions and events that indicate training effectiveness. Data collection and review should be continuous to ensure the currency and adequacy of the training program and to sustain program effectiveness in line performance.
5.1.1. Collect and Analyze Incumbent and Management Feedback Training evaluation consists of receiving and analyzing feedback on the effectiveness of the training program. There are multiple methods of receiving feedback, including from trainees in the program, management observations of the program, analyzing exam results, and assessing the effectiveness of the training program by evaluating on-the-job performance of personnel who completed the training program.
5.1.1.1.
Training Feedback Analysis Training personnel should collect trainee feedback as part of the Implementation phase. Effective evaluation of the training program includes
reviewing the trainee feedback and initiating actions, when necessary, to improve the training program curriculum for future course offerings.
5.1.1.2.
Management Observations of Training Licensee management should observe training delivery on a routine basis and provide feedback on the adequacy of the training. Effective evaluation of the training program includes reviewing the feedback for potential improvements.
5.1.1.3.
Exam Item Analysis Following exam administration, an exam item analysis should be conducted to review the effectiveness of the exam. Exam item analysis should include review of high-miss questions (e.g., >50% miss rate) to ensure that the topic was adequately addressed by the training curriculum, the question was leveled to the objective, and the exam question is not flawed. Appropriate actions should be taken to update exam material and trainee results when necessary.
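For illustration only (a hypothetical sketch; exam data and item identifiers are assumed), per-item miss rates could be computed and high-miss questions flagged for review as follows.

```python
# Illustrative sketch only: computing per-item miss rates after exam
# administration and flagging high-miss questions (>50%) for curriculum,
# objective-level, and item-quality review. Data are hypothetical.

def high_miss_items(results: dict[str, list[bool]], threshold: float = 0.50) -> dict[str, float]:
    """results maps an item ID to per-trainee correct/incorrect flags."""
    flagged = {}
    for item, answers in results.items():
        miss_rate = answers.count(False) / len(answers)
        if miss_rate > threshold:
            flagged[item] = miss_rate
    return flagged

exam_results = {
    "Q1": [True, True, False, True, True],
    "Q2": [False, False, True, False, False],  # 80% miss rate
}
print(high_miss_items(exam_results))  # {'Q2': 0.8}
```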
5.1.1.4.
Post Training Performance Review When a trainee completes the initial training program, an on-the-job performance review is conducted to assess the effectiveness of the training program to prepare the trainee for the job. Feedback from the trainee, job incumbents, and management is included in the evaluation to assess how well the training program prepared the trainee for independent job performance.
This information is typically collected six months to a year following course graduation, or as specified by the licensee.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation intake process. The procedure should include:
Clear guidance meeting the guidance of section 5.1.1 of this ISG for consistent performance of analyzing incumbent and management feedback.
Training Program Inspection:
The staff should perform a review of the trainee feedback process per the approved training program:
Validate the procedure complies with the guidance of section 5.1.1 of this ISG.
Perform a review of training feedback analyses, management observations of training, exam item analyses, and post-training performance review assessments recently completed by the licensee. Validate that the forms were completed in accordance with site procedures and this ISG.
5.1.2. Facility Issues and Events Facility events should be evaluated for potential training program impact. Human errors, equipment damage or unavailability, or rework could be an indicator that the associated training program did not adequately train (or improperly trained) a task, or did not identify a task that needs to be trained. For significant events, as defined by the facility's corrective action program, the licensee shall evaluate whether training was a causal factor.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation intake process. The procedure should include:
Clear guidance meeting the guidance of section 5.1.2 of this ISG for consistent performance of analyzing facility issues and events.
Training Program Inspection:
The staff should perform a review of the facility issue and event evaluation process per the approved training program:
Validate the procedure complies with the guidance of section 5.1.2 of this ISG.
Perform a review of facility issue and event assessments recently completed by the licensee. Validate that the events were identified and entered into the evaluation process in accordance with site procedures and this ISG.
5.1.3. Inspection/Assessment/Evaluation Reports Any report, assessment, or corrective action program document that explicitly mentions a training weakness needs to be evaluated. Facility reports from causal evaluation, internal and external inspections and evaluations, and routine program assessments should be reviewed for potential training related weaknesses or recommendations.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation intake process. The procedure should include:
Clear guidance meeting the guidance of section 5.1.3 of this ISG for consistent performance of analyzing inspection, assessment, and evaluation reports.
Training Program Inspection:
The staff should perform a review of the inspection, assessment, and evaluation report review process per the approved training program:
Validate the procedure complies with the guidance of section 5.1.3 of this ISG.
Perform a review of facility inspections, assessments, and evaluations recently completed by the licensee. Validate that the reports were identified and entered into the evaluation process in accordance with site procedures and this ISG.
5.1.4. Facility Modifications and Procedure Changes Facility design changes, modifications, or procedure changes that impact equipment operation, maintenance, or user interface could alter the original information assessed in the program job and task analysis and the associated knowledge, skills, and abilities that the training curriculum was designed to address; consequently, any design changes to the facility must be reviewed for potential impact to the training program.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation intake process. The procedure should include:
Clear guidance meeting the guidance of section 5.1.4 of this ISG for consistent performance of analyzing facility modifications and procedure changes.
Training Program Inspection:
The staff should perform a review of the facility modification and procedure change review process per the approved training program:
Validate the procedure complies with the guidance of section 5.1.4 of this ISG.
Perform a review of facility modifications and procedure changes recently completed by the licensee. Validate that the changes were identified and entered into the evaluation process in accordance with site procedures and this ISG.
5.1.5. Industry Regulatory and Operating Experience Regulatory changes and industry operating experience shall be reviewed for applicability and possible incorporation into the associated training programs.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation intake process. The procedure should include:
Clear guidance meeting section 5.1.5 of this ISG for consistent performance of analyzing industry regulatory and operating experience.
Training Program Inspection:
The staff should perform a review of the industry regulatory and operating experience review process per the approved training program:
Validate the procedure complies with the guidance of section 5.1.5 of this ISG.
Perform a review of industry regulatory and operating experience reviews recently completed by the licensee. Validate that the changes were identified and entered into the evaluation process in accordance with site procedures and this ISG.
5.2. Assess Information Program evaluation information needs to be assessed before it can be used to make changes in training. The information collected through evaluation intake can be reviewed through various methods, so long as sufficient pertinent information is collected to justify the solution.
5.2.1. Assessing the Approved Training Program Effectiveness
Information should be reviewed for objective, factual data related to the training program's performance in training and evaluating the analyzed knowledge, skills, and abilities associated with the tasks selected for training, or with training needs identified in the analysis phase. Data can be identified through any intake method, such as trainee feedback, line performance, or assessments.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation assessment process. The procedure should include:
Clear guidance meeting the guidance of section 5.2.1 of this ISG for consistent performance of assessing the approved training program effectiveness.
Training Program Inspection:
The staff should perform a review of the training program effectiveness assessment process per the approved training program:
Validate the procedure complies with the guidance of section 5.2.1 of this ISG.
Perform a review of the information collected by the licensee from section 5.1 of this ISG and the subsequent evaluation. Review for the following:
o Did the licensee appropriately identify information or feedback that might impact the current approved program?
o Did the licensee appropriately evaluate the information to identify what task, KSA, or other aspect of the training program is potentially impacted by the data?
o Did the licensee appropriately disposition the data to initiate corrective actions, as necessary?
5.2.2. Assessing the Approved Training Program Scope Information can also be reviewed for potential changes in training program job requirements (tasks or knowledge, skills, and abilities) that are not currently selected for training but have been flagged as affected by a change or as potentially contributing to an event. Data can be identified through any intake method, such as trainee feedback, facility modifications and procedure changes, industry event reports, or inspection results.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation assessment process. The procedure should include:
Clear guidance meeting the guidance of section 5.2.2 of this ISG for consistent performance of assessing the approved training program scope.
Training Program Inspection:
The staff should perform a review of the training program scope assessment process per the approved training program:
Validate the procedure complies with the guidance of section 5.2.2 of this ISG.
Perform a review of the information collected by the licensee from section 5.1 of this ISG and the subsequent evaluation. Review for the following:
o Did the licensee appropriately identify information or feedback that might indicate an issue that is outside of the current approved program scope?
o Did the licensee appropriately evaluate the information to identify which tasks and associated KSAs, if any, in the training program but not selected for training are potentially impacted by the data?
o Did the licensee appropriately identify whether the event is outside the scope of the current training program?
o Did the licensee appropriately disposition the data to initiate corrective actions, as necessary?
5.3. Initiate Corrective Actions 5.3.1. Appropriate Actions are Taken to Improve the Training Program If the information assessment identifies an item or event that affects the training program, actions commensurate with the significance of the issue are identified.
5.3.1.1.
Actions That Initiate Training Needs Analysis Any recommendation to modify the analysis, design, or development phases needs to enter the training needs analysis process. Examples of this include:
(1) Analysis - recommended changes that impact the job or task analysis process, including task list, DIF data, tasks selected for training, or KSAs.
(2) Design - modification of objectives, training program requirements, or curriculum.
(3) Development - requests for development of any training materials.
5.3.1.2.
Actions That Do Not Initiate Training Needs Analysis Actions to enhance the training program without affecting the core SAT processes can be made from the evaluation phase without initiating a training needs analysis.
Examples of this include:
(1) Analysis - correction of typographical errors.
(2) Design - creation of additional exam items or updating/correcting an exam item based on exam item feedback.
(3) Development - modification or correction of developed training material that does not affect the purpose of the material or the ability of the material to cover the objective. This includes typographical corrections, minor enhancements to improve clarity and knowledge transfer (such as adding pictures, examples, practice problems, etc.). Note that modification of
developed training material still requires material review and approval through the development process, but in this instance it would not need a needs analysis prior to implementing the changes.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of initiating corrective actions. The procedure should include:
Clear guidance meeting the guidance of section 5.3.1 of this ISG for consistent performance of initiating actions to improve the training program.
Training Program Inspection:
The staff should perform a review of initiating corrective actions per the approved training program:
Validate the procedure complies with the guidance of section 5.3.1 of this ISG.
Perform a review of the information collected by the licensee from section 5.1 of this ISG and the subsequent evaluation in section 5.2. Review for the following:
Did the licensee appropriately determine if training program material required changes based on the evaluation conducted?
Given the scope of the information and the evaluation, did the licensee enter the needs analysis process when required?
5.3.2. Performance Gaps Produce Training Effectiveness Metrics When training is identified as a solution to close a performance gap, training effectiveness shall be measured through an effectiveness review commensurate with the performance gap. When practicable, the licensee shall identify quantifiable, measurable metrics for monitoring training effectiveness. For example, if training is identified to aid in reduction of equipment out of service time, an effectiveness measure could be a xx% reduction in equipment out of service time, as measured over the next xx months. If the performance gap is identified only through qualitative management observation, then an appropriate effectiveness review could be further qualitative management observation or measurable surveys of workers. Failure to meet the effectiveness measure should result in re-evaluating the performance gap and initiating follow-up actions to address the issue.
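As an illustration of a quantifiable effectiveness measure (a hypothetical sketch using assumed data and an assumed target in place of the xx values above), an out-of-service-time reduction target could be evaluated as follows.

```python
# Illustrative sketch only: evaluating a quantifiable training effectiveness
# measure, here a targeted percent reduction in equipment out-of-service hours
# over the monitoring period. Target and data are hypothetical placeholders.

def meets_effectiveness_target(baseline_hours: float, post_training_hours: float,
                               target_reduction: float) -> bool:
    """Return True if the observed reduction meets or exceeds the target."""
    actual_reduction = (baseline_hours - post_training_hours) / baseline_hours
    print(f"Observed reduction: {actual_reduction:.0%} (target {target_reduction:.0%})")
    return actual_reduction >= target_reduction

# Hypothetical example: 20% reduction target over the monitoring period.
if not meets_effectiveness_target(baseline_hours=400, post_training_hours=300,
                                  target_reduction=0.20):
    print("Re-evaluate the performance gap and initiate follow-up actions")
```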
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of initiating corrective actions. The procedure should include:
Clear guidance meeting the guidance of section 5.3.2 of this ISG for consistent performance of evaluating performance gaps through training effectiveness metrics.
Training Program Inspection:
The staff should perform a review of initiating corrective actions per the approved training program:
Validate the procedure complies with the guidance of section 5.3.2 of this ISG.
Perform a review of the information collected by the licensee from section 5.1 of this ISG and the subsequent evaluation in section 5.2. Review for the following:
o Did the licensee appropriately identify performance gaps in station personnel?
o If the licensee evaluated the gap and determined that training could be a potential solution, did the licensee enter the training needs analysis process?
o In the evaluation of the performance gap, did the licensee identify, through quantifiable, measurable metrics, the performance gap and an effectiveness measure to apply to the requested training?
5.3.3. Training Evaluation Documentation and Approval Training evaluation actions, including assessment, disposition and actions created, should be reviewed, approved by line and training management, documented and retained.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has a procedure documenting the systematic performance of the evaluation process. The procedure should include:
Clear guidance meeting the guidance of section 5.3.3 of this ISG for consistent performance of training evaluation documentation and approval.
Training Program Inspection:
The staff should perform a review of training evaluation documentation and approvals per the approved training program:
Validate that the material reviewed in sections 5.1 and 5.2 of this ISG was reviewed and approved by line and training management and documented.
5.4. Conclusion The evaluation phase includes continual monitoring of program health. The outcome of the evaluation phase is closure of follow-up training-related actions, such as training feedback and exam item analysis, actions created to initiate training needs or job or task analysis, or closure of training effectiveness metrics to validate training effectiveness.
B.
Facility Training Programs In development of a facility training program for an advanced reactor design, the following guidance is provided, including for review of submittals using a specific operator licensing
approach that includes the roles and requirements of reactor operators and senior reactor operators:
- 1. Program
Description:
1.1. General Requirements The applicant should provide descriptions of the training program, if applicable. The training program description should contain the following elements:
- a. The purpose of the program
- b. The job positions that are credited towards each role, as defined by the job and task analysis.
- c. The training organization teaching the course or supervising instruction of the course material.
- d. The qualification requirements of the training staff personnel.
- e. The course curriculum 1.2. Licensed Operator Programs The applicant should provide the following additional information for licensed operator training programs:
- a. The course curriculum and scheduling for each course required to achieve a license (RO and SRO), as identified in the SAT analysis.
- b. A chart showing the proposed schedule for licensing personnel prior to criticality.
The schedule should be relative to the expected fuel load date and should also display the preoperational test period.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has documented the program description of the associated facility training program. The description should include:
Clear guidance meeting section B.1.1 of this ISG that contains the elements described.
For licensed operator programs, the program description meets the additional requirements of section B.1.2 of this ISG.
Training Program Inspection:
The staff should perform a review of the training program description to validate that it meets the requirements of section B.1.
- 2. Program Eligibility 2.1. General Requirements The applicant should provide descriptions of the eligibility requirements for each position in the training program.
2.2. Licensee Operator Requirements:
2.2.1. Procedures The licensee shall produce a list of procedures governing the eligibility requirements for licensed operator training programs for each position (RO and SRO).
2.2.2. Educational and Experience Requirements The application should describe the educational and experience requirements for licensing reactor operators and senior reactor operators, as identified in their SAT analysis. The application should describe any additional educational or experience requirements for positions within the control room team, such as shift manager or shift technical advisor.
The education and experience requirements should be based on endorsed guidance, or a justification should be provided. Any justification should provide a basis for the requirements.
While the licensee and the training program try to anticipate and address all conceivable situations which could arise, there will always be the potential for situations to arise which are not covered through training or operating procedures. Therefore, the licensee should have some personnel on shift who have sufficient understanding of the systems-level performance of a nuclear power plant, typically provided in an educational setting. It is also desirable to have senior operators on shift who have progressed through the typical experience path including the auxiliary operator and reactor operator positions.
Ideally, licensees should strive to have in the control room individuals with a mix of education, training, and experience in plant operations. The application should describe and justify how its education and experience requirements meet these criteria.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has documented the program requirements of the associated facility training program. The requirements should include:
Clear guidance meeting section B.2.1 of this ISG that contains the elements described.
For licensed operator programs, the program description meets the requirements of section B.2.2 of this ISG, to include procedures governing the eligibility requirements and the educational and experience requirements.
Training Program Inspection:
The staff should perform a review of the training program description to validate that it meets the requirements of section B.2.
- 3. Initial Training Programs 3.1. General Requirements Training program design should provide enough specificity to link the SAT design into the development phase, with a curriculum that shows clear achievement of desired qualifications. Additionally, initial training program content shall include topics beyond those analyzed in the job and task analysis to ensure new personnel to the plant are familiar with generalized topics.
General plant access training topics should be included in training programs for new hire personnel to ensure familiarization with key concepts of the facility. Topics in this training include, as necessary for the personnel:
General access requirements
Rad worker training
Security
Fitness for duty
Work hour rules
Safety
Emergency planning
Quality assurance processes
3.2. Licensed Operator Training 3.2.1. For licensed operator training programs, tasks for foundational theory of plant operations and systems important to safety are required to be included in the resulting task list and KSA development for training program design, as applicable to the design of the plant and as analyzed in Part A, Systems Approach to Training, of this ISG.
Reactor theory, thermodynamic principles, and chemical theory associated with the technologies, materials, and processes of the reactor design
Plant systems and components important to safety
Reactivity management and manipulations
Radiation control and safety
Emergency, abnormal, and normal operations
Administrative requirements and conditions of the facility license
Technical specifications
3.2.2. Licensed operator training programs should include the requisite task training, as defined by the SAT analysis, to achieve qualification status. The initial training program design should include timelines for the following, as applicable:
classroom (or equivalent knowledge) training
hands-on (OJT/TPE, simulator, or equivalent) training
proficiency training (under instruction watches)
program exams
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has documented the initial training program requirements of the associated facility training program. The requirements should include:
Clear guidance meeting section B.3.1 of this ISG that contains the requirements described.
For licensed operator programs, the program description meets the requirements of section B.3.2 of this ISG, to include the foundational theory and safety-related topics in the task list and KSA development, and timelines for classroom, hands-on, proficiency training, and program exams.
Training Program Inspection:
The staff should perform a review of the training program description to validate that it meets the requirements of section B.3.
- 4. Requalification Programs 4.1. General Requirements Training programs should include requalification requirements covering all tasks selected for retraining, as defined by the SAT analysis. The requalification program should include the following:
- a. Task list requiring retraining
- b. Retraining schedule according to the task retrain frequency requirements
- c. Scope of required training 4.2. Licensed Operator Requalification Training 4.2.1. In addition to the above, the requalification program must include training for performance-based and cognitive-based tasks as identified in the job and task analysis for tasks selected for retraining.
4.2.2. Licensed operator requalification training shall include the following:
- a. A retrain periodicity not to exceed 24 months, specifically for licensed operator training programs.
- b. A process for review and maintenance of the program.
EVALUATION CRITERIA:
Training Program Approval:
The staff should verify that the licensee has documented the requalification training program requirements of the associated facility training program. The requirements should include:
Clear guidance meeting section B.4.1 of this ISG that contains the requirements described.
For licensed operator programs, the program description meets the requirements of section B.4.2 of this ISG, to include retrain requirements for tasks selected for retraining as identified in the job and task analysis, a retrain periodicity not to exceed 24 months specifically for licensed operator training programs, and a process for review and maintenance of the program.
Training Program Inspection:
The staff should perform a review of the training program description to validate that it meets the requirements of section B.4.
VI.
IMPLEMENTATION The NRC staff will use the information discussed in this ISG to review advanced reactor applications for CPs, OLs, COLs, DCs, and MLs under 10 CFR Part 50 and 10 CFR Part 52.
The NRC staff intends to incorporate this guidance in updated form in the RG or NUREG series, as appropriate.
VII.
BACKFITTING AND ISSUE FINALITY DISCUSSION DRO-ISG-2023-04, if finalized, would not constitute backfitting as defined in 10 CFR 50.109, Backfitting, and as described in MD 8.4; constitute forward fitting as that term is defined and described in MD 8.4; or affect the issue finality of any approval issued under 10 CFR Part 52, Licenses, Certifications, and Approvals for Nuclear Power Plants. The guidance would not apply to any current licensees or applicants or existing or requested approvals under 10 CFR Part 50 or 10 CFR Part 52, and therefore its issuance cannot be a backfit or forward fit or affect issue finality. Further, applicants and licensees would not be required to comply with the positions set forth in this ISG.
VIII.
CONGRESSIONAL REVIEW ACT Discussion to be provided in the final ISG.
IX.
FINAL RESOLUTION The NRC staff will transition the information and guidance in this ISG into the RG or NUREG series, as appropriate. Following the transition of all pertinent information and guidance in this document into the RG or NUREG series, or other appropriate guidance, this ISG will be closed.
X.
APPENDIX
A. Job Analysis Breakdown Example
B. Task Selection for Training Process Example
C. Operator Job and Task Analysis Topic Consideration
D. Operator Examination KA Screening Process Example
E. Bloom's Taxonomy
F. Learning Objective Structure Examples
G. Exam Plausibility
H. Resolution of Public Comments
I. References
Glossary
Ability - The quality of being able to perform a specific mental or physical action.
Action - One of the three components of a learning objective. In a learning objective, the action statement consists of an action verb and a direct object. The action verb should identify trainee behavior that is observable and measurable. For example, in the action statement start the main feed pump, the action verb (start) and the direct object (main feed pump) are both observable and measurable. Reference Appendix F for learning objective structure examples.
Adequate sampling - The process of selecting exam items from the learning objectives being trained during the creation of a cognitive evaluation. Evaluating approximately 80% or more of the objectives is considered adequate.
Affective domain - An attribute of a learning objective as it pertains to the related KSA list.
Objectives written in this domain are intended to change the attitudes that affect behavior. The affective domain of learning deals with learning objectives on an emotional level, to include feelings, appreciation, enthusiasm, attitudes, and motivation.
Alternate path - A task that may include more than one set of elements to achieve satisfactory completion, such as manually opening a discharge valve if it fails to open after starting a pump.
For tasks that involve multiple actions based on various plant responses (branch steps or alternate paths), all actions must be reviewed to ascertain which actions are required to be bounded within the scope of the task analysis.
Branch step - Synonymous with alternate path.
Cognitive domain - An attribute of a learning objective as it pertains to the related KSA list.
Cognitive learning is demonstrated by recall of knowledge and other intellectual skills such as applying knowledge in a new situation, displaying comprehension of information, problem solving, organizing information, analyzing, synthesizing, and evaluating ideas or actions. The lower levels of this domain require a student to recall, comprehend, or apply knowledge. In higher levels, students must analyze, synthesize, or evaluate information. Cognitive learning focuses on the knowledge aspect of the KSA list.
Cognitive evaluation - An examination process that assesses the trainee's mastery of knowledge related to task performance. A cognitive examination determines if the trainee knows how to perform the required job tasks and skills. Cognitive examinations can be conducted through evaluation instruments such as written examinations, oral boards, or in-plant walkthroughs, provided a method of consistent trainee evaluation is provided.
Condition - In a learning objective, the condition defines the facility situation, environmental aspects, and resources available to aid trainee performance. Learning objective conditions are derived from job conditions identified during analysis and clarify how the trainee is expected to complete the learning objective action. For example, in the learning objective action multiply two three-digit numbers, the condition could be using a calculator, or without the use of a calculator.
Consistent - Acting or done in the same way over time, especially so as to be fair or accurate
Difficulty - An attribute of the DIF analysis process. Difficulty is a measurement of the challenge an individual has in performing the task being analyzed.
Discrimination - An exam item characteristic that differentiates between those trainees who have mastered a concept and those who have not. Exam item discrimination is typically determined by comparing the performance of all trainees who have taken the exam item. Effective use of exam item discrimination, often through computed statistical analysis, can aid instructor review of exam item quality.
DIF Analysis (Difficulty - Importance - Frequency) (DIF) - A screening methodology that uses three main factors to determine whether conducting formal training on a task, as analyzed in the job analysis process, will provide benefit to the licensee organization within the SAT-based training program. See each term's definition individually.
Duty Area - A grouping methodology used in the analysis phase to concentrate task creation during job analysis. Duty areas can be created as pertinent to the design, but typically correspond to systems, structures, components, or major topically focused areas of work that a job incumbent is expected to perform. Example duty areas can be systems (such as the fire protection system), from which tasks related to operating or monitoring the fire protection system equipment would be grouped.
Element - Steps that must be performed to successfully complete the task. They are bounded by the initial conditions (starting point) of the task and standards (criteria for successful task performance, or completion). The elements of a task may reference procedure steps provided the scope of the procedure is adequate and detailed enough to allow consistent performance of the task.
Enabling objective - A discrete learning objective that, when combined with others into a lesson, supports mastery of the terminal objective.
Familiar - An individual who is knowledgeable and aware of a process or function.
Knowledgeable and aware is defined as having reviewed the concepts of the process prior to providing input. Being familiar is not the same as being qualified. The purpose of familiarity is to ensure that the input and feedback provided are consistent with those of others providing input in the same role.
Familiarity can be assumed if the individual has reviewed the written instructions and directions prior to starting the function.
Formal training - Any training that has been determined to be necessary for a trainee to reach a satisfactory level of task performance. Requirements for formal training in the training program are identified during the needs analysis and job analysis processes of the analysis phase. Formal training includes the use of all five phases of the Systems Approach to Training (SAT) process.
Incumbent - An individual who is a member of the line organization and is a participant in the training process associated with the position. An experienced incumbent will have successfully passed the training course and has performed the job as a qualified individual for some time.
The use of incumbents is critical to maintaining a training program that is effective in meeting the needs of the line organization for sustained performance.
Independence - An exam characteristic that ensures one exam question will not answer another. For example, if one exam question asks "what is the low level setpoint of the __ tank?" and another exam question asks "what happens when the ___ tank reaches the low level setpoint of x?", the first question is answered by the second question. In such a case, the first question is disqualified because it no longer discriminates among trainees and cannot effectively determine knowledge retention.
Informal training - Any training activity that is beyond the scope of the formal training process.
Informal training is used for job tasks that are not selected for formal training during the task selection for training step of the job analysis process. Informal training is not considered necessary for satisfactory job incumbent performance and therefore is not monitored through the five phases of the SAT process. For example, the task action "read a thermometer" might be analyzed as requiring no formal training because of its simplicity; therefore, any informal training received (such as guidance from another operator while on under-instruction training watches) is not required to be monitored or evaluated.
Instructor(s) - The individuals who are responsible for implementing the programmatic requirements of a SAT based training program and its administration. Instructors are typically tasked with conducting training activities in coordination and partnership with line personnel.
Instructors, and associated supervision, are collectively referred to as training staff in this ISG.
Job - A grouping of tasks and functions that are assigned to a personnel position.
Job analysis - A systematic process of collecting all information about a job or role in an organization and identifying key performance-based tasks related to successful performance of the assigned duties. The outcome of job analysis is a task list, followed by task analysis.
Knowledge - Information required to effectively accomplish a step, task, or job. Knowledge involves storing and recalling information and refers to the learning of names, facts, processes, and principles.
Learning objective - An outcome statement that captures specifically what knowledge, skills, or attitudes a learner should be able to exhibit or perform following instruction. Learning objectives can be categorized into three domains or general areas: cognitive, affective, and psychomotor.
Classifying instruction into a domain allows developers to design, develop, and select activities and strategies that match the objectives. The cognitive domain includes all intellectual processes, from knowing to evaluating. The affective domain includes values, attitudes, beliefs, emotions, motivation, and interests. The affective domain includes emotional responses rather than intellectual ones; therefore, it is the most difficult to describe and assess. The psychomotor domain includes physical performance of a task and is often supported by requisite cognitive and/or affective domain learning objectives. All three domains of learning are addressed in learning objectives, and the domain of the objective must be considered when developing the evaluation items and supporting training program content to maintain adequate leveling in the program.
Leveled - A characteristic of training program design that signifies alignment between the objective, the lesson plan content that supports the objective, and the evaluation items that test trainee mastery of the objective. Leveled training program design focuses the program on the action level of the objective, such as through Bloom's taxonomy, and will have material and evaluation supporting that level of the objective. An example of a leveled program design would be the objective "state the purpose of the __ system." "State" is a low cognitive level action verb, so the supporting lesson plan material should be simple in its description of the system and the subsequent examination should test whether the trainee can adequately state the purpose of the system. An unleveled design would pair the same objective with in-depth lesson plan content and scenario-based exams linked to that objective to test the trainee's understanding of system interrelations. Similarly, a higher cognitive objective, "analyze system failure response," would require in-depth lesson plan material supporting trainee understanding of the system, its failure modes, responses, and how to analyze different scenarios. The corresponding examination item would require in-depth questioning, most likely in a scenario-based example that requires the trainee to comprehend the surrounding system performance and subsequent failure and adequately determine whether the response was appropriate to the system design.
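As an illustration of how a leveled design might be checked administratively, the sketch below maps objective action verbs to assumed Bloom's taxonomy levels and flags exam items whose cognitive level does not match the objective they test. The verb-to-level mapping and the data structures are assumptions made for the example and are not a classification required by this ISG.

    # Illustrative sketch: flag exam items that are not leveled to the
    # cognitive level of the objective they test. The verb-to-level mapping
    # below is an assumed example drawn from Bloom's taxonomy.
    VERB_LEVEL = {
        "state": "knowledge", "recall": "knowledge", "identify": "knowledge",
        "explain": "comprehension", "interpret": "comprehension",
        "calculate": "application", "apply": "application",
        "analyze": "analysis", "evaluate": "evaluation",
    }

    def is_leveled(objective_verb, exam_item_verb):
        # A design is leveled when the exam item tests at the same cognitive
        # level as the objective's action verb.
        return VERB_LEVEL[objective_verb] == VERB_LEVEL[exam_item_verb]

    # "State the purpose of the system" tested by a scenario-based "analyze"
    # item would be flagged as unleveled:
    print(is_leveled("state", "state"))    # True  (leveled)
    print(is_leveled("state", "analyze"))  # False (unleveled)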
Line management - An individual with leadership authority (supervisor or higher) in the department that receives the training.
Needs analysis - A systematic process of identifying and evaluating a training request to establish a comprehensive understanding of the issue or situation and how it relates to the requisite job performance. The needs analysis results in recommendations for formal training to address gaps identified in job tasks or in the knowledge, skills, and abilities necessary for improved or sustained job performance.
Performance evaluation - An examination process that assesses technical or psychomotor skills and duplicates the actual behavior of the trainee by using the same equipment, resources, setting, or circumstances that the trainee will encounter on the job. A performance evaluation can be conducted in multiple environments, including in-plant, simulator, laboratory, or classroom settings. For example, filling out an emergency response form is a performance evaluation that can be done in any environment that provides the required trainee resources for consistent evaluation.
Plausible - An exam item characteristic requiring that the evaluation be credible to the objective and that alternatives (if options are provided to the trainee) be written so that they serve as distractors which may be chosen by a trainee who has not achieved mastery of the objective but are ignored by trainees who have mastered the objective content.
Position - A grouping methodology used in the analysis phase to concentrate task creation during job analysis. Position is utilized in terms of staffing related to the organizational business structure. Utilization of position during job analysis is beneficial because it focuses on exactly what each individual in the corresponding job will be required to do, depending on where they are in the plant and what their responsibilities are. Examples of positions in a control room can be Operator at the Controls and Balance of Plant Operator. Examples of positions outside of the control room could be specific to the building or area in which the operator is expected to perform actions, such as the turbine building operator and reactor building operator.
Psychomotor - The psychomotor domain includes physical movement, coordination, and motor skills as they pertain to the job. Psychomotor learning focuses on the tasks or skills associated with the KSA list and, through objective leveling, will focus on performance of the action being trained.
Qualified - An individual who has successfully passed the applicable training program requirements and is able to perform the job as defined by the job and task analysis.
Role - A grouping methodology used in the analysis phase to concentrate task creation during job analysis. Roles allow grouping of duty areas specific to the different responsibilities a position might be required to perform, depending on the circumstances. Effective roles will aid in grouping duty areas. Examples of roles could be performing licensed operator duties and performing duties that do not require a license. These roles allow duty/task development based on the position being analyzed, to ensure the correct level of qualification is maintained during the job analysis process.
Skill - The ability to perform an activity that contributes to the accomplishment of the step, task, event, or job.
Specialty task - A task that, through DIF analysis, is determined to require formal training but is not necessary to be included in the initial training program. Specialty tasks are often developed into a training curriculum independently, allowing personnel to qualify on the task as they meet the individual specialty task prerequisites. The assignment of a task as a specialty task still requires the task to be formally trained through the SAT process.
Subject matter expert - An individual with significant experience or understanding of the topic being discussed. Subject matter experts can be utilized during initial training program development for advanced reactors when there are no qualified personnel.
Standards - The guidance or criteria used to determine successful task performance. Standards are utilized in tasks and objectives. The standard for an objective is used to determine the measure of success in evaluation item grading. Objective standards must be clearly written to provide clarity in what establishes the performance measurement; for example, "in accordance with [a procedure]" or "in accordance with [station design criteria]" can be utilized. For cognitive objectives, "without error" is an implied objective standard that does not need to be written out.
Target student population - The trainees for whom the program has been analyzed and designed, in terms of the required knowledge and/or experience that the trainee must have in order to enter the training program.
Task - A discrete performance-based activity that is required for successful job performance.
Tasks meet the following criteria:
- Are measurable
- Contain a specific, small number of steps
- Contain a beginning and end
Task analysis - A systematic review of tasks that identifies the requisite knowledge, skills, abilities, and elements of the job task. Task analysis also considers key attributes such as task conditions and standards required for successful task completion. Task analysis is conducted with line and training personnel.
Task deletion - An action taken through the job analysis process through which a task is removed from the training program. In task deletion, the task is permanently removed from the approved task list, including all corresponding attributes identified in the job and task analysis processes (DIF data, elements, KSAs, etc.). Task deletion can occur for a task that is, or is not, selected for training.
Task deselection - An action taken through the job analysis process in which a task on the approved task list is removed from the list of tasks selected for training. In task deselection, the task is no longer trained in the training program curriculum via formal training but remains on the approved task list; therefore, all corresponding attributes identified in the job and task analysis processes (DIF Data, elements, KSAs, etc.) remain.
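The record-keeping difference between task deletion and task deselection can be illustrated with the hypothetical sketch below; the data structures and field names are assumptions for the example, not a required format. Deletion removes the task and all of its analysis data from the approved task list, while deselection only removes the task from the set selected for training.

    # Illustrative sketch of task deletion versus task deselection.
    # Field names are assumed for the example only.
    approved_task_list = {
        "start primary cooling water pump": {
            "dif": {"difficulty": 2, "importance": 4, "frequency": 3},
            "elements": ["verify prerequisites", "start pump", "verify flow"],
            "ksas": ["pump interlocks", "expected flow indications"],
        },
    }
    selected_for_training = {"start primary cooling water pump"}

    def deselect_task(task):
        # The task stays on the approved task list with all analysis data;
        # it is simply no longer trained through the formal program.
        selected_for_training.discard(task)

    def delete_task(task):
        # The task and all corresponding analysis data are permanently removed.
        approved_task_list.pop(task, None)
        selected_for_training.discard(task)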
Task list - A list of tasks as analyzed by the job analysis process that identifies all required tasks for the job position. A task list from job analysis is evaluated to distinguish which tasks are selected for training, which are then analyzed using task analysis.
Terminal objective - Learning objective that defines the final competencies expected of the trainee, typically at the task performance level, following instruction.
Training management - An individual with leadership authority (supervisor or higher) in the department responsible for the creation of the training program materials.
Training request - A request for training by any employee, manager, or stakeholder. Training requests can be voluntary and informally requested (for example, based on a trainee suggestion for improvement of the program) or process-initiated (for example, based on plant design changes).
Training staff - The department, organization, or group with the responsibilities of executing the SAT based training program. The training staff personnel include, at a minimum, instructors and supervisory oversight with a qualification process to ensure that the training program requirements will be performed in accordance with the SAT process.
Appendix 1 Job Analysis Breakdown Example:
Control Room Operator vs. Control Operator
Traditional breakdown of the Control Room Operator as defined in current reactor designs would result in the following:
Job: Control Room Operator
- Position: Operator at the Controls (OAC) (licensed operator required)
  - Role: Performing operator duties requiring a license
    - Duty Area: Rod control system
      - Tasks: Latch rods, Withdraw control rods
  - Role: Performing duties that do not require a license
    - Duty Area: Primary cooling water system
      - Tasks: Start primary cooling water pump
- Position: Balance of Plant Operator (BOP) (licensed operator required)
  - Role: Performing operator duties requiring a license
    - Duty Area: Turbine control system
      - Tasks: Raise turbine load, Trip the turbine
  - Role: Performing duties that do not require a license
    - Duty Area: Secondary cooling water system
      - Tasks: Start secondary system cooling water pump
    - Duty Area: Turbine building ventilation system
      - Tasks: Start ventilation system fan, Adjust ventilation system flow
Breakdown of the Control Operator job in advanced reactor designs through increased use of automation could result in the following:
Job: Control Operator
- Position: Operator at the Controls (OAC) (licensed operator required)
  - Role: Performing operator duties requiring a license
    - Duty Area: Rod control system
      - Tasks: Latch rods, Withdraw control rods
    - Duty Area: Turbine control system
      - Tasks: Raise turbine load, Trip the turbine
- Position: Balance of Plant Operator (BOP)
  - Role: Performing duties that do not require a license
    - Duty Area: Primary cooling water system
      - Tasks: Start primary cooling water pump
    - Duty Area: Secondary cooling water system
      - Tasks: Start secondary system cooling water pump
    - Duty Area: Turbine building ventilation system
      - Tasks: Start primary ventilation system fan, Adjust ventilation system flow
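The advanced reactor example above can also be represented as a simple nested data structure, which a licensee might find convenient for maintaining and flattening the task list; the representation below is an illustrative sketch, not a required format.

    # Illustrative sketch: the Control Operator breakdown above expressed as a
    # nested position -> role -> duty area -> tasks structure.
    control_operator_job = {
        "Operator at the Controls (OAC)": {
            "Performing operator duties requiring a license": {
                "Rod control system": ["Latch rods", "Withdraw control rods"],
                "Turbine control system": ["Raise turbine load", "Trip the turbine"],
            },
        },
        "Balance of Plant Operator (BOP)": {
            "Performing duties that do not require a license": {
                "Primary cooling water system": ["Start primary cooling water pump"],
                "Secondary cooling water system": ["Start secondary system cooling water pump"],
                "Turbine building ventilation system": [
                    "Start primary ventilation system fan",
                    "Adjust ventilation system flow",
                ],
            },
        },
    }

    # Flatten the hierarchy into the task list produced by job analysis.
    task_list = [task
                 for roles in control_operator_job.values()
                 for duty_areas in roles.values()
                 for tasks in duty_areas.values()
                 for task in tasks]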
73 Appendix 2 Task Selection for Training Process Example (Ref 7)
DIF Flowchart
Task Rating System
Difficulty in Performing Task:
Minimum
- 1. "Very easy" to perform.
- 2. "Somewhat easy" to perform.
- 3. "Moderately difficult" to perform.
- 4. "Very difficult" to perform.
Maximum
- 5. "Extremely difficult" to perform.
Importance of Task:
Minimum
- 1. Consequences of improper performance are negligible.
- 2. Consequences of improper performance are undesirable.
- 3. Consequences of improper performance are serious.
- 4. Consequences of improper performance are severe.
Maximum
- 5. Consequences of improper performance are extremely severe.
Frequency of Performing Task:
Minimum
- 1. Less than once per year.
- 2. Once every five to twelve months.
- 3. Once every three weeks to four months.
- 4. Once every one to two weeks.
Maximum
- 5. More frequently than once per week.
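For illustration only, the sketch below applies one possible screening rule to the 1-5 rating scales above. The threshold values and the decision rule are assumptions made for the example; actual task selection decisions follow the licensee's DIF flowchart and supporting justification, not this sketch.

    # Illustrative sketch of a DIF screening step using the 1-5 rating scales
    # above. The thresholds are assumptions for the example only.
    def screen_task_for_training(difficulty, importance, frequency):
        if importance >= 4:
            return "formal training"        # severe consequences if performed improperly
        if difficulty >= 3 and frequency <= 2:
            return "formal training"        # difficult and rarely reinforced on the job
        if difficulty <= 2 and frequency >= 4:
            return "no formal training"     # easy and frequently practiced
        return "evaluate further"           # line and training judgment required

    print(screen_task_for_training(difficulty=4, importance=2, frequency=1))
    # prints "formal training"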
Appendix 3 Operator Job and Task Analysis Topics of Consideration
Operator job and task analysis should include the knowledge needed to perform licensed operator and senior licensed operator duties. The job and task analysis should be derived from a systematic analysis of licensed operator and senior licensed operator duties. Include in the analysis consideration of the following items, as relevant to the design:
(1) Fundamentals of reactor theory, including neutron multiplication, source effects, criticality indications, reactivity coefficients, and poison effects.
(2) General design features of the reactor and fuel, including structure, instrumentation, and coolant flow.
(3) Facility operating characteristics during steady state and transient conditions, causes and effects of temperature, pressure and reactivity changes, effects of load changes, and operating limitations and reasons for these operating characteristics.
(4) Principles of heat transfer, thermodynamics, and fluid mechanics.
(5) Chemical theory principles, to include those physical, chemical, and nuclear phenomena relevant to coolant or radionuclides.
(6) Mechanical components and design features of the reactor primary system.
(7) Secondary coolant and auxiliary systems that affect the facility.
(8) Design, components, and functions of control, protection, and safety systems, including instrumentation, signals, interlocks, failure modes, and automatic and manual features.
(9) Active safety features, passive safety features, and inherent safety characteristics incorporated into the plant design.
(10) Shielding, isolation, and containment design features, including access limitations.
(11) Design, components, and functions and operation of reactivity control mechanisms and instrumentation.
(12) Purpose and operation of radiation monitoring systems, including alarms and survey equipment.
(13) Radiological control, safety principles, and procedures.
(14) Procedures and equipment available for handling and disposal of radioactive materials and effluents.
(15) Administrative, normal, abnormal, and emergency operating procedures for the facility.
(16) Technical specifications.
(17) Applicable portions of title 10, chapter I, Code of Federal Regulations.
Additionally, senior licensed operator job and task analysis should include the knowledge needed to perform senior licensed operator duties. The job and task analysis should be derived from a systematic analysis of licensed operator and senior licensed operator duties.
Include in the analysis consideration of the following items, as relevant to the design:
(1) Conditions and limitations in the facility license.
(2) Facility operating limitations in the technical specifications and their bases.
(3) Facility licensee procedures required to obtain authority for design and operating changes in the facility.
(4) Radiation hazards that may arise during normal and abnormal situations, including maintenance activities and various contamination conditions.
(5) Assessment of facility conditions and selection of appropriate procedures during normal, abnormal, and emergency situations.
(6) Procedures and limitations involved in determination of various internal and external effects on core reactivity.
(7) Fuel handling facilities and procedures.
Licensed operator job and task analysis should include the ability to perform licensed operator and senior licensed operator duties. The job and task analysis should be derived from a systematic analysis of licensed operator and senior licensed operator duties. Include in the analysis consideration of the following items, as relevant to the design:
(1) Identify the significance of instrumentation readings and annunciators and perform remedial actions where appropriate.
(2) Safely operate the facility's heat removal systems, including decay heat removal systems, and identify the relations of the proper operation of these systems to the operation of the facility.
(3) Observe and safely control the operating behavior characteristics of the facility during the following:
(A) Plant or reactor startups.
(B) Plant shutdown.
(C) Significant (10 percent) power changes.
(4) Perform pre-startup procedures for the facility, including operation of those controls associated with plant equipment that could affect reactivity.
(5) Demonstrate the use and function of the facility's radiation monitoring systems, including fixed radiation monitors and alarms, portable survey instruments, and personnel monitoring equipment.
(6) Perform control manipulations required to obtain desired operating results during normal, abnormal, and emergency situations.
(7) Safely operate the facility's auxiliary and emergency systems, including operation of those controls associated with plant equipment that could affect reactivity or the release of radioactive materials to the environment.
(8) Perform defense-in-depth actions that may be required under emergency conditions.
(9) Demonstrate the ability to diagnose and mitigate the following conditions:
(A) Loss of coolant.
(B) Loss of instrument air.
(C) Loss of electrical power (or degraded power sources).
(D) Loss of core coolant flow/natural circulation.
(E) Loss of feedwater (normal and emergency).
(F) Loss of service water, if required for safety.
(G) Loss of shutdown cooling.
(H) Loss of component cooling system or cooling to an individual component.
(J) Loss of protective system channel.
(K) High activity in reactor coolant or offgas.
(L) Turbine or generator trip.
(M) Malfunction of an automatic control system that affects reactivity.
(N) Reactor trip.
(O) Main steam line break.
(P) Nuclear instrumentation failure.
(10) Demonstrate knowledge of significant radiation hazards, including permissible levels in excess of those authorized, and ability to perform other procedures to reduce excessive levels of radiation and to guard against personnel exposure.
(11) Demonstrate knowledge of the emergency plan for the facility, including, as appropriate, the licensed operator's or senior licensed operator's responsibility to decide whether the plan should be executed and the duties under the plan assigned.
(12) Demonstrate the applicant's ability to function as appropriate to the assigned position, in such a way that the facility licensee's procedures are adhered to and that the limitations in its license and amendments are not violated.
Below is a corresponding example of an updated, technology-inclusive version of the required Generally Licensed Operator topics for advanced reactors.
The analysis should include the knowledge needed to perform generally licensed operator duties. These selections should be made from among learning objectives or KSAs derived from a systematic analysis of operator duties and from the following items, as relevant to the design:
(1) Conditions and limitations in the facility license.
(2) Facility operating limitations in the technical specifications and their bases.
(3) Facility licensee procedures required to obtain authority for design and operating changes in the facility.
(4) Radiation hazards that may arise during normal and abnormal situations, including maintenance activities and various contamination conditions.
(5) Assessment of facility conditions and selection of appropriate procedures during normal, abnormal, and emergency situations.
(6) Procedures and limitations involved in determination of various internal and external effects on core reactivity.
(7) Fuel handling facilities and procedures.
(8) Fundamentals of reactor theory, including neutron multiplication, source effects, criticality indications, reactivity coefficients, and poison effects.
(9) General design features of the reactor and fuel, including structure, instrumentation, and coolant flow.
(10) Facility operating characteristics during steady state and transient conditions, causes and effects of temperature, pressure and reactivity changes, effects of load changes, and operating limitations and reasons for these operating characteristics.
(11) Principles of heat transfer, thermodynamics, and fluid mechanics.
(12) Chemical theory principles, to include those physical, chemical, and nuclear phenomena relevant to coolant or radionuclides.
(13) Mechanical components and design features of the reactor primary system.
(14) Secondary coolant and auxiliary systems that affect the facility.
(15) Design, components, and functions of control, protection, and safety systems, including instrumentation, signals, interlocks, failure modes, and automatic and manual features.
(16) Active safety features, passive safety features, and inherent safety characteristics incorporated into the plant design.
(17) Shielding, isolation, and containment design features, including access limitations.
(18) Design, components, and functions and operation of reactivity control mechanisms and instrumentation.
(19) Procedures and equipment available for handling and disposal of radioactive materials and effluents.
(20) Administrative, normal, abnormal, and emergency operating procedures for the facility.
(21) Applicable portions of title 10, chapter I, Code of Federal Regulations.
Job and task analysis should also include the ability to perform generally licensed operator duties. The job and task analysis should be derived from a systematic analysis of generally licensed operator duties. Include in the analysis consideration of the following items, as relevant to the design:
(1) Identify the significance of instrumentation readings and annunciators and perform remedial actions where appropriate.
(2) Observe and safely control the operating behavior characteristics of the facility during the following:
(A) Plant or reactor startups.
(B) Plant shutdown.
(C) Significant (10 percent) power changes.
(3) Perform pre-startup procedures for the facility, including operation of those controls associated with plant equipment that could affect reactivity.
(4) Demonstrate the ability to diagnose and mitigate the following conditions:
(A) Loss of protective system channel.
(B) Malfunction of an automatic control system that affects reactivity.
(C) Reactor trip.
(D) Nuclear instrumentation failure.
(5) Demonstrate knowledge of significant radiation hazards, including permissible levels in excess of those authorized, and ability to perform other procedures to reduce excessive levels of radiation and to guard against personnel exposure.
(6) Demonstrate knowledge of the emergency plan for the facility, including, as appropriate, the operator's responsibility to decide whether the plan should be executed and the duties under the plan assigned.
Appendix 4 Operations Training Program to Operator Examination Guidance Alignment

[Figure: flowchart aligning the SAT-based training program phases (Analysis, Design, Development, Implementation, Evaluation) and their products (task list, approved KSA list, learning objectives, exam items) with the licensed operator examination process. The recoverable callout text from the figure is summarized below.]

The red flow path provides guidance for the licensee to produce a KSA list in the analysis phase that provides justification for all items in the Operator Licensing Programs ISG. Any changes to the task list are tracked and monitored for updates to this list.

The blue flow path is the Commission-approved training program design that is developed and implemented into the initial and continuing training programs. The objectives and associated exam items may be used for the operator examination process, but they must be referenced against the Commission-approved KA list (red line) via a sample plan, as required by the examination ISG.

The SAT ISG contains the same guidance as the Operator Licensing ISG pertaining to the topics evaluated: examinations should test a representative selection of the knowledge needed to perform licensed operator and senior licensed operator duties. These selections should be made from among learning objectives or KSAs derived from a systematic analysis of licensed operator and senior licensed operator duties and from the items in Appendix 3 (the same list of topics as in the analysis), as relevant to the design.

1.2.3.1.1 Job Analysis
Cognitive tasks for operations training programs shall also include systems important to safe plant operations and foundational theory of plant operations in their task list. These include:
- Reactor theory and thermodynamic principles
- Plant systems and components
- Reactivity management and manipulations
- Radiation control and safety
- Emergency, abnormal, and normal operations
- Administrative requirements and conditions of the facility license
- Technical specifications
Operations training programs shall document the basis for the scope of the job analysis. All items in Appendix 3 must be considered as part of the operations training program, through task and associated KSA development, as relevant to the design.

1.2.4 Tasks are Systematically Selected for Training
1.2.4.1 Licensed Operator Training Includes Items Important to Safety
For licensed operator training programs, tasks for foundational theory of plant operations and systems important to safety identified in 1.2.3.1.1 are required to be included in the resulting task list and KA development for training program design. Cognitive tasks included in the KSA development include, at a minimum:
- Reactor theory, thermodynamic principles, and chemical theory associated with the technologies, materials, and processes of the reactor design
- Plant systems and components
- Reactivity management and manipulations
- Radiation control and safety
- Emergency, abnormal, and normal operations
- Administrative requirements and conditions of the facility license

1.3 Task Analysis
1.3.2.1 Operator Licensing Programs Produce a KA List for Commission Approval
Licensed operator training program job and task analysis must include those tasks important to safe plant operation and tasks derived from section 1.2.3.1.1 related to the foundational theory of plant operations. The resulting KSA list is considered the comprehensive list of KSAs licensed operators are expected to master for safe plant operations (not just those tasks selected for training) and will be an input to the NRC operator examination KA selection process.

2.2 Develop Learning Objectives
2.2.4 Lesson plans include enabling objectives to support the terminal objective goal. Enabling objective creation is focused around the knowledge, skills, and abilities identified in the job and task analysis. There are no objectives not tied to a KSA, and there are no KSAs not tied to an objective.

Licensed Operator KSA Ranking Process
Following production of the KSA list for items important to safe plant operation (as defined in section 1.3 of the SAT ISG), the licensee must screen the KSA list to identify those tasks and associated KSAs important to safe plant operation and tasks related to the foundational theory of plant operations. The licensee is expected to determine and apply a process to filter and rate the Commission-approved KSA list and submit the process and results to the NRC for review. Guidance for the ranking process is provided in DRO-ISG-2023-01, Operator Licensing Programs.
Appendix 5 Bloom's Taxonomy

COGNITIVE DOMAIN (levels listed from highest to lowest cognitive level)

Evaluation - Making judgments about the value of ideas, works, solutions, methods, materials, etc. Judgments may be either quantitative or qualitative.
Examples: To argue, to decide, to compare, to consider, to contrast.

Synthesis - Putting together elements and parts to form a new whole.
Examples: To write, to produce, to plan, to design, to derive, to combine.

Analysis - Breaking down material or ideas into their constituent parts and detecting the relationship of the parts and the way they are arranged.
Examples: To distinguish, to detect, to employ, to restructure, to classify.

Application - Knowing an abstraction well enough to apply it without being prompted or without having been shown how to use it.
Examples: To generalize, to develop, to employ, to transfer.

Comprehension - Understanding the literal message contained in a communication.
Examples: To transform, to paraphrase, to interpret, to reorder, to infer, to conclude.

Knowledge - Remembering an idea, material, or phenomenon in a form very close to that in which it was originally encountered.
Examples: To recall, to recognize, to acquire, to identify.

AFFECTIVE DOMAIN
Appendix 6 Learning Objective Characteristics (Ref 7)
Action Statements:
The action statement consists of an action verb and a direct object. The action verb should identify trainee behavior that is observable and measurable. For example, in the action statement "start secondary feed system," the action verb (start) and the direct object (secondary feed system) are both observable and measurable. The following is a verb list with definitions as a reference in the selection of an action verb for the action statement:
ACKNOWLEDGE Recognize and respond to an indication or alarm.
ACTUATE Put into mechanical action or motion.
ADD Increase; to perform the mathematical addition process.
ADJUST Bring a continuous effort into proper or exact position.
ALIGN Adjust or correct relative position of an item.
ALTERNATE Change or substitute one to another.
ANALYZE Break down a complex whole into its component parts.
ANNOUNCE Give notice of an event or evolution (e.g., via the public address system).
ANSWER Respond to a request for information.
ANTICIPATE Give advance thought, discussion, or treatment; foresee.
APPLY Bring into action; put into operation.
ASSEMBLE Fit parts together into a complete structure or unit.
ASSESS Determine the importance, size, or value.
ASSIST Give support or aid.
AUTHORIZE Legally approve of action; empower.
BACKWASH Move air or liquid backward by a propelling force.
BALANCE Equalize opposing forces.
BEGIN Commence or initiate.
BLEED Extract or cause to escape from a contained source.
BLOCK Obstruct passage or progress.
BOIL Heat to the boiling point.
BORATE Add boric acid.
BUILD Construct according to specific plan or process.
BYPASS Avoid or circumvent.
CALCULATE Determine by mathematical processes.
CALIBRATE Detect, correlate, report, or eliminate, by adjustment, any discrepancy in accuracy of an instrument or measuring device being compared with a standard.
CALL Communicate orally in person or by phone.
CENTER Place or adjust around a center area or position.
CHANGE Replace.
CHARGE Restore or load to capacity.
CHECK Look at carefully or critically; verify.
CHOOSE Select after consideration of alternatives.
CIRCULATE Flow in a circular path.
CLEAN Free from dirt or contamination.
CLEAR Free from obstruction or limitation.
CLOSE Bring or come to a natural or proper end; cease operation.
CODE Assign symbols, letters, numbers, or words.
COLLECT Bring together into one body or place.
COMPARE Examine the character or qualities in order to discover resemblances or differences.
COMPLETE Bring to an end; having all necessary parts.
COMPUTE Determine by mathematical means.
CONNECT Join or fasten together.
CONTROL Manage with authority.
COOL Cause to lose heat or warmth.
CORRECT Alter or adjust to a required condition or standard.
CONSTRUCT Make or form by combining parts.
DECIDE Come to a conclusion based on available information.
DECREASE Make less as in size, number, or intensity.
DEENERGIZE Disconnect energy or voltage.
DEPRESS Press down.
DESELECT Stop a selected function.
DETECT Discover the existence or presence of something.
DETERMINE Decide or resolve conclusively.
DIAGNOSE Recognize or determine the nature or cause of a condition by consideration of signs or symptoms.
DILUTE Make thinner or diminish the strength of, by admixture.
DIRECT Assign activities to another person.
DISASSEMBLE Take apart.
DISCONNECT Sever or terminate a connection.
DISPLAY Exhibit for visual evidence.
DISPOSE Get rid of.
DISSOLVE Cause to pass into solution.
DON Put on clothing or equipment.
ENERGIZE Impart energy or voltage.
ENTER Input data.
ESTABLISH Make firm or stable.
ESTIMATE Calculate approximately the extent or amount of.
EXIT The act of going out or going away.
EXPLAIN Make understandable.
FEED Supply a signal to an electric circuit; supply liquid to a system.
FLUSH Cleanse or wash out with a fluid.
HEAT Add energy to achieve higher temperatures.
HOIST Raise into position using a tackle.
HOLD Retain by force; apply continuous pressure.
IDENTIFY Regard or recognize clearly.
IMMERSE Plunge or dip into a fluid.
INCREASE Add or enlarge in size, intent, quantity.
INFORM Communicate information.
INSPECT Examine officially; to determine the serviceability of an item by comparing its physical, mechanical, and/or electrical characteristics with established standards.
INSTALL Seat or fix into position a component or assembly to allow the proper functioning of equipment or system.
INTERPOLATE Determine or estimate intermediate values from two given values.
INTERPRET Translate the meaning of.
INSERT Put in.
ISOLATE Separate from another.
JOG Move, start, and then stop quickly.
LET DOWN Allow to descend.
LINE UP Organize in a linear arrangement.
LOAD Place power output on line.
LOCATE Find a particular spot or place.
LOCK Secure by key, combination, or device.
LOG Record required information in a book or on a sheet.
LOWER Decrease in elevation, pressure, or temperature.
LUBRICATE Make smooth or slippery by applying a substance capable of reducing friction.
MAINTAIN Keep in an existing state.
MANIPULATE Operate mechanically or with skillful hands.
MEASURE Regulate by a standard.
MIX Combine or blend.
MONITOR Check or observe the operation of a system and its components over a period of time.
MOVE Go or pass from one place to another with continuous motion.
MULTIPLY Increase in number greatly or in multiples.
NEUTRALIZE Counteract the activity or effect. To make electrically or chemically inert.
NOTIFY Give formal notice to.
OBSERVE Watch with careful attention.
OBTAIN Hold onto; gain by planned action.
OPEN Make available for entry or activity.
OPERATE Start, stop, or influence the operation of a specified component or system.
ORGANIZE Arrange into a coherent unity or function.
OVERHAUL Restore to completely serviceable or operational conditions as prescribed by maintenance standards.
OVERRIDE Bypass the action of an automatic control.
PERFORM Carry out an action to conform to prescribed procedure.
PLAN Devise or formulate a program of future or contingency activity.
PLOT Represent by means of placing points on a graph.
POSITION Place a control in a discrete state.
PREPARE Compound; put together; make ready.
PRESSURIZE Apply force in a contained vessel.
PRIME Prepare for work by filling or charging with something.
PRINT Produce something in printed form.
PULL Draw out or hold back.
PUMP Raise, lower, transfer, or compress fluid or gasses by suction, pressure, or both.
PURGE Free of sediment or relieve trapped gas by bleeding.
RACK IN/OUT Insert or remove the breaker from the cabinet.
RAISE Increase in elevation.
REACTIVATE Become active or functioning again.
READ Understand visual information which is presented symbolically by scanning.
REALIZE Bring into existence.
REBUILD Restore unserviceable equipment to a like new condition in accordance with original manufacturing standards.
RECEIVE Be given written or verbal information.
RECIRCULATE Begin flow again.
RECORD Write information; document events or trends.
RELEASE Set free.
REMEMBER Retain information or recall information.
REMOVE Take away.
REPAIR Restore serviceability to an item by correcting specific damage, fault, malfunction, or failure in component or assembly.
REPLACE Substitute a serviceable component or assembly for an unserviceable counterpart.
REPORT Give an account of; formally document meeting or event proceedings.
REQUEST Ask for information.
RESPOND React in response; answer.
RETURN Restore something to former state or condition.
RINSE Cleanse by flushing with liquid.
RUN Continue in force or operation.
SAMPLE Draw a specimen for judging the quality of the whole.
SCAN Read quickly.
SECURE Protect from damage; control access.
SELECT Choose from a group.
SEQUENCE Arrange in order.
SERVICE Keep an item in proper operating condition.
SHUT Stop or suspend operation (see close).
SHUT DOWN Stop or suspend operation (see close).
SKETCH Draw roughly.
SPRAY Apply a jet of vapor or liquid.
START Begin; come into being.
START UP Begin; set in operation.
STOP Close or cease (see close).
STORE Lay away for future use.
SWITCH Shift to another electrical circuit; exchange.
SUBTRACT Take away by reducing.
SUPPLY Provide or furnish.
SYNCHRONIZE Arrange operations to occur simultaneously.
TELEPHONE Communicate by phone.
TEST Verify serviceability and detect failure by measuring against prescribed standards.
THROTTLE Decrease the flow of; regulate the speed of.
TITRATE Method or process of determining the strength of a (titration) solution or the concentration of a substance in solution, in terms of the smallest amount of a reagent of known concentration, required to bring about a given effect in reaction with a known volume of a test solution.
TOTAL Add up; compute.
TRACE Discover signs, evidence, or remains of; follow a path.
TRACK Be aware of a progression of activities.
TRANSFER Convey from one place or situation to another.
TRANSMIT Send or transfer from one person to another.
TRANSPORT Transfer or convey from one place to another by mechanical means.
TRIP Remove from service rapidly.
TUNE Adjust; respond to radio waves of a particular frequency.
TURN Rotate or revolve.
TYPE Operate a keyboard.
UNLATCH Open or loosen by lifting a latch.
UNLOAD Take off a load.
UPGRADE Raise the quality of; improve.
UPDATE Bring up to date; revise.
UNLOCK Unfasten; free from restraint.
UNCOUPLE Detach or disconnect.
VENT Release gas, liquid, or pressure.
VERIFY Confirm the accuracy of.
VENTILATE Expose to air.
WAIT Expect or remain in readiness.
WARM UP Make ready for operation by preliminary exercise or operation.
WEIGH Ascertain the weight of.
WITHDRAW Remove from use.
ZERO Adjust to zero.
Objective Conditions
A properly developed learning objective should clearly state the condition that will exist at the time of trainee performance. Conditions of performance define the facility situation, environmental aspects, and resources available to aid trainee performance. Typical conditions may include the following:
- Facility operating mode
- Safety considerations or hazards
- System and equipment status
- References available
- Environmental conditions
- Problem situations or contingencies (abnormal or emergency)
Learning objective conditions are derived from various job conditions identified during analysis.
When developing learning objective conditions, adjustments may be necessary to reflect the degree of fidelity that can be achieved in the training setting.
For example, job conditions can be simulated with high fidelity during OJT and simulator training because these settings mirror the actual job conditions. When classroom or self-paced training is used, the learning objective conditions are limited by the constraints of the classroom or self-paced environment. If an implied condition is used, it should be easily understood by all who read the objective.
Objective Standards (Ref 7)
A well-prepared learning objective includes a standard for evaluating student performance. The trainee's action should result in an output, and the required quantity or quality of that output is the standard of performance. Standards can include step-by-step processes that do not permit deviation.
Others may prescribe the product of performance and the factors for judging that product.
Standards are derived from job standards identified during analysis. Similar to the development process for conditions, learning objective standards also should be adjusted to reflect fidelity to job standards. In some cases a standard may be implied rather than written into the objective. For example, an implied standard of "without error" may be assumed for a procedural step. If an implied standard is used, it should be easily understood by all who read the objective.
Characteristics of Good Standards
Characteristic: Completeness
What is specified: Precise nature of the output; number of features the output must contain; minimum acceptable level of performance; number of steps or sequence of steps that must be covered; reference to a plant operating procedure.
Examples: Using a calculator, multiply two three-digit numbers and write the answer to the nearest tenth. Given a process sample, laboratory equipment, and reagents, analyze the sample for pH in the correct sequence and in accordance with the plant procedure.
Characteristic: Accuracy
What is specified: Implying the standard of NO ERROR; how exact the performance must be; correct numbers reflecting tolerances; value or dimensions that an acceptable answer/performance can assume (these may be qualitative).
Examples: Given the reactor plant at power, the feedwater regulating system in manual, a wide range of steam generator level readings, and a steam generator system status, calculate the steam generator narrow range level to +/- 5% of the wide range level. Given a misadjusted carburetor and the necessary tools, adjust the carburetor so the engine idles at its smoothest point in accordance with the technical manual.
Characteristic: Speed
What is specified: Amount of days, hours, minutes, or seconds allowed for performance.
Examples: Given a 200-word rough draft, type a letter without error at a minimum speed of 40 words per minute. Given a disassembled globe valve, rags, gasket material, tools, and a technical manual, reassemble the globe valve to operating condition within 30 minutes.
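The action, condition, and standard of a learning objective can be captured in a single record for tracking purposes; the sketch below is illustrative only, reusing the calculator example above, and the field names are assumptions rather than a required format.

    # Illustrative sketch: one record capturing the three parts of a
    # learning objective. Field names are assumed for the example.
    from dataclasses import dataclass

    @dataclass
    class LearningObjective:
        action: str     # observable, measurable action verb plus direct object
        condition: str  # facility situation, environment, and resources available
        standard: str   # criterion used to judge successful performance

    objective = LearningObjective(
        action="multiply two three-digit numbers",
        condition="using a calculator",
        standard="write the answer to the nearest tenth, without error",
    )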
Appendix 7 Exam Item Plausibility
Definitions:
Plausibility - Reasonable, appearing worthy of belief (seemingly true, appears to be reasonable or valid).
Implausible - Provoking disbelief.
Plausibility Guidelines:
- The distractor must not conflict with information in the stem.
- The distractor must not be a subset of another distractor or the answer.
- The distractor must be consistent with the laws of physics/science.
- The plant process referred to by the distractor must exist at the plant.
- The equipment configuration or flow path must be physically possible.
- The switch position referenced by the distractor must exist on the component or on a similar component.
- The component (e.g., power supply) in the distractor must be in the same category (e.g., safety related) as the answer, if applicable.
- A system used as a distractor must operate in a similar manner as other systems.
- The value in a distractor should correspond to an equipment set point (e.g., alarm set point).
- The value referenced in a distractor should correspond to an incorrect method of calculation (e.g., using the wrong curve, using psig vs. psia).
- The terminology (e.g., job title, action) must be used at the plant.
- The action referenced in the distractor must occur in the procedure or in a similar procedure.
- The distractor should become the correct answer if the procedure was not implemented correctly.
- The distractor should become the correct answer if an outdated equipment modification or procedure revision (i.e., within the last couple of cycles) was used instead of the correct configuration or revision.
- A minor change in the stem (e.g., containment pressure = adverse value) will make the distractor correct.
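A distractor review could be organized as a simple checklist applied to each exam item; the sketch below is illustrative only, with checklist wording paraphrased from a subset of the guidelines above, and is not a required review method.

    # Illustrative sketch: record a distractor plausibility review against a
    # subset of the guidelines above. The structure is assumed for the example.
    PLAUSIBILITY_CHECKS = [
        "does not conflict with information in the stem",
        "is not a subset of another distractor or the answer",
        "is consistent with the laws of physics/science",
        "refers to plant processes, configurations, and terminology that exist",
        "would become correct under a plausible error (e.g., wrong curve, psig vs. psia)",
    ]

    def review_distractor(results):
        # results: dict mapping each checklist statement to True/False.
        # Returns the list of failed checks; an empty list means plausible.
        return [check for check in PLAUSIBILITY_CHECKS
                if not results.get(check, False)]

    review = {check: True for check in PLAUSIBILITY_CHECKS}
    print("plausible" if not review_distractor(review) else review_distractor(review))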
APPENDIX H Resolution of Public Comments A notice of opportunity for public comment on this Interim Staff Guidance (ISG) was published in the Federal Register (insert FR Citation #) on [date] for a 30-60 day comment period. [Insert number of commenters] provided comments which were considered before issuance of this ISG in final form.
Comments on this ISG are available electronically at the NRC's electronic Reading Room at http://www.nrc.gov/reading-rm/adams.html. From this page, the public can gain entry into ADAMS, which provides text and image files of NRC's public documents. Comments were received from the following individuals or groups:
Letter No.  ADAMS No.  Commenter Affiliation  Commenter Name  Abbreviation
1
2
3
4
5
The comments and the staff responses are provided below.
Comment 1: [Each comment summary must clearly identify the entity that submitted the comment and the comment itself].
NRC Response: Comment responses should begin with a direct statement of the NRC staff's position on a comment, e.g., "the NRC staff agrees with the comment" or "the NRC staff disagrees with the comment."
If the NRC staff agrees, explain why and provide a clear statement as to how the relevant language was revised or supplemented to address the comment. Include the following language at the end of the comment response: The final ISG was changed by <describe the change; if necessary by quoting the newly revised language>.
If the NRC staff disagrees with a comment and no change was made to the ISG, then explain why and provide the following language at the end of the comment response: No change was made to the final ISG as a result of this comment.
APPENDIX I References
- 1. ANSI/ANS-3.1-2014, Selection, Qualification, and Training of Personnel for Nuclear Power Plants
- 2. Nuclear Energy Innovation and Modernization Act (NEIMA; Public Law 115-439), as amended by the Energy Act of 2020
- 3. Nuclear Energy Institute, Template for an Industry Training Program Description, NEI 06-13A Revision 2.
- 4. U.S. Code of Federal Regulations, "Domestic Licensing of Production and Utilization Facilities," Part 50, Chapter I, Title 10, "Energy."
- 5. U.S. Code of Federal Regulations, "Licenses, Certifications, and Approvals for Nuclear Power Plants," Part 52, Chapter I, Title 10, "Energy."
- 6. U.S. Code of Federal Regulations, "Operators' Licenses," Part 55, Chapter I, Title 10, "Energy."
- 7. U.S. Department of Energy, Training Program Handbook: A Systematic Approach To Training, DOE-HDBK-1078-94; August 1994
- 8. U.S. Marine Corps, Systems Approach to Training (SAT) Manual, June 2004
- 9. U.S. Nuclear Regulatory Commission, Review of Risk-Informed, Technology-Inclusive Advanced Reactor Applications - Roadmap, DANU-ISG-2022-01, September 2022
- 10. U.S. Nuclear Regulatory Commission, Human Factors Engineering Program Review Model, NUREG-0711, Revision 3.
- 11. U.S. Nuclear Regulatory Commission, Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Pressurized Water Reactors, NUREG-1122, Revision 3.
- 12. U.S. Nuclear Regulatory Commission, Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Boiling Water Reactors, NUREG-1123, Revision 3.
- 13. U.S. Nuclear Regulatory Commission, Qualification and Training of Personnel for Nuclear Power Plants. Regulatory Guide 1.8, Revision 4.
- 14. U.S. Nuclear Regulatory Commission, Operator Licensing Examination Standards for Power Reactors, NUREG-1021, Revision 12.
- 15. U.S. Nuclear Regulatory Commission, Operator Licensing Programs, DRO-ISG-2023-01
- 16. U.S. Nuclear Regulatory Commission, Cognitive Task Analysis Technical Basis and Guidance Development, RIL 2020-07
- 17. U.S. Nuclear Regulatory Commission, Risk-Informed, Technology-Inclusive Regulatory Framework For Commercial Nuclear Plants, Draft Rule, May 2022. (ML22125A000)