ML20117N069

Examiners' Handbook for Developing Operator Licensing Examinations

Issue date: 01/31/1988
From: Office of Nuclear Reactor Regulation
References: NUREG-BR-0122, NUDOCS 9609180334

UNITED STATES NUCLEAR REGULATORY COMMISSION

Examiners' Handbook for Developing Operator Licensing Examinations

JANUARY 1988

Division of Licensee Performance and Quality Evaluation
Office of Nuclear Reactor Regulation

EXAMINERS' HANDBOOK FOR DEVELOPING OPERATOR LICENSING EXAMINATIONS

Prepared for
U.S. Nuclear Regulatory Commission
Under Contract NRC-03-85-052
NRC FIN D1315

Prepared by
Professional Examination Service
475 Riverside Drive
New York, New York 10115

(Revision 4)
January 1988

ABSTRACT

A procedure is presented for systematically constructing content-valid licensing examinations for nuclear power plant reactor operators (ROs) and senior reactor operators (SROs) at pressurized water reactors and boiling water reactors. This procedure contains explicit guidance for sampling knowledge and abilities (K/As) from companion documents which define the important, safety-related, testable K/As for safe power plant operation at the RO and SRO levels. The companion documents, "Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Pressurized Water Reactors" (NUREG-1122) and "Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Boiling Water Reactors" (NUREG-1123), catalog K/As derived from a job-task analysis of the operator's work. Guidance for developing test objectives based on facility learning objectives is also provided. Using the procedures outlined in this handbook, operator licensing examinations having demonstrable content validity can be constructed.


Examiners' Handbook iii


TABLE OF CONTENTS

ABSTRACT .................................................... iii

1 INTRODUCTION .............................................. 1-1
  1.1 Content Validity of the NRC Licensing Examination
      Process ................................................ 1-1
      1.1.1 Selection of Job-Relevant Content ................. 1-2
      1.1.2 Proportional Sampling of K/As ..................... 1-2
      1.1.3 Development of Test Objectives .................... 1-3
      1.1.4 Congruence Between Test Objectives and
            Test Items ........................................ 1-3
      1.1.5 Test Reliability and Quality Control .............. 1-4
      1.1.6 Summary of Steps in Developing Licensing
            Examinations ...................................... 1-4

2 USE OF THE K/A CATALOG IN CONSTRUCTING LICENSING
  EXAMINATIONS .............................................. 2-1
  2.1 Organization of the K/A Catalogs ....................... 2-1
      2.1.1 Plant-wide Generic Knowledge and Ability K/As ..... 2-2
      2.1.2 Catalog Differences ............................... 2-3
      2.1.3 Plant Systems and Emergency Plant Evolutions:
            PWR Catalog, Section 3 ............................ 2-3
      2.1.4 Plant Systems and Emergency and Abnormal Plant
            Evolutions: BWR Catalog, Sections 3 and 4 ......... 2-8
      2.1.5 Components ........................................ 2-14
      2.1.6 Theory ............................................ 2-14
      2.1.7 RO/SRO Importance Ratings of K/As ................. 2-15
      2.1.8 Plant Specific Data ............................... 2-15
      2.1.9 Difference Ratings ................................ 2-15
  2.2 Developing a Test Outline .............................. 2-16
      2.2.1 Using the Test Outline Forms to Develop the
            Written Exam ...................................... 2-16
      2.2.2 Specific Steps for Developing a Test Outline
            or Blueprint ...................................... 2-17
      2.2.3 Summary of Procedures for Developing a Test
            Outline or Blueprint .............................. 2-24

3 DEVELOPING TESTING OBJECTIVES ............................. 3-1
  3.1 Elements of a Testing Objective ........................ 3-2
  3.2 Sources of Testing Objectives .......................... 3-2
  3.3 Selected Examples of Poor and Good Testing Objectives .. 3-3
  3.4 Testing Objectives and Level of Knowledge .............. 3-5
  3.5 Summary Guidelines ..................................... 3-7

4 WRITTEN EXAMINATION ITEMS: CONSTRUCTION AND SCORING ....... 4-1
  4.1 Objective vs. Subjective Test Items .................... 4-1
  4.2 Source Questions for Test Items ........................ 4-1
  4.3 Development of Written Test Items ...................... 4-2
  4.4 Selecting the Type of Written Examination Item to
      Construct .............................................. 4-4
  4.5 Constructing Supply-Type Items ......................... 4-5
      4.5.1 Short-Answer Items ................................ 4-5
      4.5.2 Completion ("Fill in the blank") Items ............ 4-7
  4.6 Constructing Select-Type Items ......................... 4-8
      4.6.1 Multiple-Choice Items ............................. 4-8
      4.6.2 True-False Items .................................. 4-15
      4.6.3 Matching Items .................................... 4-16

5 EXAM DEVELOPMENT CHECKLISTS ............................... 5-1
  5.1 Constructing and Scoring Written Examinations .......... 5-1
      5.1.1 General Guidance .................................. 5-1
      5.1.2 Short-Answer Questions ............................ 5-1
      5.1.3 Completion Questions .............................. 5-2
      5.1.4 Multiple-Choice Items ............................. 5-2

      5.1.5 True-False Items .................................. 5-3
      5.1.6 Matching Items .................................... 5-3

LIST OF FIGURES

Figure 1  Steps in Developing Licensing Examinations ......... 1-6

LIST OF TABLES

Table 2.1  Structure of the PWR K/A Catalog and Supplement ... 2-1
Table 2.2  Structure of the BWR K/A Catalog .................. 2-2
Table 2.3  PWR Plant Systems and Emergency Plant Evolutions
           by Safety Function ................................ 2-4
Table 2.4  Knowledge and Ability Categories for PWR Plant
           Systems ........................................... 2-7
Table 2.5  Knowledge and Ability Categories for PWR
           Emergency Plant Evolutions ........................ 2-8
Table 2.6  BWR Plant Systems by Safety Function .............. 2-9
Table 2.7  Knowledge and Ability Categories for BWR Plant
           Systems ........................................... 2-11
Table 2.8  Emergency and Abnormal Plant Evolutions
           Delineated in the BWR Catalog ..................... 2-12
Table 2.9  Knowledge and Ability Categories for BWR
           Emergency and Abnormal Plant Evolutions ........... 2-13
Table 4.1  Considerations for Selecting Type of Written
           Examination Item to Test K/As ..................... 4-19

LIST OF FORMS

Exam Outline Model: BWR - RO ................................. 2-26
K/A Record Forms: BWR - RO ................................... 2-27
Exam Outline Model: BWR - SRO ................................ 2-37
K/A Record Forms: BWR - SRO .................................. 2-38
Exam Outline Model: PWR - RO ................................. 2-48
K/A Record Forms: PWR - RO ................................... 2-49
Exam Outline Model: PWR - SRO ................................ 2-59
K/A Record Forms: PWR - SRO .................................. 2-60

1 INTRODUCTION

For a test to be considered valid, it must be shown to measure what it is intended to measure. In the case of the NRC licensing examination, the intent is to measure the candidate's knowledge and ability such that those who are licensed will perform the duties of the RO and SRO to ensure the safe operation of the plant. The general sequence of activities needed to establish the content validity of NRC licensure examinations is outlined below.

1.1 CONTENT VALIDITY AND THE NRC LICENSURE EXAMINATION PROCESS

In the last few years, there has been increasing public interest in the testing movement and in the issue of test validation in particular. One of the most challenging critical issues confronting licensure agencies is deciding which validation procedures to use for licensing examinations. In licensing situations, the critical issue is the degree to which examination scores are interpretable in terms of public protection and public safety.

The public protection function of licensing examinations is to assure that those who are licensed possess sufficient knowledge and ability to perform the job activities in a safe and effective manner. To develop examinations that serve this public protection function, the knowledge and abilities (K/As) selected for testing must be linked to and based upon a description of the most important job duties. This is accomplished through the conduct of a job/task analysis (JTA) focusing on the delineation of essential knowledge and abilities.

This approach to the development of content-valid licensing examinations is endorsed by the testing industry in the 1985 revision of the Standards for Educational and Psychological Testing, published by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education. The standards treat licensure examinations in a separate section in recognition of their importance and uniqueness. Accordingly, those seeking additional technical guidance are encouraged to consult Chapter Eleven of that document for further clarification.

Once the essential knowledge and abilities have been identified through the job/task analysis, test specifications must be developed. The test specifications consist of a content outline or blueprint indicating what proportion of items or questions shall deal with each knowledge and ability. The overall outline may also include a) specifications as to the number of items in the examination, b) the time allowed for administration, c) the format for the items or questions, d) rules for selecting knowledge and abilities to be tested based on frequency and importance ratings by subject matter experts, and, where appropriate, e) the statistical properties of test items or questions, such as difficulty and discrimination.
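As a hypothetical illustration of points a) through d), such a specification can be recorded as a small data structure and turned into per-category item counts. Every name and weight below is invented for illustration and is not drawn from the K/A catalogs:

```python
# Hypothetical test specification: the category names and weights are
# placeholders for illustration, not actual K/A catalog proportions.
test_specification = {
    "total_items": 100,                 # a) number of items
    "time_allowed_minutes": 240,        # b) administration time
    "item_format": "multiple-choice",   # c) item format
    # d) selection rules: proportion of items per content category,
    #    as might be derived from SME frequency/importance ratings.
    "category_weights": {
        "plant_systems": 0.45,
        "emergency_plant_evolutions": 0.30,
        "components": 0.10,
        "theory": 0.15,
    },
}

# Convert the blueprint proportions into a per-category item count.
items_per_category = {
    category: round(weight * test_specification["total_items"])
    for category, weight in test_specification["category_weights"].items()
}
print(items_per_category)
```

A record like this makes the blueprint auditable: the count for each category can be traced back to the stated proportion rather than to the item writer's habits.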

It is important to note that the purpose of licensing is not to distinguish among levels of competency or to identify the most qualified, but to make reliable and valid distinctions at the minimum level of competency that the agency has selected in the interests of public protection. This level is usually defined by licensure agencies in terms of education, experience, and test scores.

This handbook, in conjunction with its two companion documents, the K/A Catalog and Supplement for Pressurized Water Reactors and the K/A Catalog for Boiling Water Reactors, provides the basis for the development of content-valid licensing examinations for reactor operators and senior reactor operators consistent with the testing industry standards described above. The examinations developed using this handbook and the appropriate K/A catalog will cover the topics listed under Title 10, Code of Federal Regulations, Part 55.

1.1.1 Selection of Job-Relevant Content

First and foremost, K/As tested in a licensing examination must have been proven to be important and essential to the performance of the job or position being tested. This proof must involve a documented analysis of the job. For the purposes of validating the content of the NRC licensing examinations, the job/task analysis performed on the licensed operator position by the Institute of Nuclear Power Operations (INPO) served as the initial source of information. The INPO JTA identified more than 28,000 knowledge, skill, and ability (KSA) statements and nearly 800 tasks. The extensive number of INPO tasks and KSA statements is due, in part, to the specific purpose of the INPO analysis, which was to provide an information base for developing training programs applicable to all PWR and BWR facilities. Accordingly, many of the individual statements were too specific and/or too elementary to serve as the basis for developing licensing examinations. Perhaps most important, the job content of special interest in licensure applications is that subset of K/As required for the safe operation of the nuclear plant. Although safe performance and efficient performance may overlap considerably, any K/A that contributes to efficiency but not to safety is an inappropriate focus for the licensing examination.

The initial drafts of the K/A catalogs were reviewed by licensed SROs as well as licensed examiners from the NRC regional offices. These experts reviewed each statement for accuracy and completeness and then rated each statement with respect to its importance to safe operation. Further explanation of the content of the resulting K/A catalogs is provided in Section 2, which follows, and in the introduction to each catalog.

1.1.2 Proportional Sampling of K/As

Test items selected for inclusion in the examination should be based on the K/As contained in the appropriate catalog. Testing outside the documented K/As can jeopardize the content validity of the examination. Content validity can also be threatened if important K/As are omitted from the examination. Therefore, the sample of K/As that are tested should cover all the major K/A categories in the catalogs in a fashion consistent with their contribution to the public protection function of the examination.

Not all categories are equal in this regard. This conclusion is based on the analysis of importance and testing-emphasis ratings collected from licensed SROs, as well as from licensed examiners in the regional offices. Section 2 describes how to develop a test outline that will provide proper documentation and guidance concerning the application of proportional sampling techniques to ensure accurate content coverage.
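The proportional allocation described above can be pictured as splitting a fixed number of questions across categories in proportion to their importance ratings. The categories and ratings below are hypothetical placeholders, not values from the K/A catalogs; the sketch assumes only that an average SME importance rating is available per sampled category:

```python
# Hypothetical mean SME importance ratings per K/A category.
# These numbers are invented for illustration only.
importance = {
    "K1_physical_connections": 3.4,
    "K3_loss_effects": 3.9,
    "A2_malfunction_response": 4.5,
    "EK3_procedure_bases": 4.2,
}

def allocate_items(ratings, n_items):
    """Split n_items across categories in proportion to their ratings,
    using largest-remainder rounding so the counts sum to n_items."""
    total = sum(ratings.values())
    quotas = {k: n_items * v / total for k, v in ratings.items()}
    counts = {k: int(q) for k, q in quotas.items()}
    leftover = n_items - sum(counts.values())
    # Hand the remaining items to the categories with the largest
    # fractional parts of their quotas.
    for k in sorted(quotas, key=lambda k: quotas[k] - counts[k],
                    reverse=True)[:leftover]:
        counts[k] += 1
    return counts

counts = allocate_items(importance, 20)
```

The point of the largest-remainder step is that naive rounding of each quota can leave the exam one or two items short (or long); forcing the counts to sum to the planned total keeps the outline consistent with the test specifications.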

1.1.3 Development of Test Objectives

Test objectives provide the specific conditions for testing and the standards of performance for K/As selected for inclusion in the licensing examination. Since facility learning objectives are specific to the job requirements at that site, they should provide an excellent basis for test objectives. Examiners should refer to facility learning objectives (as described in Section 3) to determine the appropriate conditions for testing and the standard of performance expected of fully competent candidates at that plant.

1.1.4 Congruence Between Test Objectives and Test Items

Once the relevant K/As are selected and the testing objectives have been considered, the examiner's next responsibility is to develop test items for the written examination that will meet the testing objectives. The content and format of the test material must be clearly linked to the underlying purposes of the test. This means that, among other things, the test items must be relevant, unambiguous, readable, and at the level of difficulty necessary to assure accurate discrimination at the minimum level of competence selected by the examiner in the interest of public safety.

PLEASE REMEMBER: The procedures described in Section 2 for using the catalog(s) in constructing licensing examinations do permit examiners latitude in the selection of the appropriate test item format and in the development of the test. Test construction is not a mechanical process and should never be completely reduced to the blind application of testing principles. Rather, a good test is the result of the implementation of testing principles by a well-informed examiner. Accordingly, the information presented in Section 3 should be useful to examiners in evaluating such factors as the level of knowledge and mode of testing appropriate for items. Section 4 discusses the various item types, lists issues and considerations in using each type, and provides suggestions for writing good test material regardless of the format of the test items.

1.1.5 Test Reliability and Quality Control

The concept of reliability refers to making consistent interpretations of test scores across examinations and candidates; it is distinct from validity, which emphasizes the appropriateness of the content of the NRC licensing examinations. The higher the reliability of a test, the fewer errors will be made in determining whether candidates have passed or failed the licensing examination. Errors can be caused by such things as an inappropriate sampling of questions; candidate anxiety about the examination, which leads to a level of test performance lower than actual competence; lucky guesses and unlucky slips; mistakes or biases of the person administering and/or scoring the test; or any other factor affecting performance that has nothing directly to do with the candidate's level of knowledge and ability related to the safe operation of the nuclear plant. Accordingly, the examiner should do everything possible to make sure that these errors are minimized and/or eliminated to the degree possible.

Key points to remember in reducing scoring and interpretation errors have to do with quality control. For example, maintaining standardized examination development, administration, and grading procedures will enhance the reliability of the pass/fail decision that is made. Thorough and accurate sampling from the appropriate K/A catalog(s), according to the guidance provided by the test specifications, is essential for maintaining consistency from examination to examination. In a sense, every section of this handbook addresses some aspect of the quality control issue. Special reference must be made to Section 2, which describes the procedures examiners are to implement to assure that NRC examinations cover the most important K/As in a comprehensive fashion from each test construction effort to the next. Please remember, this section is the most critical with regard to the procedures that must be followed to assure the development of valid test specifications or outlines. Once the test outline has been developed, Sections 3, 4, and 5 will be very useful in constructing test items that are consistent with the test specifications. Finally, Sections 4, 5, and 6 also describe ways of assuring that candidates will be evaluated in a consistent, job-related manner.

1.1.6 Summary of Steps in Developing Licensing Examinations

The flowchart presented in Figure One illustrates the logical and preferred sequence of activities to be followed in the development of content-valid licensing examinations. The purpose of the flowchart is to highlight and overview the test development enterprise and to illustrate for examiners exactly where in the sequence their work and skills are critical to completing the process.

FIGURE ONE
STEPS IN DEVELOPING NRC LICENSING EXAMINATIONS

Step One    Conduct Job/Task Analysis To Produce K/As Necessary To Assure Safe Operation of The Plant
Step Two    Validate K/As For Completeness and Accuracy
Step Three  Validate K/As For Importance and Testing Emphasis
Step Four   Develop Test Specifications Based On Validation Data
Step Five   Develop Test Outline/Blueprint By Selecting K/As Based On The Test Specifications
Step Six    Develop and/or Select Test Questions Based On The Test Outline and Test Objectives
Step Seven  Assemble Test and Conduct Required Review
Step Eight  Administer The Examination
Step Nine   Score The Examination
Step Ten    Make Pass/Fail Determinations

STEPS COMPLETED FOR EXAMINERS BY NRC AND/OR SMEs
STEPS TO BE COMPLETED BY NRC EXAMINERS

2 USE OF THE K/A CATALOG IN CONSTRUCTING LICENSING EXAMINATIONS

This section describes the structure of the knowledge and ability (K/A) catalogs for pressurized water reactors (PWRs) and boiling water reactors (BWRs) and the development of test outlines that will ensure valid, representative coverage of test content. Any examination can include only a sample of all the relevant questions that could be asked of an RO or SRO candidate. To make maximum use of testing time and to ensure that the test adequately samples the content that might be covered, a test outline should be developed before individual questions or scenarios are written.

Use of the procedures described in this handbook will provide documentation for each question on the examination and will show how each question fits into the overall plan for the examination. It will also assure that the overall content of the examination samples each section of the catalog in proportion to the importance estimates assigned to the sections by subject matter experts. The test outline can serve as a record of the specific K/As that have been tested at a given facility. Additionally, a test outline can be used to defend the content of a particular examination, should defense become necessary, since the outline can be linked to a job/task analysis and to important K/As.

2.1 Organization of the K/A Catalogs

The structure of the K/A catalog and supplement for PWRs and the structure of the K/A catalog for BWRs are shown schematically below in Tables 2.1 and 2.2. Please open the appropriate K/A catalog and refer to it as you read through the rest of this section.

Table 2.1 Structure of the PWR K/A Catalog and Supplement

PLANT-WIDE GENERIC KNOWLEDGE AND ABILITIES (33 K/As)

SAFETY FUNCTIONS (11)
    Plant Systems (45 engineering systems)
        Task Modes (5)
        Knowledge Categories (K1 - K6)
        Ability Categories (A1 - A4)
        System-wide Generics (15)
    Emergency Plant Evolutions (38)
        Knowledge Categories (EK1 - EK3)
        Ability Categories (EA1 - EA2)
        System-wide Generics (12)

COMPONENTS (8 categories)

THEORY
    Reactor Theory Categories (8)
    Thermodynamics (10)

Table 2.2 Structure of the BWR K/A Catalog

PLANT-WIDE GENERIC KNOWLEDGE AND ABILITIES (33 K/As)

SAFETY FUNCTIONS (9)
    Plant Systems (54)
        Knowledge Categories (K1 - K6)
        Ability Categories (A1 - A4)
        System-wide Generics (15)

EMERGENCY AND ABNORMAL PLANT EVOLUTIONS (38)
    Emergency Plant Evolutions (15)
    Abnormal Plant Evolutions (23)
        Knowledge Categories (E/A K1 - E/A K3)
        Ability Categories (E/A A1 - E/A A2)
        System-wide Generics (12)

COMPONENTS (8 Categories)

THEORY
    Reactor Theory Categories (8)
    Thermodynamics Categories (10)
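The nesting shown in Tables 2.1 and 2.2 amounts to a hierarchy: catalog, then safety function, then plant system or plant evolution, then K/A category, then individual K/A statements. A minimal sketch of that hierarchy follows; the one system shown is taken from Table 2.6, but the K/A statement text is a placeholder, not an actual catalog entry:

```python
# Minimal model of the catalog hierarchy from Tables 2.1 and 2.2.
# The single K/A statement shown is a placeholder, not a real entry.
bwr_catalog = {
    "safety_functions": {
        "I: Reactivity Control": {
            "plant_systems": {
                "201001 Control rod drive hydraulic system": {
                    "K1": ["Knowledge of physical connections to ... (placeholder)"],
                    "A1": [],
                },
            },
            "evolutions": {},
        },
    },
}

def count_statements(catalog):
    """Count every individual K/A statement in the hierarchy."""
    n = 0
    for sf in catalog["safety_functions"].values():
        for group in ("plant_systems", "evolutions"):
            for categories in sf[group].values():
                for statements in categories.values():
                    n += len(statements)
    return n

print(count_statements(bwr_catalog))  # prints 1
```

Keeping the outline in this shape makes proportional sampling mechanical: a sampler can walk the hierarchy and draw statements per category rather than per page.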

Overall, the content of the PWR catalog and supplement and of the BWR catalog is similar, although the location of the information within the catalogs differs. Plant-wide generic responsibilities are delineated in Section 2, and plant systems in Section 3, of both catalogs. Emergency plant evolutions are delineated in Section 3 of the PWR catalog and Section 4 of the BWR catalog. Components and fundamental theory are located in Sections 5 and 6 of the BWR catalog and in the supplement to the PWR catalog. These locations are highlighted in the descriptions which follow.

2.1.1 Plant-wide Generic Knowledge and Ability K/As

Plant-wide generic knowledge and ability statements are located in Section 2 of both the PWR and the BWR catalogs. These statements are primarily administrative-type job requirements with broad applicability across all plant systems or emergency (and abnormal) plant evolutions. The importance of these statements is a function of their impact on the overall operation of the entire plant rather than on any specific system. For example, the plant-wide generic knowledge statement, "Knowledge of tagging and clearance procedures," can apply to any plant system.

2.1.2 Catalog Differences

Section 3 of the PWR catalog and Section 3 of the BWR catalog are both organized by safety function. However, the two catalogs display major format differences. The formats of the two catalogs are described separately in the material which follows.

2.1.3 Plant Systems and Emergency Plant Evolutions: PWR Catalog, Section 3

Major safety functions must be maintained to ensure safe nuclear power plant operation. The eleven safety functions required for a PWR plant are:

    Reactivity Control
    RCS Inventory Control
    RCS Pressure Control
    RCS Heat Transport
    Secondary System Heat Transport
    Containment Integrity
    Electrical
    Control Air
    Instrumentation
    Auxiliary Thermal
    Indirect Radioactivity Release Control

In the PWR catalog and supplement, the K/As are organized within the 11 safety functions. Each safety function contains both plant systems and emergency plant evolutions. There are a total of 45 plant systems and 38 emergency evolutions, each with its associated K/As. Every system has an INPO JTA identifying number. Emergency plant evolutions (EPEs) are linked to more than one plant system and are included in the safety functions after the plant systems. The first three digits for emergency evolutions have been designated by the generic system number "000." However, each EPE also has a unique three-digit number that corresponds with those numbers used in the INPO JTA.

Table 2.3 shows the names of each PWR plant system and emergency plant evolution organized within the 11 safety functions. Four plant systems that are included in two safety functions are denoted by an asterisk (*).

Table 2.3 PWR Plant Systems and Emergency Plant Evolutions by Safety Function

Safety Function I: Reactivity Control Systems and Malfunctions

Plant Systems
    Control rod drive system (CRDS)
    Chemical and volume control system (CVCS)*
    Rod position indication system (RPIS)

Emergency Plant Evolutions
    Continuous rod withdrawal
    Dropped control rod
    Inoperable/stuck control rod
    Reactor trip
    Emergency boration
    Anticipated transient without scram (ATWS)

Safety Function II: RCS Inventory Control Systems and Malfunctions

Plant Systems
    Chemical and volume control system*
    Engineered safety features actuation system (ESFAS)

Emergency Plant Evolutions
    Loss of reactor coolant makeup
    Pressurizer level malfunction

Safety Function III: RCS Pressure Control Systems and Malfunctions

Plant Systems

Emergency Plant Evolutions
    Pressurizer vapor space accident
    Small break LOCA
    Large break LOCA
    Pressurizer pressure control system malfunction
    Steam generator tube leak
    Steam generator tube rupture

Safety Function IV: RCS Heat Transport Systems and Malfunctions

Plant Systems
    Residual heat removal system (RHRS)

Emergency Plant Evolutions
    Reactor coolant pump malfunctions
    Loss of residual heat removal system
    Inadequate core cooling

Safety Function V: Secondary System Heat Transport Systems and Malfunctions

Plant Systems
    Steam dump system (SDS)/turbine bypass control
    Main turbine generator (MT/G) system
    Condenser air removal system (CARS)
    Condensate system
    Main feedwater (MFW) system
    Auxiliary/Emergency feedwater (AFW) system
    Service water system (SWS)

Emergency Plant Evolutions
    Steam line rupture
    Loss of condenser vacuum
    Loss of main feedwater

Safety Function VI: Containment Integrity Systems and Malfunctions

Plant Systems
    Pressurizer relief tank/quench tank system (PRTS)
    Containment cooling system (CCS)
    Ice condenser system
    Containment spray system (CSS)
    Containment iodine removal system (CIRS)
    Hydrogen recombiner and purge control system (HRPS)
    Containment system

Emergency Plant Evolutions
    Loss of containment integrity

Safety Function VII: Electrical Systems and Malfunctions

Plant Systems
    A.C. electrical distribution system
    D.C. electrical distribution system
    Emergency diesel generator (ED/G) system

Emergency Plant Evolutions
    Loss of offsite and onsite power
    Loss of offsite power
    Loss of vital A.C. electrical instrument bus
    Loss of D.C. power

Safety Function VIII: Control Air Systems and Malfunctions

Plant Systems
    Instrument air system (IAS)
    Station air system (SAS)

Emergency Plant Evolutions
    Loss of instrument air
    Control room evacuation

Safety Function IX: Instrumentation Systems and Malfunctions

Plant Systems
    Reactor protection system (RPS)
    Nuclear instrumentation system (NIS)
    Non-nuclear instrumentation system (NNIS)
    In-core temperature monitor (ITM) system
    Area radiation monitoring (ARM) system
    Process radiation monitoring (PRM) system

Emergency Plant Evolutions
    Loss of source range nuclear instrumentation
    Loss of intermediate range instrumentation
    Area radiation monitoring system alarms

Safety Function X: Auxiliary Thermal Systems and Malfunctions

Plant Systems
    Component cooling water system (CCWS)
    Circulating water system

Emergency Plant Evolutions
    Loss of component cooling water
    Loss of nuclear service water

Safety Function XI: Indirect Radioactivity Release Control Systems and Malfunctions

Plant Systems
    Containment purge system (CPS)
    Spent fuel pool cooling system (SFPCS)
    Fuel handling equipment system (FHES)
    Liquid radwaste system (LRS)
    Waste gas disposal system (WGDS)
    Fire protection system (FPS)

Emergency Plant Evolutions
    Fuel handling incident
    Accidental liquid radioactive waste release
    Accidental gaseous waste release
    Plant fire on site
    High reactor coolant activity

Tasks underlying each system have been combined into between one and five task modes:

    Generic (000)
    Startup/Shutdown (010)
    Normal Operations (020)
    Mode Change (030)
    What If/Abnormal (050)

The generic task mode is the first mode listed for a particular plant system. The generic K/As are those job requirements which were judged to be applicable to all of the tasks within that system. Most systems in the catalog delineate K/As for just the generic mode.

Below each task mode is a list of INPO JTA tasks (e.g., De-energize a CRDM). These tasks generally require similar knowledge and abilities. The task lists may provide a context within which to develop operationally oriented questions related to the associated K/As.

The K/As for each plant system are organized into six knowledge categories and four ability categories. Each K/A category has a unique number. Table 2.4 contains the K/A categories for the PWR plant systems.

Table 2.4 Knowledge and Ability Categories for PWR Plant Systems

K1 Knowledge of the physical connections and/or cause-effect relationships between the (SYSTEM) and the following systems:
K2 Knowledge of bus power supplies to the following:
K3 Knowledge of the effect that a loss of the (SYSTEM) will have on the following:
K4 Knowledge of (SYSTEM) design feature(s) and/or interlocks which provide for the following:
K5 Knowledge of the following theoretical concepts as they apply to the (SYSTEM):
K6 Knowledge of the applicable performance and design attributes of the following (SYSTEM) components:

A1 Ability to predict and/or monitor changes in parameters (to prevent exceeding design limits) associated with operating the (SYSTEM) controls including:
A2 Ability to (a) predict the impacts of the following malfunctions or operations on the (SYSTEM); and (b) based on those predictions, use procedures to correct, control, or mitigate the consequences of those malfunctions or operations:
A3 Ability to monitor automatic operation of the (SYSTEM) including:
A4 Ability to manually operate and/or monitor in the control room:

Fifteen knowledge and abilities have been identified as generic to all systems.

They are generally administrative in nature.

Unlike the plant-wide generic statements in section 2 of the PWR Catalog, the system-wide generic statements derive their importance in relation to their impact on the operation of specific systems within the plant. Therefore, the system-wide generic (SG) K/As are repeated at the end of the K/As for each plant system in the PWR Catalog.

Following the plant systems, the related emergency plant evolutions (EPEs) are included in each safety function.

Table 2.3 lists the EPEs included within each safety function. There are only three knowledge categories and two ability categories for the EPEs.

Table 2.5 contains a list of the K/A categories for the EPEs.

Table 2.5 Knowledge and Ability Categories for PWR Emergency Plant Evolutions

EK1 Knowledge of the following theoretical concepts as they apply to the emergency task:

EK2 Knowledge of the following components:

EK3 Knowledge of the basis or reasons for the following:

EA1 Ability to operate and monitor the following:

EA2 Ability to determine and interpret:

Twelve knowledge and ability statements have been identified as generic to all emergency plant evolutions. These 12 generic K/As are repeated following each emergency plant evolution in the PWR Catalog because they derive their importance in relation to the specific emergency evolutions.

2.1.4 Plant systems and emergency and abnormal plant evolutions: BWR Catalog, Sections 3 and 4

Major safety functions must be maintained to ensure safe nuclear power plant operation. The nine safety functions required for a BWR plant are:

Reactivity Control
Reactor Water Inventory Control
Reactor Pressure Control
Heat Removal from Reactor Core
Containment Integrity
Electrical
Instrumentation
Plant Service Systems
Radioactivity Release

Fifty-four plant systems have been included in Section 3 of the BWR catalog based on their relationship and importance to the safety functions.

Table 2.6 contains a list of these plant systems by safety function.

Ten plant systems each contributing to two safety functions are denoted by an asterisk (*).

Each plant system has a six-digit code number.

The first three digits are the same as those used by INPO to identify the related SYSTEM/DUTY area.

Table 2.6 BWR Plant Systems by Safety Function

Safety Function I: Reactivity Control
201001 Control rod drive hydraulic system
201003 Control rod and drive mechanism
201002 Reactor manual control system
202002 Recirculation flow control system
202001 *Recirculation system
201005

Safety Function II: Reactor Water Inventory Control
206000 *High pressure coolant injection system
209002 *High pressure core spray system
209001 *Low pressure core spray system
256000 Reactor condensate system
217000 *Reactor core isolation cooling system

Safety Function III: Reactor Pressure Control
218000 Automatic depressurization system
239001 *Main and reheat steam system
241000 Reactor/turbine pressure regulating system
239002 Relief/safety valves

Safety Function IV: Heat Removal From Reactor Core
206000 *High pressure coolant injection system
209002 *High pressure core spray system
207000 Isolation (emergency) condenser
209001 *Main and reheat steam system
245000 Main turbine generator and auxiliary systems
217000 *Reactor core isolation cooling system
202001 *Recirculation system
203000 *RHR/LPCI: Injection mode (Plant specific)
205000 Shutdown cooling system (RHR shutdown cooling mode)

Safety Function V: Containment Integrity
223001 Primary containment system and auxiliaries
223002 Primary containment isolation system/Nuclear steam supply shut-off
290002+ *Reactor vessel internals
219000 RHR/LPCI: Torus/suppression pool cooling mode
226001 RHR/LPCI: Containment spray system mode
230000 RHR/LPCI: Torus/suppression pool spray mode
290001+ Secondary containment

Safety Function VI: Electrical
262001 A.C. electrical distribution
263000 D.C. electrical distribution
264000 Emergency generators (diesel/jet)
262002 Uninterruptible power supply (A.C./D.C.)

Safety Function VII: Instrumentation
215005 Average power range monitor/local power range monitor system
215003 Intermediate range monitor system
216000 Nuclear boiler instrumentation
272000 *Rod control and information system
214000 Rod position information system
201004 Rod sequence control system (Plant specific)
201006 Rod worth minimizer system (RWM) (Plant specific)
215004 Source range monitor (SRM) system
215001 Traversing in-core probe

Safety Function VIII: Plant Service Systems
286000 Fire protection system
234000 Fuel handling equipment

Safety Function IX: Radioactivity Release
239003 MSIV leakage control system
271000 Offgas system
288000 Plant ventilation systems
272000 *Radiation Monitoring system
268000 Radwaste
290002+ *Reactor vessel internals
233000 Fuel pool cooling and clean-up
261000 Standby gas treatment system
290003+ Control room heating, ventilating and air conditioning

+This system number does not correspond to an INPO SYSTEM/DUTY area number.

Tasks listed are INPO JTA tasks (e.g., Perform lineups on the system).

These tasks generally require similar knowledge and abilities.

The task lists may provide a context within which to develop operationally oriented questions related to the associated K/As.

The information listed within each plant system is organized into six knowledge categories and four ability categories.

Each K/A category has a unique number.

The knowledge and ability categories for the BWR plant systems are listed in Table 2.7.

Table 2.7 Knowledge and Ability Categories for BWR Plant Systems

K1 Knowledge of the physical connections and/or cause-effect relationships between (SYSTEM) and the following:

K2 Knowledge of electrical power supplies to the following:

K3 Knowledge of the effect that a loss or malfunction of the (SYSTEM) will have on the following:

K4 Knowledge of (SYSTEM) design feature(s) and/or interlocks which provide for the following:

K5 Knowledge of the operational applications of the following concepts as they apply to (SYSTEM):

K6 Knowledge of the effect that a loss or malfunction of the following will have on the (SYSTEM):

A1 Ability to predict and/or monitor changes in parameters associated with operating the (SYSTEM) including:

A2 Ability to (a) predict the impacts of the following on the (SYSTEM) and (b) based on those predictions, use procedures to correct, control, or mitigate the consequences of those abnormal conditions or operations:

A3 Ability to monitor automatic operations of the (SYSTEM) including:

A4 Ability to manually operate and/or monitor in the control room:


A set of fifteen knowledge and ability statements has been identified as generic to all systems.

They are generally administrative in nature.

Unlike the plant-wide generic statements in Section 2 of the BWR Catalog, the system-wide generic statements derive their importance in relation to their impact on the operation of specific systems within the plant. The fifteen system-wide generic K/As are repeated at the end of each plant system included in the BWR Catalog.

Section 4 of the BWR Catalog contains 15 emergency plant evolutions and 23 abnormal plant evolutions (E/APEs).

The listing of emergency and abnormal plant evolutions was developed to include integrative situations crossing several plant systems and/or safety functions.

An emergency plant evolution is any condition, event or symptom which leads to entry into Emergency Procedures Guidelines (EPGs).

An abnormal plant evolution is any degraded condition, event, or symptom not directly leading to an EPG entry condition nor related to an operational condition such as power operation, start-up, hot shutdown, cold shutdown, and refuel.

Table 2.8 contains an alphabetical list of the emergency and abnormal plant evolutions included in section 4 of the BWR Catalog.

Table 2.8 Emergency and Abnormal Plant Evolutions Delineated in the BWR Catalog

EMERGENCY PLANT EVOLUTIONS

High Containment Temperature (Mark III only)
High Drywell Pressure
High Drywell Temperature
High Off-site Release Rate
High Reactor Pressure
High Secondary Containment Area Radiation Levels
High Secondary Containment Area Temperature
High Suppression Pool Water Level
Low Suppression Pool Water Level
Reactor Low Water Level
Scram Condition Present and Reactor Power Above APRM Downscale or Unknown
Secondary Containment High Differential Pressure
Secondary Containment High Sump/Area Water Level
Secondary Containment Ventilation High Radiation
Suppression Pool High Water Temperature

ABNORMAL PLANT EVOLUTIONS

Control Room Abandonment
High Containment Temperature (Mark III only)
High Drywell Pressure
High Drywell Temperature
High Off-site Release Rate
High Reactor Pressure
High Reactor Water Level
High Suppression Pool Temperature
Inadvertent Containment Isolation
Inadvertent Reactivity Addition
Incomplete SCRAM
Loss of CRD Pumps
Loss of Main Condenser Vacuum
Loss of Shutdown Cooling
Low Reactor Water Level
Main Turbine Generator Trip
Partial or Complete Loss of A.C. Power
Partial or Complete Loss of Component Cooling Water
Partial or Complete Loss of D.C. Power
Partial or Complete Loss of Forced Core Flow Circulation
Partial or Complete Loss of Instrument Air
Refueling Accidents
Scram

There are three knowledge categories and two ability categories for the E/APEs.

The K/As for the emergency and abnormal plant evolutions are listed in Table 2.9.

Table 2.9 Knowledge and Ability Categories for BWR Emergency and Abnormal Plant Evolutions

E/AK1 Knowledge of the operational implications of the following concepts as they apply to (EMERGENCY OR ABNORMAL PLANT EVOLUTION):

E/AK2 Knowledge of the interrelations between (EMERGENCY OR ABNORMAL PLANT EVOLUTION) and the following:

E/AK3 Knowledge of the reasons for the following responses as they apply to (EMERGENCY OR ABNORMAL PLANT EVOLUTION):

E/AA1 Ability to operate and/or monitor the following as they apply to (EMERGENCY OR ABNORMAL PLANT EVOLUTION):

E/AA2 Ability to determine and/or interpret the following as they apply to (EMERGENCY OR ABNORMAL PLANT EVOLUTION):

Finally, 12 knowledge and ability statements have been identified as generic to all emergency and abnormal plant evolutions.

These 12 generic K/As are repeated following each emergency and abnormal plant evolution in the BWR Catalog because they derive their importance in relation to the specific evolutions.



2.1.5 Components

Basic components such as valves and pumps are found in many systems.

The following eight categories of components, for which additional K/As are presented, are delineated in Appendix A of the PWR Catalog and Supplement and Section 5 of the BWR Catalog.

These additional K/As are more detailed or specific than those appropriate for system listings, yet at the same time they are generic to the component types.

Each component has a unique six-digit code number.

Valves
Sensors/Detectors
Controllers and Positioners
Pumps
Motors and Generators
Heat Exchangers and Condensers
Demineralizers and Ion Exchangers
Breakers, Relays and Disconnects

2.1.6 Theory

Fundamental theoretical knowledge which underlies safe performance on the job is delineated in Appendix B of the PWR Catalog and Supplement and in Section 6 of the BWR Catalog.

Each theory topic has a unique six-digit code number.

Reactor Theory
Neutrons
Neutron Life Cycle
Reactor Kinetics and Neutron Sources
Reactivity Coefficients
Control Rods
Fission Product Poisons
Fuel Depletion and Burnable Poisons
Reactor Operational Physics

Thermodynamics
Thermodynamic Units and Properties
Basic Energy Concepts
Steam
Thermodynamic Processes
Thermodynamic Cycles
Fluid Statics
Heat Transfer and Heat Exchangers
Thermal Hydraulics
Core Thermal Limits
Brittle Fracture and Vessel Thermal Stress

2.1.7 RO/SRO importance ratings of K/As

Each K/A in the PWR Catalog and Supplement and the BWR Catalog has been rated by subject matter experts as to its importance in terms of safety, including the control of reactivity and power as well as the prevention and mitigation of accidents. Importance includes the direct and indirect impact of the K/A on safe plant operations in a manner ensuring personnel and public health and safety.

K/As were rated for importance to safety, for both RO and SRO positions, using the following scale:

5 = essential
4 = very important
3 = fairly important
2 = limited importance
1 = insignificant importance

The average importance ratings are listed to the right of each K/A in the catalogs. As you will note, some importance ratings are flagged with either an asterisk (*) or a question mark (?).

Ratings marked with an asterisk indicate that subject matter experts varied considerably with respect to their opinion about the importance of that K/A.

An asterisk can also signify that more than 15 percent of the raters indicated that the knowledge or ability is not required for the RO or SRO position at their plant because it is the responsibility of someone else (e.g., SRO vs. RO).

A question mark indicates that more than 15 percent of the raters felt that they were not familiar with the knowledge or ability as related to the particular system or design feature.

These marks highlight the need for examiners to carefully review plant-specific materials to determine whether or not that knowledge or ability is indeed appropriate for a given facility.

2.1.8 Plant-specific data

In the BWR Catalog, some K/A statements apply only to specific BWR plants with applicable design features.

These statements appear with a colon (:) and are followed by plant identifying information.

For example, if a statement applies to containment, it may refer only to plants with Mark I, or II, or III containment vessels.

The statement would then be followed by ": Mark I, or II, or III". Some statements are followed by ": Plant-specific", indicating that they relate to a particular plant or a group of plants regardless of design or containment.

2.1.9 Difference ratings

In the BWR Catalog, a dagger (+) to the left of an individual knowledge or ability statement indicates that more than 20% of the raters indicated that the level of knowledge or ability required by an SRO is different from the level of knowledge or ability required by an RO.


2.2 Developing a Test Outline

This section describes a procedure for selecting K/As from the PWR Catalog and Supplement and the BWR Catalog to develop a test outline.

A test outline includes the list of K/As selected as the basis for examination item development, the location of each K/A in the catalog and its importance rating.

The development of a test outline is a vital step in any content-valid testing process.

The outline may be considered a blueprint for the actual construction of the items for the examination.

The test outline provides:

-- an explicit, documented link between each question or exam topic and one (or more) K/As that have been verified as relevant and important, based on the job/task analysis;

-- assurance that the total array of test questions/topics samples from all relevant job content. Omitting the planning involved in developing a test outline risks an imbalance in test content. Imbalance can occur in two ways. First, deficiencies may exist in test content if important topic areas are untested. Second, irrelevancies may exist in test content if topics are included that have minimal or no relationship to the purpose of the exam -- the assessment of K/As that will assure safe and competent operator performance;

-- consistency in the way exams are developed across administrations, examiners, and regions. This consistency will help reduce biases in exam content due to individual examiner likes and dislikes. It will also help ensure that all licensing decisions are based on candidates' performance on the same body of knowledge and ability, even though specific topics covered on individual exams will differ.

The specific procedures to develop a test outline described below are intended to be clear-cut and easy to follow.

These procedures are not intended to dictate how many K/As or which K/As should be included in any outline.

Rather, this handbook, current guidance (including 10 CFR 55 and NUREG-1021), plant-specific priorities, and the examiner's knowledge of RO/SRO responsibilities in ensuring the safe operation of the plant all contribute to making these decisions.

2.2.1 Using the test outline forms to develop the written exam

The test outline forms, located at the end of this section, include: (1) an Examination Outline Model leading to a blueprint for the overall outline of the written examination; and (2) a Knowledge and Ability (K/A) Record Form.

Completion of the K/A Record Form makes it possible to trace specific items in the examination to K/As in the catalog, thereby ensuring that topics included are important and relevant.
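The traceability that the form provides can be pictured as a simple lookup structure. The sketch below is an illustration only, not part of the handbook; the field names and the sample K/A identifiers are invented stand-ins for the columns of the actual K/A Record Form:

```python
from dataclasses import dataclass

# Hypothetical sketch of K/A Record Form traceability: each examination
# item points back to a catalog K/A, its location in the catalog, and its
# importance rating. Identifiers and ratings below are invented examples.

@dataclass
class KARecord:
    item_number: int      # question number on the written examination
    ka_number: str        # K/A identifier as it appears in the catalog
    catalog_section: str  # section of the catalog where the K/A is found
    importance_ro: float  # average RO importance rating from the catalog

records = [
    KARecord(1, "K1.01", "Plant systems, Group 1", 3.2),
    KARecord(2, "EK3.02", "Emergency plant evolutions", 3.8),
]

# Every item can now be traced to a rated, job-relevant K/A.
for r in records:
    print(f"Item {r.item_number} <- {r.ka_number} ({r.importance_ro})")
```

Keeping the link explicit in a record like this is what makes the later content-validity review possible: any item without a corresponding record has no documented basis.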

The Examination Outline Model provides an efficient method to scan the content of the catalog and to ensure well-balanced coverage across the examination items.

Separate sets of test outline forms have been included in this handbook, reflecting the differences in the specific responsibilities of ROs and SROs at PWR and BWR facilities.

These differences affect the overall blueprint which serves as the basis for K/A selection and examination item construction.

However, the directions for using the sets of forms are the same regardless of job position or plant type.

2.2.2 Specific procedures for developing a test outline or blueprint

The following procedures presume familiarity with the layout and general content of the catalog.

In order to develop a blueprint from which to develop a written licensing examination, review the appropriate Examination Outline Model for the facility type and the job position of the candidate. Then, after reviewing the model, identify and record on the K/A Record Form all of the plant-specific test priorities you may wish to include in the examination. Using the general guidance which has been developed, select K/As from the various sections of the catalog and complete the remainder of the K/A Record Form. Finally, review the balance of coverage for the written examination as indicated by the specific K/As which have been included in the blueprint. These procedures are described in detail below.

Review the Examination Outline Model

The Examination Outline Models have been developed to guide the construction of all written licensing examinations. Please turn now to the version of the Examination Outline Model corresponding to the exam you are developing.

Because of differences in the responsibilities of ROs and SROs at BWR and PWR facilities, separate Examination Outline Models have been developed for each position at each type of facility.

(Models for each reactor type and position level are located at the end of this section of the Handbook on pages 2-26, 2-36, 2-46, and 2-56.)

The exact structure of the model affects the overall composition of the written examination in terms of testing emphasis and K/A content selection.

The Examination Outline Model is a five-tiered structure parallel to the structure of the K/A catalog. In the case of the BWR Catalog, each layer of the model matches a specific section of the catalog. In the case of the PWR Catalog and Supplement, each layer of the model matches either a specific section of the material or readily identifiable content across several sections of the material.

The first tier of the model provides information regarding the selection of K/As related to plant-wide generic responsibilities. (Plant-wide generic statements are located in Section 2 of both the PWR Catalog and Supplement and the BWR Catalog.)

Included in the model is the percentage of the total written examination which is to be derived from the array of plant-wide generic statements.

For example, approximately 10% of the total point value of the written examination for BWR ROs should be based on the plant-wide generic K/As.

The second tier of the model provides information regarding the selection of K/As related to the plant systems in the facilities.

(Plant system K/As are located in Section 3 of both the PWR Catalog and Supplement and the BWR Catalog.) All of the plant systems included in each catalog are listed in one of three groups within the tier.

The placement of each plant system into one of the groups reflects the testing emphasis to be given to that system.

The content of Group 1 is given greater emphasis than the content of Group 2 and, in turn, Group 2 content is given greater emphasis than Group 3 content. Therefore, a K/A for a Group 1 plant system has a greater likelihood of being included in an examination than a K/A for either a Group 2 or Group 3 plant system.

Two sets of guidelines are provided within the model with regard to the testing of plant system K/As.

First, the percentage of the total written examination to be derived from all of the plant systems K/As is indicated.

Second, the percentage of the total written examination to be derived from the plant systems within each system group is indicated.

For example, approximately 38% of the total written examination for ROs at BWR facilities should be based on K/As delineated within the plant systems section of the catalog. More specifically, however, 21% of the total written examination should be derived from the K/As delineated in connection with the 25 plant systems included in Group 1; 14% in connection with the 22 plant systems included in Group 2; and 3% in connection with the 7 plant systems included in Group 3.
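As a rough illustration of how the tier and group percentages constrain an examination, the sketch below converts the BWR RO figures quoted in this section into point values. Only the percentages come from the text; the 100-point examination total is an assumed value for illustration:

```python
# A minimal sketch (not from the handbook) of the BWR RO blueprint
# percentages quoted in this section. The 100-point exam total is an
# assumption; actual examinations may use a different point value.

blueprint = {
    "Plant-wide generics": {"All": 10},
    "Plant systems": {"Group 1": 21, "Group 2": 14, "Group 3": 3},
    "Emergency/abnormal evolutions": {"Group 1": 10, "Group 2": 14, "Group 3": 3},
    "Components": {"All": 11},
    "Fundamental theory": {"Reactor Group 1": 3, "Reactor Group 2": 4,
                           "Thermo Group 1": 2, "Thermo Group 2": 3,
                           "Thermo Group 3": 2},
}

TOTAL_POINTS = 100  # assumed total point value of the written exam

for tier, groups in blueprint.items():
    tier_pct = sum(groups.values())
    print(f"{tier}: {tier_pct}% -> {TOTAL_POINTS * tier_pct / 100:.0f} points")

# The tier percentages should account for the entire examination.
assert sum(sum(g.values()) for g in blueprint.values()) == 100
```

Note how the group figures roll up consistently: the plant-systems groups (21 + 14 + 3) reproduce the 38% tier total, and the five tiers together cover 100% of the exam.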

The third tier of the model provides information regarding the selection of K/As related to emergency (and abnormal) plant evolutions in the facilities.

(Emergency plant evolutions are delineated in Section 3 of the PWR Catalog and Supplement, and emergency and abnormal plant evolutions are delineated in Section 4 of the BWR Catalog.) All of the evolutions included in each catalog are listed in one of the groups included within this tier.

As described in connection with the plant systems, placement of an emergency plant evolution into a group reflects the testing emphasis to be given to the K/As associated with the emergency evolution.

That is, material in Group 1 has a higher probability of being incorporated in a licensing exam than material in Group 2 or Group 3.

Similarly, material in Group 2 has a proportionately greater testing emphasis than material in Group 3.

The third tier of the Examination Outline Model includes two guidelines similar to those in the second tier. First, the overall proportion of the written test to be derived from K/As related to the emergency (and abnormal) plant evolutions is indicated. Second, the percentage of the overall written examination to be derived from the evolutions within each group is indicated. For example, for the RO position in BWR facilities, approximately 27% of the total written licensing examination should be drawn from K/As related to emergency and abnormal plant evolutions. However, 10% of the written exam should be derived from the 11 evolutions in Group 1, 14% from the 22 evolutions in Group 2, and 3% from the 5 evolutions included in Group 3.

The fourth tier of the Examination Outline Model provides information regarding the selection of K/A topics related to components. (K/As related to the components are found in the Supplement to the PWR Catalog and in Section 5 of the BWR Catalog.) The percentage of the written examination to be derived from the components section of the catalog is indicated. In the case of the RO position at a BWR facility, approximately 11% of the written examination is to be drawn from the 8 topics within the components section of the catalog.

The fifth tier of the Examination Outline Model provides guidance concerning the selection of K/As related to fundamental theory. (K/As related to fundamental theory are located in the Supplement to the PWR Catalog and in Section 6 of the BWR Catalog.) All of the subject areas delineated in the catalogs are included in one of the two subsections of the tier: reactor theory or thermodynamics. Additionally, within each subsection, the subject areas are sorted into priority groups.

Thus, for example, all topics in reactor theory fall into Group 1 or Group 2 as a function of each topic's importance in the overall testing program.

Three sets of guidelines are provided regarding the selection of K/As connected to fundamental theory:

the overall percentage of the total written examination related to fundamental theory, the specific percentage related to reactor theory and thermodynamics, and the percentage of the written examination to be derived from Group 1 and Group 2 K/As.

For example, looking again at the Examination Outline Model for the RO position at a BWR facility, 14% of the total written exam is to be developed in connection with the fundamental theory section of the catalog. However, 7% is to be related to reactor theory and 7% is to be related to thermodynamics. Within the reactor theory subsection, 3% is to be based on the Group 1 topics and 4% is to be based on the Group 2 topics. Within the thermodynamics subsection, 2% is to be based on K/As derived from the Group 1 topics and 3% and 2% are to be based on the K/As derived from the Group 2 and Group 3 topics, respectively.

These sampling patterns are consistent with the notion of proportional sampling described throughout the model.
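One hedged reading of this proportional-sampling idea, sketched in Python: the number of questions drawn from each group is made proportional to its blueprint percentage. The group labels and question count are assumptions, and largest-remainder rounding is our choice of scheme, not the handbook's:

```python
# Hypothetical sketch of proportional sampling across priority groups.
# The rounding scheme (largest remainder) and example numbers are assumed.

def allocate_questions(group_pcts, n_questions):
    """Split n_questions across groups in proportion to their percentages,
    using largest-remainder rounding so the counts sum exactly."""
    total = sum(group_pcts.values())
    exact = {g: n_questions * p / total for g, p in group_pcts.items()}
    alloc = {g: int(x) for g, x in exact.items()}
    leftover = n_questions - sum(alloc.values())
    # hand the leftover questions to the largest fractional remainders
    for g in sorted(exact, key=lambda g: exact[g] - alloc[g], reverse=True)[:leftover]:
        alloc[g] += 1
    return alloc

theory = {"Reactor Group 1": 3, "Reactor Group 2": 4,
          "Thermo Group 1": 2, "Thermo Group 2": 3, "Thermo Group 3": 2}
print(allocate_questions(theory, 14))
# With 14 questions the split matches the percentages exactly: 3, 4, 2, 3, 2.
```

For question counts that do not divide evenly, the largest-remainder step keeps the allocation as close to proportional as whole questions allow.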



Identify plant-specific test priorities

Before selecting K/As to include in the test outline, you should identify any plant-specific, high priority items that you may want to include in the licensing examination.

High priority events, systems, problems, or new technology developments are all potential topics for an examination.

Items identified as plant-specific test priorities should be based on such considerations as (1) relevant recent licensee event reports (LERs) or other incident reports that reflect operator K/As, (2) system modifications or new procedures that have been implemented at the plant, and/or (3) any other testable areas excluded from other examinations over the past two years.

However, your list of high priority topics need not be too extensive, nor is it necessary to identify high priority topics for every examination.

As you identify a high priority topic, make a decision about where in the exam it might be tested, and note the topic on the appropriate page of the K/A Record Form.

For example, if you decide that one high priority item you have identified is best tested as part of the emergency (and abnormal) plant evolution section of the exam, write the topic down on the part of the K/A Record Form used to record emergency (and abnormal) plant evolution K/As.

If you decide that a recent system modification would be best tested as part of the plant systems section of the examination, note the topic on the K/A Record Form for that section of the exam.

Once you have identified all plant-specific priority topics, look through the catalog to locate a K/A that corresponds to each topic.

First, locate the appropriate section of the catalog in which you expect to find the topic.

Then, review the K/A categories and the individual K/As in that section of the catalog to find the specific one that best represents the topic.

If you find two or more K/A statements that represent your topic, select the one that you feel is more specific to your topic or more accurate.

Finally, complete the required information on the appropriate part of the K/A Record Form as described below.

Review the general guidance for the selection of K/As

As you review the catalog to locate K/As for construction of the examination, keep in mind the following general criteria:

Importance

Importance is determined on the basis of the importance ratings listed in the catalog and any plant-specific priorities you have previously identified. You should select K/As from the catalog in accordance with their importance ratings; that is, K/As with the highest average importance ratings should be selected first.

PLEASE REMEMBER: The average importance rating reflects the judgments of subject matter experts as to the contribution that the K/A makes to the performance of the RO and SRO in a manner assuring the safe operation of the plant and public and personnel health and safety. These ratings can range from a "5", which represents a knowledge or ability essential to the safe operation of the plant and public and personnel health and safety, to a "1", indicating a knowledge or ability considered insignificant with regard to assuring the safe operation of the plant and public and personnel health and safety.

In making selections, you should use K/As with ratings of at least 2.5.

K/As with importance ratings less than 2.5 can be used, but justification for their inclusion must be based on other information beyond that contained in the catalog.

That is, K/As with ratings less than 2.5 can be selected provided that their plant-specific importance can be documented or that they can be clearly traced to licensee event reports (LERs) and/or other operational data available from the plant. It is also required that the selection and documentation of K/As with ratings less than 2.5 be approved by the Section Leader.
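The 2.5-rating rule can be pictured as a simple screen. This is a sketch under assumed data; the K/A identifiers and ratings shown are invented, not catalog entries:

```python
# Hypothetical illustration of the 2.5-rating selection rule: K/As rated
# 2.5 or higher are directly eligible; lower-rated K/As are set aside
# pending plant-specific justification and Section Leader approval.
# The identifiers and ratings below are invented examples.

candidates = [
    {"ka": "K1.01", "importance_ro": 3.4},
    {"ka": "K3.02", "importance_ro": 2.1},
    {"ka": "A2.05", "importance_ro": 4.2},
]

eligible = [c["ka"] for c in candidates if c["importance_ro"] >= 2.5]
needs_justification = [c["ka"] for c in candidates if c["importance_ro"] < 2.5]

print(eligible)             # ['K1.01', 'A2.05']
print(needs_justification)  # ['K3.02']
```

The point of the second list is that it is not a rejection: a low-rated K/A can still be used, but only with documented plant-specific grounds and approval.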

A final point to consider with regard to the importance issue is to review the selected K/As that are marked by either an asterisk or "Plant-specific". The purpose of this review is to assure the importance and applicability of the selected K/As to the plant by making a special effort to check them against the plant reference material.

Differentiation

In addition to being important, a K/A should lead to an examination item which differentiates between competent and less-than-competent candidates. Try to select a high percentage of K/As that will differentiate among candidates in terms of their competence. Additional guidance on developing differentiating items is provided in Sections 3 and 4 of this handbook.

Job level

Different Examination Outline Models have been developed for guiding the selection of K/As for testing ROs and SROs. The models reflect differences in the testing emphasis related to the specific responsibilities of the ROs and the SROs. Each model contains information about the relative percentages of the examination that are to be related to each of the five sections of the catalog and more specific information about the breakdown of the selections within each section of the catalog. If you are developing an examination for a BWR facility, you may find it useful to consider the K/As which are flagged by a dagger (+) because (as was noted previously) these K/As were judged to represent different levels of either knowledge or ability for the RO and the SRO. The examination items developed from these K/As should be different at the RO and the SRO levels.


Operationally oriented

Choose K/As that have an explicit, operationally oriented basis.

Even when selecting K/As for the theory sections of your exams, pick K/As that have a direct link to safe and competent operator performance.

The task list at the beginning of each system can help you tie individual K/As to operator performance by providing an operational base to the K/As.

Review the task lists before you review the K/As which have been delineated for each system.

Selecting K/As for the written exam

When selecting K/As for the written examination, remember that:

-- written examinations are especially useful for testing K/As that are difficult to judge on the basis of behavior alone (e.g., trying to determine certain aspects of candidate knowledge of basic reactor theory from performance on the simulator examination);

-- written examinations are appropriate for testing written abilities and skills (e.g., steam table calculations);

-- written examinations are appropriate for testing factual information.

There are no hard and fast rules regarding the choice of exam mode for specific knowledge or ability categories.

In general, knowledge statements can easily be tested on the written examination.

However, ability statements may also be tested on the written examination.

Select K/As and complete the K/A Record Form
The following guidance is for the selection of specific K/As in each section of the catalog. Turn to the appropriate page of the K/A Record Form corresponding to the examination you are developing as you read these instructions. (Separate versions of the K/A Record Form have been prepared for licensing examinations for ROs and SROs at BWR and PWR facilities. See the pages following the Examination Outline Models on pages 2-26, 2-36, 2-46, and 2-56 for samples of these K/A Record Forms.) Note: when you actually construct an examination, there is no predetermined sequence for completing the pages of the K/A Record Form.

Plant-wide generic K/As
Review the entire list of 33 plant-wide generic K/A statements. Select enough K/As to generate items for the written examination. The percentage of points on the written examination related to this material is indicated on the Examination Outline Model and on the K/A Record Form. To complete the K/A Record Form, you have only to check the numbers of the statements you wish to use to generate the actual items. The importance rating of each K/A statement has already been recorded in the appropriate column. If you have identified any plant-specific priorities in this area, be sure to check the most appropriate K/As in order to cover those topics.

Plant systems
Review the names of the plant systems in Groups 1, 2, and 3 as they appear on the K/A Record Form. Turn to the appropriate section of the catalog and select enough K/As for the plant systems in each group to generate items for the written examination. (Later on, you will have to locate plant reference material in order to develop questions from the K/As; it may therefore be helpful to select additional K/As to use in case the necessary plant reference material is not available.) The percentage of the written examination to be related to the K/As for the plant systems in each group is indicated on the Examination Outline Model and repeated on the K/A Record Form.

When selecting K/As related to plant systems, select at least one topic from every K/A category. That is, select at least one K1 topic, one K2 topic, one K3 topic, etc., one A1 topic, one A2 topic, etc., and one system-wide generic topic. Because the K/As with the highest importance ratings generally fall into the K3, A4, and system-wide generic categories, you may wish to emphasize these categories in any additional sampling required to complete the K/A Record Form.

Finally, select K/A topics from many systems. Avoid selecting more than two or three K/A topics from any one system unless they are specifically related to plant-specific priorities. If you have identified plant-specific priorities in this area, be sure to select K/As that correspond to the priorities.

To record your K/A selection, fill in the system number, the K/A number, a brief summary statement of the topic, and the importance rating as it appears in the catalog for the job level of the exam you are developing.
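The selection rules above (at least one topic from every K/A category, with additional picks weighted toward the highly rated K3, A4, and system-wide generic categories) amount to a small stratified-sampling procedure. The sketch below is purely illustrative; the catalog excerpt, topic names, and importance ratings are invented for the example and are not taken from the K/A catalogs.

```python
import random

# Invented excerpt of a per-system K/A delineation: (category, topic, importance rating).
catalog = [
    ("K1", "Physical connections to other systems", 2.9),
    ("K2", "Power supplies to system components", 3.1),
    ("K3", "Effect of system malfunction on ECCS", 3.8),
    ("K3", "Effect of system malfunction on RPS", 3.7),
    ("A1", "Predict parameter trends during operation", 3.2),
    ("A4", "Manipulate system controls from the control room", 4.0),
    ("SG", "System-wide generic: limits and precautions", 3.6),
]

def select_kas(catalog, total, rng):
    # First guarantee at least one topic from every K/A category...
    by_category = {}
    for ka in catalog:
        by_category.setdefault(ka[0], []).append(ka)
    picks = [rng.choice(topics) for topics in by_category.values()]
    # ...then fill any remaining slots, weighting by importance rating,
    # which tends to emphasize the highly rated categories.
    remaining = [ka for ka in catalog if ka not in picks]
    while len(picks) < total and remaining:
        pick = rng.choices(remaining, weights=[ka[2] for ka in remaining])[0]
        picks.append(pick)
        remaining.remove(pick)
    return picks

picks = select_kas(catalog, total=7, rng=random.Random(1))
print(len(picks), len({cat for cat, _, _ in picks}))  # 7 topics covering all 6 categories
```

The guarantee step mirrors the handbook's "at least one topic per category" rule; the weighted fill is one way to realize the suggested emphasis on high-importance categories.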

Emergency (and abnormal) plant evolutions
Review the names of the emergency (and abnormal) plant evolutions in each group as they appear on the K/A Record Form. Turn to the appropriate section of the catalog and select enough K/As for the evolutions in each group to generate examination items. Because of the difficulty of locating plant reference materials, you may wish to identify extra K/As at this time. The percentage of the written examination to be related to K/As in each group of emergency (and abnormal) plant evolutions is indicated on both the Examination Outline Model and the K/A Record Form.


When selecting K/As related to emergency and abnormal plant evolutions, select at least one topic in every K/A category. That is, select at least one E/A K1, one E/A K2, etc., one E/A A1, etc., and one system-wide generic statement.

Generally, the statements in the K2 and system-wide generic categories have the highest importance ratings. Therefore, you may wish to emphasize K/As from these categories when selecting additional statements.

Finally, select topics from many evolutions. Avoid selecting more than two or three K/As from a single plant evolution unless they relate to a plant-specific priority. Be sure to select K/As to match any plant-specific priorities you may have noted.

To record your K/A selection, complete the E/A plant evolution number, the K/A number, a brief summary statement of the topic, and the importance rating as it appears in the catalog for the job level of the exam you are developing.

Components
Review the list of components as it appears on the K/A Record Form. Select enough K/As to generate the required examination items. The total percentage of the written examination related to this area is indicated on the Examination Outline Model.

When selecting K/As related to the component topics, sample across as many component topics as is practical. Select K/As for any plant-specific component priorities as required.

To record your K/A selection, complete the component topic number, the specific K/A number, a brief summary statement of the topic, and the importance rating as it appears in the catalog for the job level of the exam you are developing.

Theory
Review the names of the fundamental theory topics as they appear in each group on the K/A Record Form. Turn to the delineations in the catalog and select a sufficient number of K/As to generate the examination items. The percentage of the written examination related to each group of fundamental theory topics is indicated on the Examination Outline Model and on the K/A Record Form.

When selecting K/As related to the theory topics, sample across as many different topics as is practical. Select K/As for any plant-specific priorities you may have noted in this area.

To record your K/A selection, complete the theory topic number, the specific K/A number, a brief summary statement of the topic, and the importance rating as it appears in the catalog for the job level of the exam you are developing.


Check the K/As for balance of coverage
As you near completion of the test blueprint, check the adequacy and balance of test content coverage. Consider the coverage outlined across the five separate sections of the K/A Record Form as well as the coverage within each section of the form.

2.2.3 Summary of procedures for developing a test outline or blueprint

1. Review the appropriate Examination Outline Model for the licensing examination you are developing.

2. Identify plant-specific priorities.

3. Review the general guidance for the selection of K/As, including information regarding differentiation, job level, operational orientation, and selection of K/As for written examinations.

4. Select the K/As for each of the five tiers of the outline and complete the K/A Record Form.

5. Check the K/As for balance of coverage.
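As a minimal illustration of steps 1, 4, and 5, the blueprint bookkeeping can be sketched as follows. The tier percentages are those of the BWR reactor-operator Examination Outline Model; the record-form entries and the required-category list are invented for the example. The check simply confirms that the five tiers total 100 percent and reports any K/A category with no selected topic.

```python
# Tier percentages from the BWR reactor operator Examination Outline Model.
OUTLINE_MODEL = {
    "plant-wide generics": 10,
    "plant systems": 38,
    "emergency/abnormal plant evolutions": 27,
    "components": 11,
    "fundamental theory": 14,
}

def tiers_balanced(model):
    """Steps 1 and 5: the five tiers of the outline must account for 100% of the exam."""
    return sum(model.values()) == 100

def missing_categories(record_form, required):
    """Steps 4 and 5: report any required K/A category with no selected topic."""
    selected = {entry["category"] for entry in record_form}
    return sorted(set(required) - selected)

# Invented K/A Record Form entries for one plant-systems group.
record_form = [
    {"system": "212000", "category": "K1", "topic": "RPS logic channels", "rating": 3.4},
    {"system": "203000", "category": "K3", "topic": "Effect of LPCI on reactor level", "rating": 3.8},
    {"system": "218000", "category": "A4", "topic": "Manual ADS initiation", "rating": 4.1},
]

print(tiers_balanced(OUTLINE_MODEL))                                    # True
print(missing_categories(record_form, ["K1", "K2", "K3", "A1", "A4"]))  # ['A1', 'K2']
```

In this toy run the tiers balance, but the draft form still lacks an A1 and a K2 selection, which is exactly the kind of gap the balance check in step 5 is meant to surface.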


EXAMINATION OUTLINE MODEL: BWR - REACTOR OPERATOR

[Figure: tiered bar chart of examination content weights. Plant-Wide Generics 10%; Plant Systems 38% (categories K1-K6, A1-A4, system-wide generic); Emergency/Abnormal Plant Evolutions 27% (categories K1-K6, A1-A4, generic); Components 11% (categories K1-K3, A1-A2, generic); Theory 14% (Reactor Theory 7%, Thermodynamics 7%).]

Knowledge and Abilities Record Form
PLANT-WIDE GENERIC RESPONSIBILITIES (294001)
BWR - Reactor Operator - 10%

Check if included    K/A #    Statement    Rating

K1.01  Knowledge of how to conduct and verify valve lineups.  3.7
K1.02  Knowledge of tagging and clearance procedures.  3.9
K1.03  Knowledge of 10 CFR 20 and related facility radiation control requirements.  3.3
K1.04  Knowledge of facility ALARA program.  3.3
K1.05  Knowledge of facility requirements for controlling access to vital/control areas.  3.2
K1.06  Knowledge of safety procedures related to rotating equipment.  3.2
K1.07  Knowledge of safety procedures related to electrical equipment.  3.3
K1.08  Knowledge of safety procedures related to high temperature.  3.1
K1.09  Knowledge of safety procedures related to high pressure.  3.4
K1.10  Knowledge of safety procedures related to caustic solutions.  3.1
K1.11  Knowledge of safety procedures related to chlorine.  2.9*
K1.12  Knowledge of safety procedures related to noise.  2.7
K1.13  Knowledge of safety procedures related to oxygen-deficient environment.  3.2
K1.14  Knowledge of safety procedures related to confined spaces.  3.2
K1.15  Knowledge of safety procedures related to hydrogen.  3.4
K1.16  Knowledge of facility protection requirements, including fire brigade and portable fire-fighting equipment usage.  3.5
K1.17  Knowledge of the equipment rotation schedules and the reasoning behind the rotation procedure.  2.3

A1.01  Ability to obtain and verify control procedure copy.  2.9
A1.02  Ability to execute procedural steps.  4.2*
A1.03  Ability to locate and use procedures and station directives related to shift staffing and activities.  2.7
A1.04  Ability to operate the plant phone, paging system, and two-way radio.  3.1*
A1.05  Ability to make accurate, clear, and concise verbal reports.  3.4
A1.06  Ability to maintain accurate, clear and concise logs, records, status boards and reports.  3.4
A1.07  Ability to obtain and interpret station electrical and mechanical drawings.  3.0
A1.08  Ability to obtain and interpret station reference material such as graphs, monographs, and tables which contain system performance data.  3.1
A1.09  Ability to coordinate personnel activities inside the control room.  3.3
A1.10  Ability to coordinate personnel activities outside the control room.  3.6
A1.11  Ability to direct personnel activities inside the control room.  3.3
A1.12  Ability to direct personnel activities outside the control room.  3.5
A1.13  Ability to locate control room switches, controls, and indications, and to determine that they are correctly reflecting the desired plant lineup.  4.5*
A1.14  Ability to maintain primary and secondary plant chemistry within allowable limits.  2.9*
A1.15  Ability to use plant computer to obtain and evaluate parametric information on system and component status.  3.2
A1.16  Ability to take actions called for in the Facility Emergency Plan, including (if required) supporting or acting as the Emergency Coordinator.  2.9*

Knowledge and Abilities Record Form
PLANT SYSTEMS
BWR - Reactor Operator - 38%

Plant-Specific Priorities:    System #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: System #, K/A #, K/A Topic, Rating.)

Group I Plant Systems - 21%
201001 Control Rod Drive Hydraulic
201002 Reactor Manual Control
201005 Rod Control and Information
202002 Recirculation Flow Control
203000 RHR/LPCI: Injection Mode
206000 High Pressure Coolant Injection
207000 Isolation (Emergency) Condenser
209000 Low Pressure Core Spray
209002 High Pressure Core Spray
211000 Standby Liquid Control
212000 Reactor Protection
215003 Intermediate Range Monitor
215004 Source Range Monitor
215005 Average Power Range Monitor / Local Power Range Monitor
216000 Nuclear Boiler Instrumentation
217000 Reactor Core Isolation Cooling
218000 Automatic Depressurization
223001 Primary Containment and Auxiliaries
223002 Primary Containment Isolation / Nuclear Steam Supply Shut-off
239002 Relief/Safety Valves
241000 Reactor/Turbine Pressure Regul.
259002 Reactor Water Level Control
261000 Standby Gas Treatment

Group II Plant Systems - 14%
201003 Control Rod and Drive Mechanism
201004 Rod Sequence Control
201006 Rod Worth Minimizer
202001 Recirculation
204000 Reactor Water Cleanup
205000 Shutdown Cooling System (RHR Shutdown Cooling Mode)
214000 Rod Position Information
215002 Rod Block Monitor
219000 RHR/LPCI: Torus/Suppression Pool Cooling Mode
226001 RHR/LPCI: Containment Spray
230000 RHR/LPCI: Torus/Suppression Pool Spray Mode
239001 Main and Reheat Steam
245000 Main Turbine Generator and Auxiliary
256000 Reactor Condensate
262001 A.C. Electrical Distribution
262002 Uninterruptable Power Supply (A.C./D.C.)
263000 D.C. Electrical Distribution
271000 Offgas
272000 Radiation Monitoring
286000 Fire Protection
290001 Secondary Containment
290003 Control Room HVAC

Group III Plant Systems - 3%
215001 Traversing In-Core Probe
233000 Fuel Pool Cooling and Clean-up
234000 Fuel Handling Equipment
239003 MSIV Leakage Control
268000 Radwaste
288000 Plant Ventilation
290002 Reactor Vessel Internals

Knowledge and Abilities Record Form
EMERGENCY AND ABNORMAL PLANT EVOLUTIONS
BWR - Reactor Operator - 27%

Plant-Specific Priorities:    E/A #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: E/A #, K/A #, K/A Topic, Rating.)

Group I Emergency and Abnormal Plant Evolutions - 10%
295005 Main Turbine Generator Trip
295006 SCRAM
295007 High Reactor Pressure
295009 Low Reactor Water Level
295010 High Drywell Pressure
295014 Inadvertent Reactivity Addition
295015 Incomplete SCRAM
295024 High Drywell Pressure
295025 High Reactor Pressure
295031 Reactor Low Water Level
295037 SCRAM Condition Present and Reactor Power Above APRM Downscale or Unknown

Group II Emergency and Abnormal Plant Evolutions - 14%
295001 Partial or Complete Loss of Forced Core Flow Circulation
295002 Loss of Main Condenser Vacuum
295003 Partial or Complete Loss of AC Power
295004 Partial or Complete Loss of DC Power
295008 High Reactor Water Level
295011 High Containment Temperature (Mark III Containment Only)
295012 High Drywell Temperature
295013 High Suppression Pool Temperature
295016 Control Room Abandonment
295017 High Off-Site Release Rate
295018 Partial or Complete Loss of Component Cooling Water
295019 Partial or Complete Loss of Instrument Air
295020 Inadvertent Containment Isolation
295022 Loss of CRD Pumps
295026 Suppression Pool High Water Temperature
295027 High Containment Temperature (Mark III Containment Only)
295028 High Drywell Temperature
295029 High Suppression Pool Water Level
295030 Low Suppression Pool Water Level
295033 High Secondary Containment Area Radiation Levels
295034 Secondary Containment Ventilation High Radiation
295038 High Off-Site Release Rate

Group III Emergency and Abnormal Plant Evolutions - 3%
295021 Loss of Shutdown Cooling
295023 Refueling Accidents
295032 High Secondary Containment Area Temperature
295035 Secondary Containment High Differential Pressure
295036 Secondary Containment High Sump/Area Water Level

Knowledge and Abilities Record Form
COMPONENTS
BWR - Reactor Operator - 11%

Plant-Specific Priorities:    Comp. #    K/A #    K/A Topic    Rating
(Record selected K/As below: Comp. #, K/A #, K/A Topic, Rating.)

Group I Components - 11%
291001 Valves
291002 Sensors/Detectors
291003 Controllers and Positioners
291004 Pumps
291005 Motors and Generators
291006 Heat Exchangers and Condensers
291007 Demineralizers and Ion Exchangers
291008 Breakers, Relays and Disconnects

Knowledge and Abilities Record Form
FUNDAMENTAL THEORY
BWR - Reactor Operator - 14%

Plant-Specific Priorities:    Theory #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: Theory #, K/A #, K/A Topic, Rating.)

Group I Reactor Theory - 3%
292004 Reactivity Coefficients
292005 Control Rods
292008 Reactor Operational Physics

Group II Reactor Theory - 4%
292001 Neutrons
292002 Neutron Life Cycle
292003 Reactor Kinetics and Neutron Sources
292006 Fission Product Poisons
292007 Fuel Depletion and Burnable Poisons

Group I Thermodynamics - 2%
293007 Heat Transfer and Heat Exchangers
293009 Core Thermal Limits

Group II Thermodynamics - 3%
293003 Steam
293004 Thermodynamic Processes
293008 Thermal Hydraulics
293010 Brittle Fracture and Vessel Thermal Stress

Group III Thermodynamics - 2%
293001 Thermodynamic Units and Properties
293002 Basic Energy Concepts
293005 Thermodynamic Cycles
293006 Fluid Statics

EXAMINATION OUTLINE MODEL: BWR - SENIOR REACTOR OPERATOR

[Figure: tiered bar chart of examination content weights. Plant-Wide Generics 13%; Plant Systems 30% (categories K1-K6, A1-A4, system-wide generic); Emergency/Abnormal Plant Evolutions 33% (categories K1-K6, A1-A4, generic); Components 10% (categories K1-K3, A1-A2, generic); Theory 14% (Reactor Theory 7%, Thermodynamics 7%).]

Knowledge and Abilities Record Form
PLANT-WIDE GENERIC RESPONSIBILITIES (294001)
BWR - Senior Reactor Operator - 13%

Check if included    K/A #    Statement    Rating

K1.01  Knowledge of how to conduct and verify valve lineups.  3.7
K1.02  Knowledge of tagging and clearance procedures.  4.5*
K1.03  Knowledge of 10 CFR 20 and related facility radiation control requirements.  3.8
K1.04  Knowledge of facility ALARA program.  3.6
K1.05  Knowledge of facility requirements for controlling access to vital/control areas.  3.7
K1.06  Knowledge of safety procedures related to rotating equipment.  3.4
K1.07  Knowledge of safety procedures related to electrical equipment.  3.6
K1.08  Knowledge of safety procedures related to high temperature.  3.4
K1.09  Knowledge of safety procedures related to high pressure.  3.8
K1.10  Knowledge of safety procedures related to caustic solutions.  3.4
K1.11  Knowledge of safety procedures related to chlorine.  3.3*
K1.12  Knowledge of safety procedures related to noise.  3.1*
K1.13  Knowledge of safety procedures related to oxygen-deficient environment.  3.6
K1.14  Knowledge of safety procedures related to confined spaces.  3.4
K1.15  Knowledge of safety procedures related to hydrogen.  3.8
K1.16  Knowledge of facility protection requirements, including fire brigade and portable fire-fighting equipment usage.  3.8
K1.17  Knowledge of the equipment rotation schedules and the reasoning behind the rotation procedure.  2.6

A1.01  Ability to obtain and verify control procedure copy.  3.4
A1.02  Ability to execute procedural steps.  4.2*
A1.03  Ability to locate and use procedures and station directives related to shift staffing and activities.  3.7
A1.04  Ability to operate the plant phone, paging system, and two-way radio.  3.2*
A1.05  Ability to make accurate, clear, and concise verbal reports.  3.8
A1.06  Ability to maintain accurate, clear and concise logs, records, status boards and reports.  3.6
A1.07  Ability to obtain and interpret station electrical and mechanical drawings.  3.7
A1.08  Ability to obtain and interpret station reference material such as graphs, monographs, and tables which contain system performance data.  3.6
A1.09  Ability to coordinate personnel activities inside the control room.  4.2*
A1.10  Ability to coordinate personnel activities outside the control room.  4.2*
A1.11  Ability to direct personnel activities inside the control room.  4.3*
A1.12  Ability to direct personnel activities outside the control room.  4.2
A1.13  Ability to locate control room switches, controls, and indications, and to determine that they are correctly reflecting the desired plant lineup.  4.3*
A1.14  Ability to maintain primary and secondary plant chemistry within allowable limits.  3.4
A1.15  Ability to use plant computer to obtain and evaluate parametric information on system and component status.  3.4
A1.16  Ability to take actions called for in the Facility Emergency Plan, including (if required) supporting or acting as the Emergency Coordinator.  4.7*

Knowledge and Abilities Record Form
PLANT SYSTEMS
BWR - Senior Reactor Operator - 30%

Plant-Specific Priorities:    System #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: System #, K/A #, K/A Topic, Rating.)

Group I Plant Systems - 17%
201005 Rod Control and Information
202002 Recirculation Flow Control
203000 RHR/LPCI: Injection Mode
206000 High Pressure Coolant Injection
207000 Isolation (Emergency) Condenser
209000 Low Pressure Core Spray
209002 High Pressure Core Spray
211000 Standby Liquid Control
212000 Reactor Protection
215004 Source Range Monitor
215005 Average Power Range Monitor / Local Power Range Monitor
216000 Nuclear Boiler Instrumentation
217000 Reactor Core Isolation Cooling
218000 Automatic Depressurization
223001 Primary Containment and Auxiliaries
223002 Primary Containment Isolation / Nuclear Steam Supply Shut-off
226001 RHR/LPCI: Containment Spray System Mode
239002 Relief/Safety Valves
241000 Reactor/Turbine Pressure Regul.
259002 Reactor Water Level Control
261000 Standby Gas Treatment
262001 A.C. Electrical Distribution
290001 Secondary Containment

Group II Plant Systems - 10%
201001 Control Rod Drive Hydraulic
201002 Reactor Manual Control
201004 Rod Sequence Control
201006 Rod Worth Minimizer
202001 Recirculation
204000 Reactor Water Cleanup
205000 Shutdown Cooling
214000 Rod Position Information
215002 Rod Block Monitor
215003 Intermediate Range Monitor
219000 RHR/LPCI: Torus/Suppression Pool Cooling Mode
230000 RHR/LPCI: Torus/Suppression Pool Spray Mode
234000 Fuel Handling Equipment
239003 MSIV Leakage Control
245000 Main Turbine Generator and Auxiliary
259001 Reactor Feedwater
262002 Uninterruptable Power Supply (A.C./D.C.)
263000 D.C. Electrical Distribution
271000 Offgas
272000 Radiation Monitoring
286000 Fire Protection
290003 Control Room HVAC

Group III Plant Systems - 3%
201003 Control Rod and Drive Mechanism
215001 Traversing In-Core Probe
233000 Fuel Pool Cooling and Clean-up
239001 Main and Reheat Steam
256000 Reactor Condensate
268000 Radwaste
288000 Plant Ventilation
290002 Reactor Vessel Internals

Knowledge and Abilities Record Form
EMERGENCY AND ABNORMAL PLANT EVOLUTIONS
BWR - Senior Reactor Operator - 33%

Plant-Specific Priorities:    E/A #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: E/A #, K/A #, K/A Topic, Rating.)

Group I Emergency and Abnormal Plant Evolutions - 20%
295003 Partial or Complete Loss of AC Power
295006 SCRAM
295007 High Reactor Pressure
295009 Low Reactor Water Level
295010 High Drywell Pressure
295013 High Suppression Pool Temperature
295014 Inadvertent Reactivity Addition
295015 Incomplete SCRAM
295016 Control Room Abandonment
295017 High Off-Site Release Rate
295023 Refueling Accidents
295024 High Drywell Pressure
295025 High Reactor Pressure
295026 Suppression Pool High Water Temperature
295027 High Containment Temperature (Mark III Containment Only)
295030 Low Suppression Pool Water Level
295031 Reactor Low Water Level
295037 SCRAM Condition Present and Reactor Power Above APRM Downscale or Unknown
295038 High Off-Site Release Rate

Group II Emergency and Abnormal Plant Evolutions - 13%
295001 Partial or Complete Loss of Forced Core Flow Circulation
295002 Loss of Main Condenser Vacuum
295004 Partial or Complete Loss of DC Power
295005 Main Turbine Generator Trip
295008 High Reactor Water Level
295011 High Containment Temperature (Mark III Containment Only)
295012 High Drywell Temperature
295018 Partial or Complete Loss of Component Cooling Water
295019 Partial or Complete Loss of Instrument Air
295020 Inadvertent Containment Isolation
295021 Loss of Shutdown Cooling
295022 Loss of CRD Pumps
295028 High Drywell Temperature
295029 High Suppression Pool Water Level
295032 High Secondary Containment Area Temperature
295033 High Secondary Containment Area Radiation Levels
295034 Secondary Containment Ventilation High Radiation
295035 Secondary Containment High Differential Pressure
295036 Secondary Containment High Sump/Area Water Level

Knowledge and Abilities Record Form
COMPONENTS
BWR - Senior Reactor Operator - 10%

Plant-Specific Priorities:    Comp. #    K/A #    K/A Topic    Rating
(Record selected K/As below: Comp. #, K/A #, K/A Topic, Rating.)

Group I Components - 10%
291001 Valves
291002 Sensors/Detectors
291003 Controllers and Positioners
291004 Pumps
291005 Motors and Generators
291006 Heat Exchangers and Condensers
291007 Demineralizers and Ion Exchangers
291008 Breakers, Relays and Disconnects

Knowledge and Abilities Record Form
FUNDAMENTAL THEORY
BWR - Senior Reactor Operator - 14%

Plant-Specific Priorities:    Theory #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: Theory #, K/A #, K/A Topic, Rating.)

Group I Reactor Theory - 2%
292004 Reactivity Coefficients
292008 Reactor Operational Physics

Group II Reactor Theory - 5%
292001 Neutrons
292002 Neutron Life Cycle
292003 Reactor Kinetics and Neutron Sources
292005 Control Rods
292006 Fission Product Poisons
292007 Fuel Depletion and Burnable Poisons

Group I Thermodynamics - 3%
293007 Heat Transfer and Heat Exchangers
293008 Thermal Hydraulics
293009 Core Thermal Limits

Group II Thermodynamics - 2%
293003 Steam
293004 Thermodynamic Processes
293010 Brittle Fracture and Vessel Thermal Stress

Group III Thermodynamics - 2%
293001 Thermodynamic Units and Properties
293002 Basic Energy Concepts
293005 Thermodynamic Cycles
293006 Fluid Statics

EXAMINATION OUTLINE MODEL: PWR - REACTOR OPERATOR

[Figure: tiered bar chart of examination content weights. Plant-Wide Generics 10%; Plant Systems 38% (categories K1-K6, A1-A4, system-wide generic); Emergency/Abnormal Plant Evolutions 27% (categories K1-K6, A1-A4, generic); Components 11% (categories K1-K3, A1-A2, generic); Theory 14% (Reactor Theory and Thermodynamics).]

Knowledge and Abilities Record Form
PLANT-WIDE GENERIC RESPONSIBILITIES (194001)
PWR - Reactor Operator - 10%

Check if included    K/A #    Statement    Rating

K1.01  Knowledge of how to conduct and verify valve lineups.  3.6
K1.02  Knowledge of tagging and clearance procedures.  3.7
K1.03  Knowledge of 10 CFR 20 and related facility radiation control requirements.  2.8
K1.04  Knowledge of facility ALARA program.  3.3
K1.05  Knowledge of facility requirements for controlling access to vital/control areas.  3.1
K1.06  Knowledge of safety procedures related to rotating equipment.  3.4
K1.07  Knowledge of safety procedures related to electrical equipment.  3.6*
K1.08  Knowledge of safety procedures related to high temperature.  3.5*
K1.09  Knowledge of safety procedures related to high pressure.  3.4
K1.10  Knowledge of safety procedures related to caustic solutions.  3.0*
K1.11  Knowledge of safety procedures related to chlorine.  3.4*
K1.12  Knowledge of safety procedures related to noise.  2.6*
K1.13  Knowledge of safety procedures related to oxygen-deficient environment.  3.3*
K1.14  Knowledge of safety procedures related to confined spaces.  3.3
K1.15  Knowledge of safety procedures related to hydrogen.  3.4
K1.16  Knowledge of facility protection requirements, including fire brigade and portable fire-fighting equipment usage.  3.5
K1.17  Knowledge of the equipment rotation schedules and the reasoning behind the rotation procedure.  2.1

A1.01  Ability to obtain and verify control procedure copy.  3.3
A1.02  Ability to execute procedural steps.  4.1*
A1.03  Ability to locate and use procedures and station directives related to shift staffing and activities.  2.5
A1.04  Ability to operate the plant phone, paging system, and two-way radio.  3.0
A1.05  Ability to make accurate, clear, and concise verbal reports.  3.6
A1.06  Ability to maintain accurate, clear and concise logs, records, status boards and reports.  3.4
A1.07  Ability to obtain and interpret station electrical and mechanical drawings.  2.5
A1.08  Ability to obtain and interpret station reference material such as graphs, monographs, and tables which contain system performance data.  2.6
A1.09  Ability to coordinate personnel activities inside the control room.  2.7
A1.10  Ability to coordinate personnel activities outside the control room.  2.9
A1.11  Ability to direct personnel activities inside the control room.  2.8
A1.12  Ability to direct personnel activities outside the control room.  3.1
A1.13  Ability to locate control room switches, controls, and indications, and to determine that they are correctly reflecting the desired plant lineup.  4.3*
A1.14  Ability to maintain primary and secondary plant chemistry within allowable limits.  2.5*
A1.15  Ability to use plant computer to obtain and evaluate parametric information on system and component status.  3.1
A1.16  Ability to take actions called for in the Facility Emergency Plan, including (if required) supporting or acting as the Emergency Coordinator.  3.1

Knowledge and Abilities Record Form
PLANT SYSTEMS
PWR - Reactor Operator - 38%

Plant-Specific Priorities:    System #    K/A #    K/A Topic    Rating
(Record selected K/As below each group: System #, K/A #, K/A Topic, Rating.)

Group I Plant Systems - 17%
001 Control Rod Drive System
003 Reactor Coolant Pump System
004 Chemical and Volume Control System
013 Engineered Safety Features Actuation System
015 Nuclear Instrumentation System
017 In-Core Temperature Monitor System
022 Containment Cooling System
025 Ice Condenser System
056 Condensate System
059 Main Feedwater System
061 Auxiliary/Emergency Feedwater System
068 Liquid Radwaste System
071 Waste Gas Disposal System
072 Area Radiation Monitoring System

Group II Plant Systems - 15%
002 Reactor Coolant System
006 Emergency Core Cooling System
010 Pressurizer Pressure Control System
011 Pressurizer Level Control System
012 Reactor Protection System
014 Rod Position Indication System
016 Non-Nuclear Instrumentation System
026 Containment Spray System
029 Containment Purge System
033 Spent Fuel Pool Cooling System
035 Steam Generator System
039 Main and Reheat Steam System
055 Condenser Air Removal System
062 AC Electrical Distribution System
063 DC Electrical Distribution System
064 Emergency Diesel Generator System
073 Process Radiation Monitoring System
075 Circulating Water System
079 Station Air System
086 Fire Protection System

Group III Plant Systems - 6%
005 Residual Heat Removal System
007 Pressurizer Relief Tank/Quench Tank System
008 Component Cooling Water System
027 Containment Iodine Removal System
034 Fuel Handling Equipment System
041 Steam Dump System
045 Main Turbine Generator System
076 Service Water System
078 Instrument Air System
103 Containment System

Knowledge and Abilities Record Form
EMERGENCY PLANT EVOLUTIONS
PWR - Reactor Operator - 27%

Plant Specific Priorities

Group I Emergency and Abnormal Plant Evolutions - 12%

000005 Inoperable/Stuck Control Rod
000015 RCP Motor Malfunction
000024 Emergency Boration
000026 Loss of Component Cooling Water
000027 Pressurizer Pressure Control System Malfunction
000051 Loss of Condenser Vacuum
000055 Loss of Offsite and Onsite Power
000057 Loss of Vital AC Electrical Instrument Bus
000067 Plant Fire On Site
000068 Control Room Evacuation
000069 Loss of Containment Integrity
000074 Inadequate Core Cooling
000076 High Reactor Coolant Activity

E/A #    K/A #    K/A Topic    Rating

2-53 Examiners' Handbook PWR: RO 5 of 10

Emergency Plant Evolutions (Continued)

Group II Emergency and Abnormal Plant Evolutions - 13%

000001 Continuous Rod Withdrawal
000003 Dropped Control Rod
000007 Reactor Trip
000008 Pressurizer Vapor Space Accident
000009 Small Break LOCA
000011 Large Break LOCA
000022 Loss of Reactor Coolant Makeup
000025 Loss of Residual Heat Removal System
000029 Anticipated Transient Without Scram
000032 Loss of Source-Range Nuclear Instrumentation
000033 Loss of Intermediate-Range Instrumentation
000037 Steam Generator Tube Leak
000038 Steam Generator Tube Rupture
000054 Loss of Main Feedwater
000058 Loss of DC Power
000059 Accidental Liquid Radioactive Waste Release
000060 Accidental Gaseous-Waste Release
000061 Area Radiation Monitoring System Alarms

E/A #    K/A #    K/A Topic    Rating

Group III Emergency and Abnormal Plant Evolutions - 2%

000028 Pressurizer Level Malfunction
000036 Fuel Handling Incident
000056 Loss of Offsite Power
000065 Loss of Instrument Air

E/A #    K/A #    K/A Topic    Rating

2-54 Examiners' Handbook PWR: RO 6 of 10

Knowledge and Abilities Record Form
COMPONENTS
PWR - Reactor Operator - 11%

Plant Specific Priorities

Group I Components - 9%

191001 Valves
191002 Sensors & Detectors
191003 Controllers & Positioners
191004 Pumps
191006 Heat Exchangers & Condensers
191008 Breakers, Relays, Disconnects

Comp. #    K/A #    K/A Topic    Rating

2-55 Examiners' Handbook PWR: RO 7 of 10

COMPONENTS (Continued)

Group II Components - 2%

191005 Motors and Generators
191007 Demineralizers & Ion Exchangers

Comp. #    K/A #    K/A Topic    Rating

2-56 Examiners' Handbook PWR: RO 8 of 10

Knowledge and Abilities Record Form
FUNDAMENTAL THEORY
PWR - Reactor Operator - 14%

Plant Specific Priorities

Group I Reactor Theory - 4%

192004 Reactivity Coefficients
192005 Control Rods (Full and/or Part Length)
192008 Reactor Operational Physics

Theory #    K/A #    K/A Topic    Rating

Group II Reactor Theory - 2%

192003 Reactor Kinetics & Neutron Sources
192006 Fission Product Poisons

Theory #    K/A #    K/A Topic    Rating

Group III Reactor Theory - 1%

192001 Neutrons
192002 Neutron Life Cycle
192007 Fuel Depletion & Burnable Poisons

Theory #    K/A #    K/A Topic    Rating

2-57 Examiners' Handbook PWR: RO 9 of 10

Fundamental Theory (Continued)

Group I Thermodynamics - 2%

193009 Core Thermal Limits
193010 Brittle Fracture & Vessel Thermal Stress

Thermo #    K/A #    K/A Topic    Rating

Group II Thermodynamics - 3%

193003 Steam
193007 Heat Transfer
193008 Thermal Hydraulics

Thermo #    K/A #    K/A Topic    Rating

Group III Thermodynamics - 2%

193001 Thermodynamic Units & Properties
193004 Thermodynamic Processes
193005 Thermodynamic Cycles
193006 Fluid Statics and Dynamics

Thermo #    K/A #    K/A Topic    Rating

2-58 Examiners' Handbook PWR: RO 10 of 10

[Figure: Examination Outline Model, PWR - Senior Reactor Operator. Bar charts show the distribution of K/A categories (K1-K6, A1-A4, SG) within each content area: Plant-Wide Generics 13%, Plant Systems 30%, Emergency Plant Evolutions 33%, Components 10%, and Fundamental Theory 14% (Reactor Theory 7%, Thermodynamics 7%).]

Examiners' Handbook 2-59
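The category weights in the examination outline model above can be cross-checked mechanically: the five content-area percentages must sum to 100. The following sketch is purely illustrative and is not part of the Handbook's procedure; the dictionary name and function are editorial inventions, while the percentages are taken from the outline.

```python
# Illustrative consistency check for an examination outline's category
# weights. The percentages below are the PWR Senior Reactor Operator
# values from the outline model; the code itself is an editorial sketch.

SRO_OUTLINE = {
    "Plant-Wide Generic Responsibilities": 13,
    "Plant Systems": 30,
    "Emergency Plant Evolutions": 33,
    "Components": 10,
    "Fundamental Theory": 14,
}

def check_outline(outline):
    """Return the total weight; a valid outline sums to 100 percent."""
    total = sum(outline.values())
    if total != 100:
        raise ValueError(f"outline weights sum to {total}%, expected 100%")
    return total

print(check_outline(SRO_OUTLINE))  # -> 100
```

The same check applies to the group percentages within each content area (for example, the SRO reactor theory groups of 1%, 5%, and 1% plus the thermodynamics groups of 3%, 2%, and 2% account for the 14% Fundamental Theory allocation).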

Knowledge and Abilities Record Form
PLANT-WIDE GENERIC RESPONSIBILITIES
PWR - Senior Reactor Operator - 13%

Check if 194001 included

K/A #    Statement    Rating

K1.01 Knowledge of how to conduct and verify valve lineups. 3.7
K1.02 Knowledge of tagging and clearance procedures. 4.1
K1.03 Knowledge of 10 CFR 20 and related facility radiation control requirements. 3.4
K1.04 Knowledge of facility ALARA program. 3.5
K1.05 Knowledge of facility requirements for controlling access to vital/control areas. 3.4*
K1.06 Knowledge of safety procedures related to rotating equipment. 3.4*
K1.07 Knowledge of safety procedures related to electrical equipment. 3.7*
K1.08 Knowledge of safety procedures related to high temperature. 3.4
K1.09 Knowledge of safety procedures related to high pressure. 3.4
K1.10 Knowledge of safety procedures related to caustic solutions. 3.3
K1.11 Knowledge of safety procedures related to chlorine. 3.5*
K1.12 Knowledge of safety procedures related to noise. 2.9
K1.13 Knowledge of safety procedures related to oxygen-deficient environment. 3.6
K1.14 Knowledge of safety procedures related to confined spaces. 3.6
K1.15 Knowledge of safety procedures related to hydrogen. 3.8*
K1.16 Knowledge of facility fire protection requirements, including fire brigade and portable fire-fighting equipment usage. 4.2*
K1.17 Knowledge of the equipment rotation schedules and the reasoning behind the rotation procedure. 2.5

2-60 Examiners' Handbook PWR: SRO 1 of 10

Knowledge and Abilities Record Form
PLANT-WIDE GENERIC RESPONSIBILITIES
PWR - Senior Reactor Operator (Continued)

Check if 194001 included

K/A #    Statement    Rating

A1.01 Ability to obtain and verify control procedure copy. 3.4
A1.02 Ability to execute procedural steps. 3.9
A1.03 Ability to locate and use procedures and station directives related to shift staffing and activities. 3.4
A1.04 Ability to operate the plant phone, paging system, and two-way radio. 3.2
A1.05 Ability to make accurate, clear, and concise verbal reports. 3.8
A1.06 Ability to maintain accurate, clear, and concise logs, records, status boards, and reports. 3.4
A1.07 Ability to obtain and interpret station electrical and mechanical drawings. 3.2
A1.08 Ability to obtain and interpret station reference material such as graphs, monographs, and tables which contain system performance data. 3.1
A1.09 Ability to coordinate personnel activities inside the control room. 3.9*
A1.10 Ability to coordinate personnel activities outside the control room. 3.9*
A1.11 Ability to direct personnel activities inside the control room. 4.1*
A1.12 Ability to direct personnel activities outside the control room. 4.1*
A1.13 Ability to locate control room switches, controls, and indications, and to determine that they are correctly reflecting the desired plant lineup. 4.1
A1.14 Ability to maintain primary and secondary plant chemistry within allowable limits. 2.9
A1.15 Ability to use plant computer to obtain and evaluate parametric information on system and component status. 3.4
A1.16 Ability to take actions called for in the Facility Emergency Plan, including (if required) supporting or acting as the Emergency Coordinator. 4.4*

2-61 Examiners' Handbook PWR: SRO 2 of 10

Knowledge and Abilities Record Form
PLANT SYSTEMS
PWR - Senior Reactor Operator - 30%

Plant Specific Priorities

Group I Plant Systems - 14%

001 Control Rod Drive System
003 Reactor Coolant Pump System
004 Chemical and Volume Control System
013 Engineered Safety Features Actuation System
014 Rod Position Indication System
015 Nuclear Instrumentation System
017 In-Core Temperature Monitor System
022 Containment Cooling System
025 Ice Condenser System
026 Containment Spray System
056 Condensate System
059 Main Feedwater System
061 Auxiliary/Emergency Feedwater System
063 DC Electrical Distribution System
072 Area Radiation Monitoring System

System #    K/A #    K/A Topic    Rating

2-62 Examiners' Handbook PWR: SRO 3 of 10

Plant Systems (Continued)

Group II Plant Systems - 13%

002 Reactor Coolant System
006 Emergency Core Cooling System
010 Pressurizer Pressure Control System
011 Pressurizer Level Control System
012 Reactor Protection System
016 Non-Nuclear Instrumentation System
027 Containment Iodine Removal System
028 Hydrogen Recombiner and Purge Control System
029 Containment Purge System
033 Spent Fuel Pool Cooling System
034 Fuel Handling Equipment System
039 Main and Reheat Steam System
055 Condenser Air Removal System
062 AC Electrical Distribution System
064 Emergency Diesel Generator System
073 Process Radiation Monitoring System
075 Circulating Water System
079 Station Air System
103 Containment System

System #    K/A #    K/A Topic    Rating

Group III Plant Systems - 3%

005 Residual Heat Removal System
007 Pressurizer Relief Tank/Quench Tank System
008 Component Cooling Water System
045 Main Turbine Generator System
076 Service Water System
078 Instrument Air System

System #    K/A #    K/A Topic    Rating

2-63 Examiners' Handbook PWR: SRO 4 of 10

Knowledge and Abilities Record Form
EMERGENCY PLANT EVOLUTIONS
PWR - Senior Reactor Operator - 33%

Plant Specific Priorities

Group I Emergency and Abnormal Plant Evolutions - 19%

000001 Continuous Rod Withdrawal
000003 Dropped Control Rod
000005 Inoperable/Stuck Control Rod
000011 Large Break LOCA
000015 RCP Motor Malfunction
000024 Emergency Boration
000026 Loss of Component Cooling Water
000029 Anticipated Transient Without Scram
000040 Steam Line Rupture
000051 Loss of Condenser Vacuum
000055 Loss of Offsite and Onsite Power
000057 Loss of Vital AC Electrical Instrument Bus
000059 Accidental Liquid Radioactive Waste Release
000067 Plant Fire On Site
000068 Control Room Evacuation
000069 Loss of Containment Integrity
000074 Inadequate Core Cooling
000076 High Reactor Coolant Activity

E/A #    K/A #    K/A Topic    Rating

2-64 Examiners' Handbook PWR: SRO 5 of 10

Emergency Plant Evolutions (Continued)

Group II Emergency and Abnormal Plant Evolutions - 12%

000007 Reactor Trip
000008 Pressurizer Vapor Space Accident
000009 Small Break LOCA
000022 Loss of Reactor Coolant Makeup
000025 Loss of Residual Heat Removal System
000027 Pressurizer Pressure Control System Malfunction
000032 Loss of Source-Range Nuclear Instrumentation
000033 Loss of Intermediate-Range Instrumentation
000037 Steam Generator Tube Leak
000038 Steam Generator Tube Rupture
000054 Loss of Main Feedwater
000058 Loss of DC Power
000060 Accidental Gaseous-Waste Release
000061 Area Radiation Monitoring System Alarms
000065 Loss of Instrument Air

E/A #    K/A #    K/A Topic    Rating

Group III Emergency and Abnormal Plant Evolutions - 2%

000028 Pressurizer Level Malfunction
000036 Fuel Handling Incident
000056 Loss of Offsite Power

E/A #    K/A #    K/A Topic    Rating

2-65 Examiners' Handbook PWR: SRO 6 of 10

Knowledge and Abilities Record Form
COMPONENTS
PWR - Senior Reactor Operator - 10%

Plant Specific Priorities

Group I Components - 6%

191001 Valves
191002 Sensors & Detectors
191004 Pumps
191008 Breakers, Relays, Disconnects

Comp. #    K/A #    K/A Topic    Rating

2-66 Examiners' Handbook PWR: SRO 7 of 10

COMPONENTS (Continued)

Group II Components - 4%

191003 Controllers & Positioners
191005 Motors and Generators
191006 Heat Exchangers & Condensers
191007 Demineralizers & Ion Exchangers

Comp. #    K/A #    K/A Topic    Rating

2-67 Examiners' Handbook PWR: SRO 8 of 10

Knowledge and Abilities Record Form
FUNDAMENTAL THEORY
PWR - Senior Reactor Operator - 14%

Plant Specific Priorities

Group I Reactor Theory - 1%

192008 Reactor Operational Physics

Theory #    K/A #    K/A Topic    Rating

Group II Reactor Theory - 5%

192003 Reactor Kinetics & Neutron Sources
192004 Reactivity Coefficients
192005 Control Rods (Full and/or Part Length)
192006 Fission Product Poisons
192007 Fuel Depletion & Burnable Poisons

Theory #    K/A #    K/A Topic    Rating

Group III Reactor Theory - 1%

192001 Neutrons
192002 Neutron Life Cycle

Theory #    K/A #    K/A Topic    Rating

2-68 Examiners' Handbook PWR: SRO 9 of 10

Fundamental Theory (Continued)

Group I Thermodynamics - 3%

193003 Steam
193009 Core Thermal Limits
193010 Brittle Fracture & Vessel Thermal Stress

Thermo #    K/A #    K/A Topic    Rating

Group II Thermodynamics - 2%

193007 Heat Transfer
193008 Thermal Hydraulics

Thermo #    K/A #    K/A Topic    Rating

Group III Thermodynamics - 2%

193001 Thermodynamic Units & Properties
193004 Thermodynamic Processes
193005 Thermodynamic Cycles
193006 Fluid Statics and Dynamics

Thermo #    K/A #    K/A Topic    Rating

2-69 Examiners' Handbook PWR: SRO 10 of 10

3 DEVELOPING TESTING OBJECTIVES

The completed test outline or blueprint provides the examiner with a list of K/As upon which to base the test questions. Please remember, however, that many of the K/As in the catalogs, and therefore in your test blueprint, are stated in general terms. Accordingly, there is some room for interpretation in translating the K/A statement into a test item. Consideration of a testing objective should be helpful in clarifying the relationship or link between the K/A as stated and the actual test item. The testing objective will help you focus the item so that it measures the specific aspect of the K/A statement that you desire to test.

For the purposes of discussion, two K/As have been selected from the PWR Catalog and two K/As have been selected from the BWR Catalog. Consider K5.15 selected from the Control Rod Drive System (CRDS) and K6.07 taken from the Chemical and Volume Control System (CVCS) in the PWR Catalog. These K/As are as follows:

CRDS-K5.15 Knowledge of the following theoretical concepts as they apply to the CRDS: Relationship between RCS and MTC CVCS-K6.07 Knowledge of the applicable performance and design attributes of the following CVCS components:

Reason for venting VCT and pump castings while filling Alternately, examine K1.08 selected from the Reactor Protection System (RPS) and A2.16 taken from the Radiation Monitoring System 1

(RMS) in the BWR Catalog. These K/As are as follows-i RPS-K1.08 Knowledge of the physical connections and/or cause-effect relationships between RPS and the following: Control rod and drive mechanism RMS-A2.16 Ability to (a) predict the impacts of the following on the RMS; and (b) based on those predictions, use procedures to correct, control, or mitigate the consequences of those abnormal conditions and operations: Instrument malfunctions Although these K/As are clearly stated with respect to their content, they do not provide clear direction regarding how candidates should demonstrate the appropriate knowledge and ability.

Properly designed testing objectives will assist the examiner in determining more precisely the aspects of the knowledge or ability to be measured by the test item. The testing objective will also provide a standard against which the examiner and other reviewers can judge the test items to ensure that they are eliciting the specific knowledge and ability intended by the K/A.


Examiners' Handbook 3-1

Please remember: documentation of the actual testing objective is not required. However, it is suggested that testing objectives be informally developed, or alternately considered by the examiner using the guidelines provided below, before and during the attempt to translate a K/A into a test item.

3.1 Elements of a Testing Objective

Each testing objective should give consideration to the following elements:

(1) The Performance: A statement of what the candidate should do or exhibit to demonstrate mastery of the objective.

(2) The Conditions: This refers to what the candidate will be given or provided with when asked to demonstrate the performance. Some examples of the conditions element of the testing objective include copies of procedures, graphs, and steam tables.

(3) Standards of Performance: This element refers to the level or degree to which the candidate should perform in order to demonstrate mastery of the objective. Examples of this element of the testing objective would include reference to time limits and the completeness or comprehensiveness of the response that would constitute the correct answer.

It is important that every testing objective include each of the above three elements. However, it is not necessary to state obvious conditions (such as "given paper and pencil"), nor is it necessary in many cases to state the standards of performance (such as "answer the question correctly"). However, every testing objective should include a specific statement of the performance; that is, what the candidate must do to demonstrate mastery of the testing objective. The more precise the examiner can be in identifying the performance, the greater assurance there will be that the resulting test question provides the best measure of the selected K/A.
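The three elements above can be thought of as fields of a simple record, with the performance statement as the one field that may never be left blank. The following sketch is an editorial illustration only (the Handbook does not require documenting objectives, and the class and method names are invented for this example):

```python
# Illustrative record for a testing objective's three elements.
# Conditions and standards may be omitted when obvious; a specific
# performance statement is the one non-negotiable element.

from dataclasses import dataclass

@dataclass
class TestingObjective:
    performance: str       # what the candidate must do (required)
    conditions: str = ""   # what the candidate is given (optional if obvious)
    standards: str = ""    # level or degree of performance (optional if obvious)

    def is_usable(self) -> bool:
        # Usable only if the performance statement is actually specified.
        return bool(self.performance.strip())

obj = TestingObjective(
    performance="calculate the change in steam enthalpy",
    conditions="given a steam table",
)
print(obj.is_usable())  # -> True
```

An objective with an empty or whitespace-only performance field would be rejected by this check, mirroring the rule that the performance statement must always be explicit.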

3.2 Sources of Testing Objectives

Plant-specific learning objectives are an important source of potential testing objectives. As part of its accreditation effort, each facility is required to develop learning objectives based on the conduct of a job analysis. The INPO accreditation manual states that each facility's training program will include "learning objectives stating the action(s) the trainee must demonstrate, the conditions under which the actions will take place, and the standards of performance the trainee should achieve upon completion of the training activity."

In addition to being resource efficient, use of the facility learning objectives can help assure that your testing objectives are appropriate for the specific job requirements at that site.

Review the relevant facility learning objectives, and any other supporting instructional materials, such as lesson plans, at the same time as you review other plant reference material for use in examination development and construction.

Examiners' Handbook 3-2

PLEASE NOTE: Before adopting any facility learning objective as a testing objective, be sure that it meets the standards discussed throughout this Handbook. Facilities are presently in various stages of completing their own job-task analyses, and therefore their learning objectives may or may not be based on a valid set of job requirements. Furthermore, some learning objectives may be relevant for training, but not for licensing examinations. In attempting to make meaningful distinctions here, it will be useful for you to think of facility learning objectives as covering all of the things that an RO or SRO should know. In a licensing examination, however, only those things that are important in terms of the safe operation of the plant and ensuring personnel and public health and safety should be included. Accordingly, the things tested in a licensing examination represent a subset of what is emphasized in training. For example, from the perspective of the utility, it is very useful for the trainee to know all of the things that will make the plant run more efficiently or cost-effectively. However, this type of knowledge will typically be excluded from consideration on the licensing examination because it does not focus squarely on what must be known to protect the public. It is strongly recommended that examiners follow this overall approach in deciding which learning objectives satisfy the public protection function of licensing.

3.3 Selected Examples of Poor and Good Testing Objectives

This section provides several examples of testing objectives in order to clarify some of the issues that have been discussed above. PLEASE REMEMBER: These examples are merely representative and illustrative of some of the appropriate and inappropriate ways of stating testing objectives. This section is not intended as a definitive treatment of all of the ways of handling testing objectives.

Examples of vague testing objectives are as follows:

--The candidate will know the major parts of the (     ).

--The candidate will be aware of the reasons for (     ).

--The candidate will show an appreciation for the importance of (     ).

--The candidate will understand why (     ) is so important.

In the first example, the term know is open to interpretation. How will the candidate show he or she knows? By identifying them on a diagram? By recalling parts from memory? By explaining how each part operates? This issue must be clarified in order to improve the testing objective.

Examiners' Handbook 3-3

In the second example, the term aware is also vague. Again, how will the candidate demonstrate that he or she is aware of the reasons? By picking the correct reason or reasons from a list? By explaining the reason or reasons upon request from the examiner? A decision must be made by the examiner as to how the knowledge is to be made operational.

In the third example, the word appreciate is equally unclear. What a candidate must do to exhibit appreciation is open to interpretation by the examiner. Since the testing objective could be interpreted differently by examiners, the statement might lead to different types of examination items from examination to examination, thereby contributing to the unreliability of examinations from administration to administration. Again, greater specificity is required.

Finally, the term understand in the fourth example needs to be clarified. What must the candidate do to demonstrate understanding? What criteria will be applied by the examiner? Should the candidate merely explain why it is important? Or alternately, should the candidate distinguish among correct and incorrect reasons from a listing of those provided? The examiner will need to consider the most appropriate way of providing this additional clarification.

The following examples illustrate testing objectives that are more precise in description:

--The candidate will differentiate between three types of (     ).

--Without the aid of references, the candidate will define all of the terms found in (     ).

--Given a steam table, the candidate will calculate (     ).

--The candidate will read and correctly interpret (     ) within 100 degrees Centigrade.

These objectives provide a much clearer indication of what the candidate will be expected to do to demonstrate mastery of the objective as an indication of competence in the underlying K/A. Equally important, possible questions that could be constructed to test for these objectives are more apparent than for the earlier examples. It is clear that most of the above testing objectives could be translated into written examination items. Examiners are encouraged to follow the guidelines on Selecting K/As for the Written Exam presented in Section 2.2.2 on Specific Steps for Developing a Test Outline or Blueprint.

Examiners' Handbook 3-4

PLEASE REMEMBER: The stem categories in the catalogs do not relate to a particular mode of testing. However, the use of the testing objectives is intended to assist in clarifying which testing mode is most appropriate.

3.4 Testing Objectives and Level of Knowledge

In addition to making sure that the testing objectives are relevant, important to job competence, and measurable, the examiner should also pay close attention to the level of knowledge implied in the selection of the testing objective, and therefore in the actual test question. Level of knowledge refers to the degree and/or type of thinking process that is required of the candidate to meet the objective or answer the question. For ease of description and use, three major levels of knowledge are defined:

--Memory

--Comprehension

--Application/Analysis/Problem-Solving

Memory, the basic or fundamental level of knowledge, involves recalling, recognizing, or remembering information such as facts, procedures, rules, principles, or definitions.

Questions at the memory level are relatively easy to write. The important decision is determining whether the knowledge is worth remembering in terms of its public protection value. Although some facts or other pieces of information are important to know in their own right, memory-level knowledge often serves only as a means to an end, rather than as an end in itself. That is, candidates not only need to know information at the memory level but, more importantly, they need to be able to use that information in fulfilling an important job task or responsibility related to assuring the safe operation of the plant. Questions posed at the memory level do not necessarily test for the complete understanding of the underlying concepts or issues in comparison to questions at the two other levels of knowledge.

The comprehension level is often considered to represent the lowest level of true understanding. It requires the candidate to interpret, translate, summarize, and/or discuss the implications or logical extensions, or otherwise demonstrate that he or she understands the concept beyond simply recalling a fact or principle.

Questions that require candidates to contrast, differentiate, apply, or illustrate are generally at the comprehension level.

Testing at the comprehension level means, among other things, that the candidate will be required to demonstrate understanding in ways other than simply repeating the information in the same way it is listed in the training and/or reference materials.

Examiners' Handbook 3-5

The application/analysis/problem-solving level involves the ability to determine or identify the appropriate fact, rule, or principle, and correctly apply it to a novel or "what-if" situation or problem to arrive at a solution or course of action. Objectives and questions that require the candidate to estimate, analyze, calculate, or evaluate are often at the application level. In many respects, objectives and questions at this level are the most consistent with the purpose of the licensing examination, which is to make sure that candidates who pass can apply their knowledge and ability to assure the safe operation of the plant.

To illustrate the three levels of knowledge, consider the following testing objectives developed for a K/A appearing in both the PWR and BWR Catalogs.

(1) The candidate will state the automatic turbine trip signals.

(2) The candidate will state the actions which occur to close the control and stop valves on a turbine trip signal.

(3) The candidate will analyze the effect of a turbine trip signal on the EHC Load Control Logic Circuitry.

The first objective is at the memory level. It simply requires the candidate to list a response to a condition. The candidate's response to this question will not give the examiner an indication as to whether the candidate can use his or her knowledge to assure the safe operation of the plant.

The second objective is closer to the comprehension level. It requires the candidate to demonstrate his or her knowledge of turbine trip signals as they affect turbine operation. Yet it still will not provide an indication of whether the candidate knows how to analyze the effects of a turbine trip on plant operations.

The third objective is at a more operationally oriented, application level. The candidate must demonstrate his or her knowledge of turbine trips by showing that he or she understands the implications of a turbine trip on specific subsystems such as turbine control circuitry.
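As a rough first pass, the action verb of a draft objective suggests its level of knowledge. The sketch below is an editorial illustration whose verb lists follow the examples in this section (state at the memory level; contrast, differentiate, apply, illustrate at the comprehension level; estimate, analyze, calculate, evaluate at the application level); classifying a real objective still requires examiner judgment:

```python
# Illustrative mapping from an objective's action verb to the three
# levels of knowledge defined in this section. Verb lists follow the
# section's own examples; coverage is deliberately incomplete.

LEVELS = {
    "memory": {"state", "list", "recall", "define", "recognize"},
    "comprehension": {"contrast", "differentiate", "apply", "illustrate", "interpret"},
    "application": {"estimate", "analyze", "calculate", "evaluate"},
}

def level_of(verb):
    """Return the level of knowledge suggested by an objective's action verb."""
    for level, verbs in LEVELS.items():
        if verb.lower() in verbs:
            return level
    return "unclassified"

print(level_of("state"))    # -> memory
print(level_of("analyze"))  # -> application
```

Applied to the turbine trip examples, objectives (1) and (2) ("state") map to the memory level while objective (3) ("analyze") maps to the application level, matching the discussion above.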

PLEASE REMEMBER: Your testing objectives should be developed to reflect the highest level of knowledge appropriate for the K/As that have been selected for inclusion in the test outline. However, this does not mean that all objectives must be at the application level. Knowing the meaning of various terms, or being able to list immediate action steps, might be at the memory level, but such objectives are nonetheless appropriate for inclusion on a licensing examination.

Examiners' Handbook 3-6

The important point is that each level of knowledge makes a different demand on the candidate and provides a somewhat different perspective on the competence of the candidate. Each examiner must consider this issue when considering testing objectives and developing test items, to make sure that knowledge is not tested for knowledge's sake, nor an important K/A tested in such a way that it is far removed from the candidate's use of the K/A in the actual performance of the job. In this regard, it is critical for the examiner to note whether the selected K/A has a dagger (+) to the left of the statement in the BWR Catalog. The daggers indicate that the level of knowledge or ability required of an RO is different from the level required of an SRO. In such cases, different testing objectives and different test items should be developed from the same K/A.

3.5 Summary Guidelines

All examiners are encouraged to use the following guidelines in developing and judging the effectiveness of testing objectives:

(1) Base the testing objective on K/As from the Catalogs in order to show a clear link between mastery of the objectives and safe and competent performance of the job.

(2) Make sure the testing objective is precise enough that any test items written from it will measure the same competence.

(3) To keep the testing objective precise, avoid the use of vague terms such as "demonstrate knowledge of", "be familiar with", "know", or other terms that do not indicate how the objective will be demonstrated or measured.

(4) Base the testing objective on performance that is intended to differentiate between competent and less competent performance. The examiner should ask the following questions:

--Does being able to meet the objective relate to being a safe RO/SRO?

--Does not meeting the objective provide an indication that the candidate is less than competent?

(5) When the selected K/As have been noted with a dagger (+), examiners should give strong consideration to the development of different testing objectives and different test items at the RO and SRO levels.

Examiners' Handbook 3-7

4 WRITTEN EXAMINATION ITEMS: CONSTRUCTION AND SCORING

Test questions consist of two components: the content (what is asked) and the style or format (the way it is asked). The K/As in the catalog are intended to ensure that the content of your questions is valid and appropriate. However, the quality of the question depends as much on the way the item is constructed as on the content. Important topics that are tested by ambiguous, awkward, or poorly specified questions cannot be considered valid. Therefore, the selection of valid topics and careful construction of test items are equally important parts of a valid test development process.

The conversion of a K/A or testing objective into a test item is in good part a creative process. Yet there are certain procedures and guidelines that can help in writing your test item and in ensuring that the item will measure the knowledge or ability that it is intended to measure.

4.1 Objective vs. Subjective Test Items The following sections offer guidance on how to select, construct, and score five different types of written test items:

short answer, completion, multiple choice, true-false, and matching.

Traditionally, questions that require the candidate to supply an answer (e.g., short answer, essay) have been considered

" subjective"; questions requiring the candidate to select an answer (e.g., multiple choice, true-false, matching) have been considered " objective." The names arose from the scoring of the items.

If graders require subject matter expertise to interpret the answers of test takers, then the question has been considered subjective; if the examination can be scored on a machine, it has been considered objective.

All of the guidance in this handbook will deal with objective items.

However, the definition of "objective" used here is different from the traditional one described above.

An objective test item is defined here as one in which:

(1) there is only one correct answer; and (2) all qualified graders would agree on the amount of credit allowed for any given candidate's answer.

All written questions on the licensing examination should meet these conditions.

Questions with no single correct answer, or for which the credit given for a candidate's answer varies depending on who graded it or when it was graded, have no place in the written examinations that you develop.

4.2 Source Questions for Test Items

If you are having difficulty translating a K/A or testing objective into a test question, asking yourself the following questions may help you generate ideas for potential test questions:

(1) What are the common misconceptions about ____?

(2) Why is ____ important to satisfactory job performance?

(3) In what sort of circumstances might it be important to understand ____?

(4) What might the individual do who does not understand ____?

(5) What might be the consequences of a lack of knowledge about ____?

(6) How can the individual demonstrate the knowledge?

These questions may not only help in the development of test questions, but they also may be useful to keep in mind when generating testing objectives.

4.3 Development of Written Test Items

Guidance on the construction of each item type is provided in the following sections.

However, there are basic principles that apply across all item formats:

(1) Ensure that the concept being measured has a direct, important relationship to the ability to perform the job.

Although the importance of relevant K/As and testing objectives was stressed in earlier sections, it is equally important that construction of the question itself clearly reflects the importance of the topic.

Word the question so that it has "face validity" as well as underlying content validity.

That is, make sure that the question would be considered reasonable by other subject-matter experts using the same reference materials.

(2) State the question unambiguously and precisely.

State the question as concisely as possible, but provide all necessary information.

Often the individuals who develop a question assume that certain stipulations or conditions are inherent in the question when in fact they are not.

It is very difficult for the person who authored the question to review it impartially or through the eyes of a new reader.

Therefore, it is very important to have others review your questions to ensure that all necessary information is included, and that all extraneous or superfluous information is deleted.

You and others should ask yourselves:

Will the candidates clearly know what they are expected to do?

Do they have all the information they need to work with? Does answering the question depend on certain assumptions that must be stated?

(3) Write the question at the highest level of knowledge reflected in the testing objective.

Examiners' Handbook 4-2

The advantages of higher-level knowledge questions were discussed in Section 3 of the Handbook.

As mentioned there, objectives and questions should be written to reflect the level of knowledge that is most appropriate for a specific K/A; however, try to avoid high percentages of memory-level questions on your exam.

When you have a choice between two levels of knowledge, try to write your question to reflect the higher level.

(4) Make sure that the question matches the testing objective and the intent of the K/A.

It is very easy to end up with a question that tests a relatively trivial aspect of an important K/A topic.

When reviewing your draft question, ask yourself whether it is likely that someone could answer the question correctly and still not meet the objective or intent of the K/A or perform the responsibilities or tasks for which this K/A is needed.

(5) Cull questions that are unnecessarily difficult or irrelevant.

As discussed in items 1 and 2 above, unintended irrelevancies or trivialities can easily end up in well-meant questions.

When reviewing your draft question, ask yourself:

Could someone do the job safely and effectively without being able to answer the question? If so, is it because the content is inappropriate, because the wording is unclear, or because the required level of understanding is too great?

(6) Limit the question to one concept or topic, unless a synthesis of concepts is being tested.

There is a common misconception that testing for multiple K/A topics in one question is a time-efficient way to examine.

Questions containing a variety of topics and issues only serve to confuse the candidate about the purpose of the question, and therefore what is expected in terms of a correct response.

Each individual question should be reserved for testing one K/A topic, and that topic, as well as the intent of the question, should be clear to both examiner and candidate.

(7) Avoid copying text directly from training or other reference material.

Another common tendency among exam developers is to copy sentences directly from reference material and turn them into test questions.

Unfortunately, questions written in this way generally encourage rote memorization.

Further, copying from reference material can cause ambiguity or deficiency in questions because the material lifted often draws its meaning (and importance) from its surrounding context.

Therefore, important assumptions or stipulations stated elsewhere in the material are often omitted from the test question.

Finally, these types of questions can frequently be answered correctly by candidates who do not really understand the concept but do remember the specific wording on a page of material.

Conversely, candidates who understand the topic, but not in the exact way it was written in the material, may miss the question because of unstated assumptions or other missing information.

(8) Avoid "backwards logic" questions--those questions that ask for what should be provided in the question, and provide what should be required in the candidate's response.

In addition to testing on valid topics, it is important to examine on those topics in a way consistent with how the K/A should be remembered and used.

Don't test on the topic in a backwards way.

For example, consider the following question:

"If it takes 12.5 cubic feet of concrete to build a square loading pad 6 inches thick, what is the length of one side of the pad?"

This question gives the test takers information they should be asked to calculate, while it requires them to provide information that would be supplied in an actual work situation.

In constructing your questions, make sure that you include information that candidates would typically have or have access to, and require responses that reflect the decisions, or calculations, or other information they would typically have to supply.

(9) Place the easier questions at the beginning of each section.

Starting each exam section with the easier questions helps candidates gain composure and confidence.

However, this is not to say that extremely easy, non-differentiating questions should be included in the exam for the sole sake of relieving candidate tension.


4.4 Selecting the Type of Written Examination Item to Construct

Although each type of written item has its own advantages and disadvantages, it is important to note that the quality of a test question depends more on the quality of the test objective and the consistency between the objective and the test question than on the format of the question.

Good true-false questions can require high levels of knowledge; poor essay questions can encourage rote learning.

The quality of your test questions, regardless of the item type you select, is a function of the importance of the K/As, the appropriateness of the testing objectives, and the clarity and absence of ambiguity in the questions themselves.

Table 4.1 summarizes the best uses, strengths and weaknesses of each of the five item types for written examination items.

Use the table to help you select the particular type of item to construct for a given K/A topic. Keep in mind, however, that many K/As can be tested by several types of items. The table is intended to provide considerations, not specific rules, for deciding what kind of item to write.

4.5 Constructing Supply-Type Items (Short Answer and Completion)

4.5.1 Short-Answer items

Short-answer items require the candidate to compose a response, in contrast to selecting from among a set of alternative responses.

A short-answer item may be presented in such a way as to give the candidate complete freedom to express ideas or it may restrict a response to a given content area or answer format.

Keep the following guidelines in mind when constructing these open-ended items:

(1) Provide clear, explicit directions / guidelines for answering the question so that the candidate understands what constitutes a fully correct response.

Choose words carefully to ensure that the stipulations and requirements of the question are appropriately conveyed.

Words such as "evaluate," "outline," and "explain" can invite a great deal of detail that is not necessarily relevant.

(2) Avoid extra or superfluous wording/information in the question.

In an attempt to make a question operationally oriented and/or meaningful, there can be a tendency to add more information than is required for a fully correct response.

The two examples below illustrate this point:

"The term undermoderated is used to describe one of the major characteristics of a typical PWR core. Discuss the effect that an increase in power would have on the moderator-to-fuel ratio, and what effect that would have on K-eff."

The first sentence of the above question adds nothing but extra reading time.

"Two isotopes, one with a half life of 10 days and another with a half life of 5 days, sit on opposite trays of a set of scales.

Assuming the isotopes have identical atomic weights, and there is 1 Curie of each isotope, which way will the scales tilt (which weighs more) and why?"

The situation in the above question bears little relationship to a situation that an operator would encounter on the job.

(3) Make sure that the answer key response matches (and is limited to) the requirements posed in the question.


"Explain the effect of increasing boron concentration on the MTC." (total point value: 1.5)

Answer key: Increasing the boron concentration in the reactor coolant reduces the magnitude of the negative MTC (.6 points). When the moderator temperature is increased, the coolant density is decreased. When the coolant density is decreased, the slowing-down effect of the water is decreased and the absorption effect of the B-10 is decreased (.5 points). Decreasing the slowing-down effect reduces K-eff (.1 point). Decreasing the B-10 absorption rate increases K-eff (.1 point); and, the larger the boron concentration, the larger the latter effect (.2 points).

In the above example, candidates who fail to discuss how the effect of boron concentration occurs will lose 60% of the point value assigned, yet the question itself did not stipulate that this discussion was required.

"Can a licensed operator who has not been at the facility for a period of six months immediately resume duties in a licensed position?"

Answer key:

No; in any case, the NRC would have to be notified before the operator resumed duties, and then only after certification by an authorized representative of the facility or by a demonstration by the individual that his/her understanding of facility operations and administration is satisfactory.

The above question only asks for a yes or no response, yet the answer key indicates that an explanation of that one-word response is required.

(4) Avoid giving away part or all of the answer by the way the question is worded.

"If the letdown line became obstructed, could boration of the plant be accomplished shortly after a reactor trip to put the plant in cold shutdown? (1 point) If so, how? (1 point)"

Half credit was allotted for answering the first part of the question "yes."

However, a test-wise candidate can realize that the answer has to be yes, or else the second part of the question would have read something like "If so, how? If not, why not?"

(5) Avoid writing what could be considered "trick" questions.

Trick questions can occur unintentionally when the answer key does not precisely match the question (see #3 above).

"How do the SI termination criteria change following an SI reinitiation?"

Answer key: They do not change.

The question above asks for how, not if, the termination criteria change.

Try to make the answer turn out to be whole numbers.

4.5.2 Completion (" fill in the blank") items Completion items require the candidate to supply a word, symbol, or short phrase in response to a question or to complete a statement.

The items are most appropriate for testing formulas, numerical solutions, or the recall of simple facts.

Completion items should be constructed according to the following l

guidelines:

l l

(1) The statement should be complete enough so that (a) there can be no doubt as to its meaning; and (b) there is only one correct answer.

l Poor:

Saturation conditions are being approached in the reactor l

coolant system as indicated by Better:

One indication from the control room that saturation conditions are being approached in the reactor coolant system is (2) Avoid the use of too many blanks.

Use no more than two in j

any question.

l Poor:

A is a device for converting energy into energy.

Better:

A generator is a devico for converting mechanical energy into energy.

(3) If possible, blanks should be kept near the end rather than near the beginning of the sentence.

Poor (must be read twice):

A is used to prevent an overload in an electric light circuit.

Better:

A device to prevent an overload in an electric light circuit is a (4) Avoid giving clues to the answer or making it easier to guess.

For example, do not use "a" or "an" which can indicate whether the answer begins with a consonant or a vowel.

I Examiners' Handbook 4-7 j

1 (5) Except in rare cases, verbs should not be omitted.

(6) Only those key words which the candidate should know should be omitted.

Do not ask for the recall of trivial details.

4.6 Constructing Select-Type Items (Multiple-Choice, True-False, Matching)

Constructing good select-type items can take considerably more time than constructing supply-type items.

However, these items have two major qualities which make the investment of time and

-effort in constructing them worthwhile.

First, the scoring of objective items is considerably more reliable and less time consuming than scoring open-ended response items.

Second, since the item requires less time to answer, more items can be used to test K/As.

This will provide better content coverage, which will also increase test reliability 4.6.1 Multiple-Choice items Multiple-choice items are the most common and most popular of the select-type items.

Although multiple-choice items are not as easy to construct as other forms, they are very versatile, can be used to test for all levels and types of knowledge, and minimize the likelihood of the candidate obtaining the correct answer by i

guessing.

Construct multiple-choice items using the following guidelines:

(1) Use four answer options, if possible.

Five answer options may be used if there are four logical distractors and as few as three answer options may be used if there are only two logical distractors.

However, in most cases more than five answer options contributes nothing to the question but confusion; questions with fewer than three options are, by definition, true-false questions.

(2) Do not use "none of the above" or "all of the above."

"All of the above" questions provide inadvertent clues to the candidate.

When the "all of the above" option is the correct response, the candidate need only recognize that two of the options are correct to answer the question correctly.

When the "all of the above" option is used as a distractor, the candidate needs only to be able to determine that one option is incorrect in order to eliminate this option.

"None of the above" responses should not be used with "best answer" multiple-choice questions, since it may always be defensible as a response.

(3) Don't present a collection of true-false statements as a multiple-choice item.

As discussed earlier, each item should be focused on one K/A topic.

A question containing answer options related to many Examiners' Handbook 4-8 i

1 separate issues does not increase the efficiency of the question.

On the contrary, questions with multiple topics only confuse the candidate about the meaning and purpose of the question.

The following statements are based on the Unit 1 procedure "Low Condenser Vacuum," 1-AP-14.

Which one is a true statement?

A. Purpose: This procedure provides indications of probable causes for and immediate long-term actions to be taken during a complete or partial loss of condenser vacuum.

Increasing exhaust hood temperature.

j C.

Probable Cause:

Low condenser hotwell level.

j D.

Immediate Operator Action:

If condenser pressure decreases below 9.5" Hg abs. and the turbine has not tripped automatically, manually trip the turbine.

Trip the reactor if power is greater than 10%.

Another version of this type of question is what has been dubbed by one examiner as the "apples-apples-oranges-bananas" phenomenon, in which two answer options are related to the same topic, but the other two options deal with different topics. Not only does this type of question suffer from the same problems discussed above, but it can clue the candidate that one of the two "apples" is the correct response, which is usually the case.

(4) Define the question, task or problem in the stem of the question.

Include as much necessary information about the problem or situation in the stem, leaving only the solution, action or effect for the answer options.

Poor:

At 50% power:

A. The equilibrium xenon reactivity worth is approximately equal to the equilibrium xenon worth at 100% power.
B. The equilibrium xenon reactivity worth is approximately one-half the equilibrium xenon worth at 100% power.
C. The equilibrium xenon reactivity worth is approximately two-thirds the equilibrium xenon worth at 100% power.
D. The equilibrium xenon reactivity worth is approximately three-fourths the equilibrium xenon worth at 100% power.

Better:

The equilibrium xenon reactivity worth at 50% power is approximately ____ the equilibrium xenon reactivity worth at 100% power:

A. equal to
B. one-half
C. two-thirds
D. three-fourths

(5) When possible, avoid using negatively stated stems.

If a negative stem is necessary, highlight the negative word (e.g., not, never, least).

It is very tempting to write negatively stated questions, since they can be constructed by picking three true statements out of the reference material and changing a fourth statement to make it false.

However, studies have shown that examinees do not do as well on negatively stated questions, either because they overlook the negative word and/or because negatively stated questions require examinees to pick an answer that is not true or characteristic, which can be somewhat confusing.

In addition, these questions tend to emphasize negative learning.

For example, consider the following stem of a multiple-choice question:

During 100% power operation, the feedwater 2A high level dump valve opens inadvertently.

The condensate pumps will not do which of the following:

This stem can be made to read positively:

During 100% power operation, the feedwater 2A high level dump valve opens inadvertently.

The condensate pumps will:

A. increase flow to maintain feedwater flow rate.
B. trip due to a runout condition.
C. not respond.
D. trip due to low suction pressure.

There are times when a negatively stated question is unavoidable.

However, never use a negatively stated stem with a negatively stated answer option.

Which of the following indications would not be expected and might indicate an instrument failure?

A. The CRD "travel" lamp does not indicate when group 8 rods are in motion.

B. Group 7 out-motion is prevented past 91.4%.

C. When you depress the "CRD travel in" lamp test pushbutton, the "CRD travel out" lamp comes on.

Examiners' Handbook 4-10


D. During a transfer of a group from DC hold to auxiliary, when you select "SEQ-OR," the "SEQ-OR" lamp is on and the "SEQ" lamp goes off.

Notice how confusing answer option A is in combination with the question stem.

(6) Provide sufficient counterbalance in questions with multi-part answers.

Multiple-choice questions can legitimately contain multi-part answer options.

However, if the answers contain too many parts and/or too many options for each part, cues indicating the correct answer may be unavoidable.

Consider the following example:

The RCS is in hot standby with no reactor coolant pumps running.

If OTSG pressure is decreased, according to the plant verification procedure, which of the following temperature responses best indicates the presence of natural circulation?

A. T-H increases, T-C remains the same.

B. T-H increases, T-C decreases.

C. T-H decreases, T-C decreases.

D. T-H remains the same, T-C decreases.

The candidate could choose the correct answer (C) without knowing about the T-C temperature response in this situation, since "T-H decreases" only occurs in option C.

A better way to test for knowledge of both T-H and T-C could be:

The RCS is in hot standby with no reactor coolant pumps running.

If OTSG pressure is decreased, according to the plant verification procedure, which of the following temperature responses best indicates the presence of natural circulation?

Select:

(A) increases  (B) decreases  (C) remains the same

T-H: ____
T-C: ____

The above question tests the candidate's knowledge of T-H and T-C independently.

Notice that two part answers, with each part containing a two-option response, provides complete counterbalance, since all contingencies can be covered in four responses.

For example:

Which of the following is a definition of quadrant power tilt ratio (QPTR)?

A. Minimum upper detector output divided by average upper detector output.
B. Maximum upper detector output divided by average upper detector output.
C. Minimum upper detector output divided by average lower detector output.
D. Maximum upper detector output divided by average lower detector output.

A multi-part question which is highly recommended is one in which the two-part answer options consist of a two-level response (e.g., yes/no; off/on) and a reason.

For example:

Which of the following best describes the behavior of equilibrium xenon reactivity over core life?

A. It decreases, because of the increased fuel burnup.

B. It decreases, because of the decrease in plutonium-xenon yield.

C. It increases, because of the increase in thermal flux.

D. It increases, because of the decrease in boron concentration.
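The complete-counterbalance point above can be checked by enumeration: with two answer parts and two responses per part, the Cartesian product yields exactly four contingencies, so the four options give no positional clue about either part. The sketch below uses the QPTR item's two parts and is illustrative only.

```python
from itertools import product

# The two independent parts of the QPTR options: which detector reading
# goes in the numerator, and which average goes in the denominator.
numerators = ("minimum upper detector output", "maximum upper detector output")
denominators = ("average upper detector output", "average lower detector output")

# Every contingency appears exactly once among the four options, so
# knowing one part eliminates only half of the options, never three.
options = list(product(numerators, denominators))
```

The same enumeration habit exposes under-counterbalanced items like the T-H/T-C "Poor" example, where one response value appeared in only a single option.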

(7) When possible, include common misconceptions as distractors.

Since the purpose of the licensing examination is to differentiate between competent and less-than-competent candidates, a good source of questions involves topics in which there are common misconceptions about important K/A topics.

For example, the following question was based upon a common misconception about loss of subcooling margin:

During a LOCA with a resultant loss of subcooling margin, why are the reactor coolant pumps (RCPs) secured?

A. To prevent pump damage resulting from operation under two-phase conditions.

B. To prevent core damage resulting from separation upon subsequent loss of RCS flow.

C. To reduce RCS pressure by removing the pressure heat developed by the RCPs.

D. To remove the heat being added to the RCS by the operating RCPs.

Examiners' Handbook 4-12

(8) Make all answer options homogeneous and highly plausible.

On a loss of condenser circulating water intake canal, the upper surge tank, hotwell, and condensate storage tank will supply sufficient feedwater to allow decay heat removal for approximately:

Poor:             Better:
A. 15 minutes     A. 8 hours
B. 8 hours        B. 24 hours
C. 48 hours       C. 48 hours
D. 3 months       D. 72 hours

Develop distractors that are similar enough to be chosen by those who do not meet the testing objective, yet different enough so that they do not test trivial issues or distinctions.

(9)

If the answer options have a logical sequence, put them in order (as in # 8 above.)

{

(10) Avoid overlapping answer options.

The SPND uses rhodium which decays with a half-life of 42 seconds.

How long will it take for a detector to indicate approximately 95% of an instantaneous power level change?

Poor:

Better:

A. 2-4 minutes A. 1-2 minutes B. 4-6 minutes B. 3-4 minutes C. 6-8 minutes C. 5-6 minutes D. 8-10 minutes D. 7-8 minutes (11) Do not include trivial distractors with more important distractors.

In the search for distractors, it is very tempting to include relatively trivial facts along with options focused on more important issues or concepts.

For example:

Which of the following is true concerning the turbine?

A. The turbine is rotated at low speed when shut down in order to prevent distortion of the turbine casing.

B. Turbine eccentricity is the measure of turbine speed.

C. The turbine blades are cooled by hydrogen gas.

D. Tech Specs require at least one turbine overspeed protection system be operable in Mode 2.

Examiners' Handbook 4-13

l Relative to the other options, C could be considered a trivial distractor.

Even if included as a wrong answer, the inclusion of relatively unimportant information jeopardizes the content validity of the question.

(12) Check your questions over for " specific determiners," which give clues as to the correct answer.

4 Specific determiners include:

(a) distractors which do not follow grammatically from the stem.

I During 100% normal power operation a single steam flow element to the steam generator feedwater fails high.

The steam generator j

feedwater control system will cause-

?

A.

the feedwater valves to increase steam generation level slightly before returning the level to normal.

B.

before returning the level to slightly above normal, the feedwater valves to increase the steam generator significantly.

C.

the feedwater valves to increase the steam generator level to the level of a reactor trip.

D.

the feedwater valves to increase the steam generator level slightly and maintain the increased level.

1 1

Improved Distractors:

A.

the feedwater valves to increase steam generator level slightly before returning the level to normal.

B.

the feedwater valves to increase the steam generator level significantly before returning the level to slightly above normal.

C.

the feedwater valves to increase the steam generator level to the level of a reactor trip.

D.

the feedwater valves to increase the steam generator level slightly and maintain the increased level.

(b) options which can be judged correct or incorrect without reading the stem; (c) equivalent and/or synonymous options, which rule out both options for a candidate who recognizes the equivalence; (d) an option which includes another option (for example:

A) less than 5; B) less than 3...);

(e) implausible distractors; Examiners' Handbook 4-14

1 l

(f) a correct answer which is longer than the distractors, unless longer options occur as the correct answer with equal frequency; (g) qualifiers in the correct answer (e.g., probably, ordinarily, etc.) unless they are also used in the distractors; (h) words such as "never" or "always" which suggest a wrong option; (i) a correct option that differs from the distractors in favorableness, style, or terminology.

For example:

Which of the following actions or occurrences is likely to cause water hammer?

A. Maintaining the discharge line from an auto starting pump filled with fluid.

B. Water collecting in a steamline.

C. Pre-warming of steam lines.

D. Slowly closing the discharge valve of an operating pump.

In the above question, all options except for B (the correct answer) describe preventive actions.

However, option B describes a condition that occurs as a result of negligence or oversight.

A test-wise candidate need only know that water hammer is not a desired occurrence to determine that B is the least favorable and therefore the correct answer.

(13) Vary the location of the correct answer; avoid a pattern.

Make sure the position of the correct answer is randomized throughout the set of multiple-choice questions.

This means that the "A", "B", "C" and "D" options should be correct about an equal number of times but in no specific order.
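Randomizing and auditing key positions is easy to automate. The helpers below are a hypothetical sketch (not an NRC tool): one shuffles a question's options while tracking the key, the other tallies key positions across an exam so no letter dominates.

```python
import random
from collections import Counter

def shuffle_options(options, correct_index, rng=random):
    """Shuffle answer options; return (shuffled_options, new_key_index)."""
    order = list(range(len(options)))
    rng.shuffle(order)
    shuffled = [options[i] for i in order]
    return shuffled, order.index(correct_index)

def key_distribution(key_indices):
    """Tally how often each position (0=A, 1=B, ...) holds the correct
    answer, to confirm that no position dominates the answer key."""
    return Counter(key_indices)
```

Reviewing the tally from `key_distribution` before finalizing an exam catches accidental patterns such as a long run of "C" answers.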

4.6.2 True-False items

True-false items present the candidate with a statement requiring a dichotomous response (e.g., true or false, right or wrong, yes or no).

They are best suited to test content areas that are precise and clearcut.

Construct true-false questions according to the following guidelines:

(1) Be sure that the item is either completely true or completely false.

True or False:

Verifying that the rod bottom light is on is the correct way of ensuring that a control rod is fully inserted.

(true)

Examiners' Handbook 4-15

Although true in the large majority of instances, someone could argue that other methods of verifying control rod insertion are appropriate if, for example, the light bulb is out and/or the limit switch doesn't work.

(2) Don't make an essentially true item false on a technicality or by simply inserting the word "not" in the statement.

True or False:

To prevent pressurizer heater burnout, the heaters will automatically de-energize when level falls to 30" and will not re-energize until the heater cutout reset switch is activated. (false)

This statement is false solely because the correct setpoint is 40".

This type of question tests more for the candidate's ability to attend to detail than to actual knowledge about the topic.

(3) Avoid negative terms (e.g., not, never, nothing), especially double negatives.

Poor:

One should not work on any electrical or radio equipment if he is not sure that no circuits are energized.

Better:

Before working on any electrical or radio equipment, one should be certain that all circuits are de-energized.

(4) Avoid specific determiners such as "always" and "never," as well as qualifiers such as "in general" and "usually."

The first set of terms typically indicates a false statement; the second set often indicates a true statement.

It is always good operating practice to reduce power slowly if an LCO has been violated.

(False)

In general, it is good operating practice to reduce power slowly if an LC0 has been violated.

(True)

In the first example, the word "always" clues the candidates that the answer is false, since there are typically exceptions to every rule or statement of fact.

In the second example, "in general" can clue candidates that the correct response is true, since that conditional term acknowledges that there are exceptions to the statement.

(5) Do not make true statements noticeably different (e.g., longer) from false statements.

4.6.3 Matching Items

With matching items, a candidate is required to match each word, sentence, or phrase in one column with a word, sentence, or phrase in another column.

The items in the first column are called premises.

The answers in the second column are called responses.

Consider the following guidelines when constructing matching items.


(1) Give specific directions for each matching item, indicating on exactly what basis the matching is to be done.

(2) State in the directions whether a response can be used only once or more than once.

(3) Only use logically related material in a single matching item.

Nothing should be listed in either column that is not part of the subject in question.

(4) Consider the following possibilities for pairing premises and responses:

Premise                    Response

terms or words             definitions
short questions            answers
symbols                    proper names
causes                     effects
principles                 situations in which they apply

(5) For any single matching item, use only one set of premise-response types, e.g., only terms with definitions.

(6) Each response should be a plausible answer for each premise.

Match the alarm (letter) with its appropriate tone (number). Responses (numbers) may be used more than once.

a. fire                            1. pulse tone
b. reactor building evacuation     2. wailing siren
c. site evacuation                 3. steady tone

(7) Present an unequal number of premises and responses, or allow responses to be used more than once.

This will inhibit the candidate from obtaining clues to a correct answer through a process of elimination.

(8) Arrange the responses in a logical order, such as alphabetical or numerical.

(9) Generally, examinees should be required to make at least 5 and not more than 12 total responses to one matching item.

(10) Place the entire matching item on one page so the candidate need not flip back and forth between pages.

(11) Before scoring a matching item, list all possible responses you will accept as correct for each premise.


(12) If a premise has more than one response, assign points for each premise-response pair individually, not for the matching item as a whole.
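Guideline (12) above calls for pair-by-pair credit rather than all-or-nothing scoring. A minimal sketch of such a scorer follows; the function, the 0.5-point weight, and the alarm-to-tone pairings are hypothetical illustrations, not handbook values (note this sketch gives no penalty for extra incorrect responses):

```python
def score_matching_item(answer_key, candidate_answers, points_per_pair=0.5):
    """Score a matching item pair by pair: each correctly matched
    premise-response pair earns credit individually."""
    earned = 0.0
    for premise, correct_responses in answer_key.items():
        given = candidate_answers.get(premise, set())
        # Credit each correct response the candidate supplied for this premise.
        earned += points_per_pair * len(given & correct_responses)
    return earned

# Hypothetical key for the alarm/tone example (pairings are illustrative).
key = {"fire": {1}, "reactor building evacuation": {2}, "site evacuation": {2}}
answers = {"fire": {1}, "reactor building evacuation": {3}, "site evacuation": {2}}
score = score_matching_item(key, answers)  # 1.0: two of three pairs correct
```

Because each pair is scored independently, a candidate who matches two of three premises correctly still receives partial credit, as the guideline intends.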


Table 4.1 Considerations for Selecting Type of Written Examination Item to Test K/As

Short Answer items are useful for:

    Integration of K/As, systems, events, effects
    Information required to be recalled in a precise form

    Advantages:
        Can test for recall (rather than recognition) of information
        Can measure global approach to problems
        Allows multiple ways to respond correctly
        Relatively easy to write
        Answer must be known; cannot guess

    Disadvantages:
        Longer time to answer; therefore fewer K/As tested, resulting in lower test reliability
        More susceptible to vagueness in question and misinterpretations
        Time-consuming and sometimes difficult to grade
        Difficult to grade reliably

Completion items are useful for:

    Simple recall
    Knowledge of terminology, formulas, calculation

    Advantages:
        Can test recall of facts
        Answer must be known; cannot guess

    Disadvantages:
        Scope of K/As tested limited to simple word or phrase
        Can be difficult to score unless only one word or phrase is the correct answer

True-False items are useful for:

    K/As with a definitely right answer
    K/As involving cause and effect
    K/As with only two alternative answers (yes/no)

    Advantages:
        Short response time--can test many K/As, therefore higher test reliability
        Can test facts or reasoning
        Easy, objective scoring
        Candidates not penalized for poor writing skills

    Disadvantages:
        Encourages fact memorization
        50% chance of getting answer correct by chance
        Prone to trivial or ambiguous wording
        Difficult to write statements that are true or false in all situations
        Not well-suited to K/As without clear-cut answers

Multiple-Choice items are useful for:

    K/As involving recognition of correct answer (vs. recall)
    Simple factual K/As or more complex reasoning

    Advantages:
        Can test many K/As with more questions, therefore higher test reliability
        Wrong answers can help diagnose weaknesses
        Can test many levels of K/A knowledge depth
        Easy, objective scoring
        Candidates not penalized for poor writing skills

    Disadvantages:
        Difficult to construct
        Difficult to generate good "distractor" answer-options
        Requires recognition rather than recall

Matching items are useful for:

    Associations, relationships within homogeneous K/A material that is related by concept or objective

    Advantages:
        Can test many K/As in short time and space
        Easy, objective scoring
        Good test of associative thinking
        Candidates not penalized for poor writing skills

    Disadvantages:
        Limited to homogeneous material
        Hard to control overlapping items
        May encourage memorization
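Table 4.1 notes the 50% chance of guessing a true-false item correctly. The effect of guessing on a whole section can be quantified with the binomial distribution; the sketch below is illustrative only, and the 20-item section length and 70% passing cutoff are hypothetical values, not handbook requirements:

```python
from math import comb

def prob_at_least(n, k, p=0.5):
    """Probability of at least k correct answers on n true-false
    items answered purely by guessing (per-item success probability p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# On a hypothetical 20-item true-false section, a pure guesser reaches a
# 70% cutoff (14 or more correct) with probability:
p_pass = prob_at_least(20, 14)  # about 0.058
```

Even so, the nonzero pass probability for a pure guesser is one reason Table 4.1 pairs true-false items' efficiency with caution about guessing.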

5 EXAM DEVELOPMENT CHECKLISTS

5.1 Constructing and Scoring Written Examinations

The following checklist presents points that should be incorporated when constructing examination items from the K/As for the written portion of the licensing examination.

5.1.1 General Guidance

(1) Does the concept being measured have a direct, important relationship to the ability to perform the job?

(2) Does the question match the testing objective and intent of the K/A?

(3) Is the item clear, concise, and easy to read? Could it be stated more simply and still provide the necessary information? Can it be reworded or split up into more than one question?

(4) Does the question provide all necessary information, stipulations, and assumptions needed for a fully correct response?

(5) Is the question written at the highest appropriate level of knowledge or ability for the job position of the candidate being tested?

(6) Is the question free of unnecessary difficulty or irrelevancy?

(7) Is the question limited to one concept or topic?

(8) Does the question have face validity?

(9) Are key points underlined?

(10) Is each question separate and independent of all other questions?

(11) Are the less difficult questions at the beginning of each section?

(12) Have your questions been reviewed by others?

Examiner Standards ES-107 and Attachment 1, "Written License Examination Quality Assurance Checkoff Sheet," contain additional areas for review.

5.1.2 Short Answer Questions

(1) Is there one short, definitely correct answer for each item?


(2) Does the scoring key follow directly from the question?

(3) Are clues to the answer avoided?

(4) Is the required degree of precision specified?

5.1.3 Completion Questions

(1) Is the statement sufficiently clear and complete that there can be no doubt as to its meaning?

(2) Is there one, short definitely correct answer?

(3) Are there a minimum number of blanks in the question?

(4) Does the blank come at the end of each completion question, if possible?

(5) Are clues to the answer avoided (e.g., grammatical clues)?

(6) Are the words omitted something other than verbs?

(7) Are only important, need-to-know words omitted?

5.1.4 Multiple-Choice Items

(1) Does the question have one focused topic, making it something other than a collection of true-false items?

(2) Is as much information as possible included in the stem?

(3) Is the question or problem defined in the stem?

(4) Are tricky or irrelevant questions avoided?

(5) Are the answer options homogeneous and highly plausible?

(6) Are "none of the above" and "all of the above" avoided?

(7) Are there an appropriate number of options for each question?

(8) Is each item stated positively, unless the intent is to test knowledge of what not to do?

(9) Is the question free of "specific determiners" (e.g., logical or grammatical inconsistencies, incorrect answers which are consistently different, verbal associations between the stem and the answer options)?

(10) Are common misconceptions used as distractors?

(11) Are the answer options of the items ordered sequentially?

(12) Is the question free of trivial distractors?


5.1.5 True-False Items

(1) Is the item completely true or completely false in all circumstances?

(2) Are tricky or irrelevant questions avoided?

(3) Is each item stated positively?

(4) Are clues to the answer avoided (e.g., grammatical clues, length of item)?

(5) Are conditional terms (e.g., always, could, generally, never, only) avoided?

5.1.6 Matching Items

(1) Are tricky or irrelevant questions avoided?

(2) Is there a clearly correct answer or answers to the item?

(3) Are clues to the answer avoided (e.g., grammatical clues, response patterns)?

(4) Do the directions clearly tell the candidates the basis on which to make the match and how to indicate their answers?

(5) Do the directions tell whether responses can be used more than once?

(6) Is each response a plausible answer for each premise?

(7) Are there more responses than premises if each response can only be used once?

(8) Are there 5-12 total responses required for the item?

(9) Are the responses arranged in a logical order and all on one page?

(10) Is the item arranged so that the candidate can mark answers easily and so the item can be scored efficiently?


Please forward any comments, recommendations, or modifications to:

    Operator Licensing Branch
    Office of Nuclear Reactor Regulation
    U.S. Nuclear Regulatory Commission
    Washington, D.C. 20555
