ML040370507
Operator Licensing Meetings/Workshop Summary for August 2003

Site: Ginna
Issue date: 02/05/2004
From: R. Conte, NRC/RGN-I/DRS/OSB
To: R. Watts, Rochester Gas & Electric Corp



February 5, 2004

Mr. Richard Watts
Manager, Nuclear Training
Rochester Gas and Electric Corporation
89 East Avenue
Rochester, NY 14649

SUBJECT: OPERATOR LICENSING MEETINGS/WORKSHOP SUMMARY FOR AUGUST 2003

Dear Mr. Watts:

This letter and its enclosures document the NRC support for, and the information discussed or obtained at, the jointly sponsored NRC-MANTG (Middle Atlantic Nuclear Training Group) conference in August 2003, related specifically to the Operator Licensing area. This summary also reflects answers to questions that came up during the conference interactions in the breakout sessions. Enclosure 1 is a summary of the handouts and presentations distributed during the conference, which are available through the associated ADAMS accession numbers listed. Enclosures 2 and 3 reflect summary input on the conference from MANTG and the NRC staff, respectively. Enclosure 4 contains the questions and answers developed primarily from the breakout sessions and elsewhere throughout the conference.

The consensus among the attendees was that, overall, the breakout sessions were properly focused and useful for participants. The sessions dealt with a mix of issues involving Operator Licensing in general, current practices and planned changes in the initial examination process as reflected in draft Revision 9 to NUREG-1021, Power Reactor Examination Standards, and the simulator rule implementation. More specific details by way of summary are addressed in Enclosures 2 and 3.

The NRC will continue to work with the industry to address these implementation issues and needed improvements in the licensing of operators at your facilities. I appreciate the continued support, and the active participation of all the members of MANTG during the workshop and throughout the year. I apologize for the delay in getting this summary issued, but we wanted to ensure good answers to the questions raised, and part of our process was to obtain concurrence from the operator licensing program office in the Office of Nuclear Reactor Regulation. If any important issue discussed at the conference arises that was not addressed in this documentation, please let me know as soon as possible, especially if it relates to Revision 9 of the Examination Standards.

If you have any questions, comments, or need additional information, please contact me at (610) 337-5183 or by email at rjc@nrc.gov.

Sincerely,

/RA/

Richard J. Conte, Chief
Operational Safety Branch
Division of Reactor Safety

Enclosures:

1. Handout and Other Presentation Material Cross Reference to ADAMS File Nos.
2. MANTG Input to Operator Licensing Workshop Feedback Summary
3. NRC Staff Input to Operator Licensing Workshop Feedback Summary
4. Breakout Sessions - Questions and Answers

cc w/Encls: (VIA E-MAIL)

J. Sickle, Calvert Cliffs (Internet: Julie.A.Sickle@constellation.com)

F. Maciuska, Ginna (Internet: Frank_Maciuska@rge.com)

Distribution w/Encls:

Region I Docket Room (with concurrences)

Nuclear Safety Information Center (NSIC)

W. Lanning, DRS
R. Conte, DRS

Distribution w/o encls:

H. Miller, RA
J. Wiggins, DRA

Distribution w/encls: (VIA E-MAIL)

T. Quay, NRR
D. Trimble, NRR
F. Guenther, NRR
M. Ernstes, RII, DRS
R. Lanksebury, RIII, DRS
T. Gody, RIV, DRS
R. Evans, NEI

DOCUMENT NAME:

g:\osb\conte\mantg2003\Aug2003Conference\2003AugustMANTGReportLetter_SummaryREV 1.wpd

After declaring this document An Official Agency Record it will/will not be released to the Public.

To receive a copy of this document, indicate in the box: "C" = Copy without attachment/enclosure, "E" = Copy with attachment/enclosure, "N" = No copy.

OFFICE  RI:DRS     RI:DRS    NRR               RI:DRS
NAME    CBuracker  DJackson  RJC for DTrimble  RJC for RConte
DATE    02/05/04   02/01/04  02/04/04          02/05/04

OFFICIAL RECORD COPY

Enclosure 1

August 2003 Region I - MANTG
ADAMS File Nos. for Handout and Other Presentation Material
NRC/Management Operator Licensing Exam Conference, August 25-27, 2003

Document List - HANDOUTS

Document Title                                        ADAMS Number
Miscellaneous Session                                 ML033240458
2003 Aug Management Q&A                               ML033240461
2-03 ILOTF National Workshop Minutes                  ML033240471
R. J. Conte Opening                                   ML033240473
Exam 2000 Total                                       ML033240478
Exam 2001 Total                                       ML033240486
Exam 2002 Totals                                      ML033240490
Exam 2003 Totals                                      ML033240493
JAP Welcome Management                                ML033240497
Mantg Rev 9                                           ML033240499
Non Critical Error Examples/Comments                  ML033240509
Op Test Item                                          ML033240513
Rating Factors Rev                                    ML033240517
R1-1021 Rev 9 Overview                                ML033240531
Simulator NRC-Mgt Conference Mgt at Millstone Rev 2   ML033240539
The FitzPatrick Experience                            ML033240544
J. Wiggins Opening                                    ML033240550
J. Wiggins Comments                                   ML033240557
Written Handout Examples (8/25/03)                    ML033240559
Written Presentation Examples (8/25/03)               ML033240572

Enclosure 2

August 2003 Region I - MANTG
MANTG Input to Operator Licensing Workshop Feedback Summary

Date: 08/25-08/27
Location: Millstone Training Center

Scope:

The workshop consisted of an exam writing fundamentals course and four breakout sessions covering the following:

- Simulator Rule
- Good and Bad Examples - Operating Exam
- Good and Bad Examples - Written Exam
- Potpourri - Requal/Medical/LSRO
- Fundamentals Course

Feedback:

We received 37 feedback forms from the attendees. The vast majority of the comments were extremely positive. The following are some comments that should be considered for future workshops:

1. Make more presentations in PowerPoint versus just transparencies.
2. Would have liked more time in the exam writing workshop to actually write questions for practice and critiquing.
3. All speakers should use the microphones.
4. The exam writers course should be a 1/2 day longer to allow for a question development exercise.
5. The conference should be held on a yearly basis.

Enclosure 3

August 2003 Region I - MANTG
NRC Staff Input to Operator Licensing Workshop Feedback Summary

Overall, a positive experience was noted based on staff participation, observations, and verbal reports, along with a quick scan of public meeting feedback forms.

Positive Notes:

1. NRC staff was forthright, open, and honest in answering industry questions related to operator licensing, while confirming program office policy and acknowledging areas of uncertainty.
2. FitzPatrick reported the Rev. 9 pilot as an overall positive experience with Region I staff, even with some uncertainty on how the simulator test was to be graded. NRC grading was fairly consistent with how they were unofficially keeping score related to noncritical errors.
3. Susquehanna reported the Rev. 9 written exam as more operationally oriented and efficient; FitzPatrick echoed the efficiency point.

Areas to Work On/Feedback from Industry:

1. K/A catalog outdated - how do the facility JTA and learning objectives fit into the initial licensing process?
2. Cloud of mystery surrounding noncritical errors for the Rev. 9 simulator test - however, it may be too soon to criticize or be overly concerned - Region I is leading the effort to clarify guidance in this area, making the process more objective.

The branch chief cautioned industry representatives not to start taking requirements reflecting standards of excellence (or commitments from past errors/events) out of conduct of operations manuals just because the NRC may label a behavior inconsistent with these requirements as a noncritical error in the initial exam process. The following reasons were given: 1) the removal may impact past commitments to the NRC; and 2) examiners are not totally focused on administrative requirements in lieu of what is required to be done per the facility operating and technical procedures, which is the primary emphasis.

Since this is a test for an NRC license, and the operator license requires applicants to observe the facility license and procedure requirements, examiners cannot ignore non-adherence to administrative requirements - it is accumulated in the competency review. (The program office is considering that such errors may accumulate to a maximum of one noncritical error in certain competencies if there is no effect on the scenario.)

An open question remained regarding behavior that is inconsistent with how operators are trained but is not specifically listed in required facility procedures.

3. The white paper on test item repetition in requal and the NRC position to not repeat any questions from a failed exam appear to have an impact with respect to retakes; e.g., facilities like to use a future week's written exam as the retake for an individual who failed an earlier week's exam.

4. Industry anxiously awaiting changes on the new LSRO test per ES-701 and how it relates to SAT-based principles - will it be piloted - will it be a part of final Rev. 9?
5. Lots of questions/concerns related to LSRO activation and when guidance on a method acceptable to the staff for complying with the rule will be available on the web page.
6. Questions came up on how to implement draft Rev. 9 ES-501 guidance related to SRO upgrade applicants failing the RO portion (<80%) of the full SRO test (75 RO and 25 SRO-only questions).

The compilation of individual public meeting survey forms can be found in ADAMS Accession Number ML040360056.

Enclosure 4

August 2003 Region I - MANTG
Questions and Answers from Conference/Breakout Sessions

Session 1: MISCELLANEOUS OPERATOR LICENSING BREAKOUT SESSION ISSUES

Q1. To address the concern of an SRO upgrade maintaining proficiency while in class, please consider the following:

- The candidate will receive 13 weeks on shift

- Simulator training (probably more than the requal crews)

- Classroom training (more than in requal and more in depth)

- Must spend time on shift after the NRC exam to reactivate the license prior to going back on shift

I believe that the SRO program itself should qualify the candidate to maintain proficiency. This would make this a non-issue.

Answer: Comment acknowledged - The clarification to be incorporated into Revision 9 is related to an SRO upgrade applicant waiving a portion of the full test required for all SRO applicants per 10 CFR 55.43. That approach is a method acceptable to the staff on how to comply with the waiver requirements of 10 CFR 55.47 and the requalification examination requirements in 10 CFR 55.59. The only thing your scenario appears to be missing (and is addressed in our clarification) is a comprehensive audit examination. Alternate approaches may be acceptable pending more detailed review by the NRC staff.

Q2. I would like to recommend publicizing the chief examiner schedule to the facility exam teams. With Rev. 9, there will be a greater need for communication between the chief examiner and the exam team. Having the chief examiner's schedule would facilitate communicating with each other more efficiently.

Answer: The comments are acknowledged. This information is published and already in place on the Operator Licensing Web Site, with a schedule view of up to 18 months into the future.

Q3. Are there any criteria for calling an exam submittal unsatisfactory based on excessive overlap between sections of the exam?

Answer: ES-301 D.1.h states that the walk-through and simulator tests should not be redundant, nor should they duplicate material that is covered on the written examination. ES-501 E.3.a states that "If 20 percent or fewer of the test items for the submitted written examination and operating test (judged separately) required replacement or significant modification, the report will simply state that the facility's submittal was within the range of acceptability ..." Thus, the written examination test items are evaluated separately, and the evaluation criteria specified in Form ES-401-9 (LOK, LOD, psychometric flaws, job content flaws, etc.) do not specifically discuss overlap with the operating test. Given these criteria, it is not likely that questions replaced due to overlap with the operating test will be counted per ES-501 E.3.a if their development was based on a random and systematic sample plan and the overlap was deemed inadvertent. However, if the number of overlapping written test items is excessive and/or their selection was not inadvertent, then the decision to count replaced/modified written examination test items per ES-501 E.3.a will be reconsidered on a case-by-case basis.

If any JPMs are determined to be essentially identical to simulator events such that replacement or significant modification is required, these replaced or modified test items will be counted per ES-501 E.3.a, given the smaller test size, the criteria clearly delineated in ES-301 D.1.h, and the fact that the operating exam test items are NOT randomly selected.
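For illustration, the 20 percent criterion quoted above reduces to a simple threshold check. The sketch below is a hypothetical reading of ES-501 E.3.a (not NRC-issued code), applied to one exam part at a time since the written examination and operating test are judged separately.

```python
# Hypothetical sketch of the ES-501 E.3.a criterion: an exam part (written
# examination OR operating test, judged separately) is "within the range of
# acceptability" if 20 percent or fewer of its test items required
# replacement or significant modification.

def within_range_of_acceptability(total_items: int, replaced_or_modified: int) -> bool:
    return replaced_or_modified <= 0.20 * total_items

print(within_range_of_acceptability(100, 18))  # True: 18% of a 100-item written exam
print(within_range_of_acceptability(20, 5))    # False: 25% of a 20-item operating test
```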

Q4. If an operator passes an initial license exam and doesn't participate in the utility biennial exam four months later, how do you address the fact that they may not take a biennial exam within the next 24-month period? Or must they take the exam?

AND Some utilities have a procedural process/standard which serves to exempt an individual from being required to take a particular calendar year's Comprehensive Annual Requal Exam if the individual earned his NRC license within that calendar year AND it was earned/issued within 6 months of that year's Annual Requal Exam cycle. Is this practice acceptable?

Answer: In general, a new licensee should be prepared to take the requalification program testing with his or her normally assigned crew. The initial program before licensing should have kept the individual up-to-date with respect to requalification training. However, the NRR operator licensing program office is working to resolve this issue. In the interim, a facility would not incur a violation of NRC requirements by having all operators take the required requalification examinations.

Q5. Questions with one non-plausible distractor have been evaluated as unsat on LRQ examinations, and questions that test from memory [in an open-reference forum] have been counted toward the 20% of unsat questions. Is this a real problem?

Answer: Yes, tests with questions having these types of flaws should be fixed prior to administration. Any requal exam with a relatively large number of unsat test items is a real problem. All unsat test items will be discussed as observations if the number does not reach the threshold of a finding per MC 0609, Appendix I.

Q6. ES-301-3 para. 1.c requires that there should not be any duplicate items between the facility audit and the NRC exam. NOTE: Section D.1.a is referenced, and it states that this only applies to facility written exams.

This shows that the requirement is to address exam security.

This requirement introduces predictability into the NRC exam, since the candidates know that they will not see repeated operator exam items.

Why does the standard not permit separate exam teams to select JPMs without concerns for overlap, just as is permitted when the NRC writes the NRC exam?

Answer: Comment acknowledged. However, the NRC has concluded that allowing any repeat of test items by the facility authors from the applicants' recently administered audit operating tests would have a significant adverse effect on the integrity of the examination process vis-a-vis 10 CFR 55.49, given the limited sample size of the operating test and its test item selection process.

Therefore, even if separate teams develop the audit and NRC operating tests, a check is needed for overlap. The difference between the operating test and the written examination lies in the size of the data base from which test items are selected and the nature of the selection process. In the written examination, assuming instructions are followed, the selection process should be systematically random over a relatively large universal data base. Such characteristics are not practiced or not available for the operating test. Because of the higher degree of bias in the operating test, the check for duplication reflects sound testing practices.

Q7. Will the Rev. 9 simulator grading criteria (non-critical errors, revised competencies) also apply to Requal Exams?

AND Do you anticipate the concept of non-critical errors in RLO assessment to be incorporated in the Requal Licensing Standard?

Answer: For Revision 9, there are no plans to update the ES-600 series with the specific details of the revised ES-300 series simulator grading methodology. However, be advised that any Individual Operating Evaluations conducted according to ES-604 E.2 can use the competency grading sheets from ES-303, as appropriate.

However, it is also important to note that the Crew Simulator Evaluation grading criteria for failing grades of 1 will not change; i.e., per Form ES-604-2, all rating factor grades of 1 must be linked to the performance of at least one critical task.

Q8. Why can't the exam be withheld from public disclosure (ADAMS) for 18 months to allow the facility to use the prior NRC exam for the facility audit for a subsequent class?

Answer: Openness-in-government principles dictate that all material be released to the public unless it meets the rules governing private or proprietary information. Keeping an exam from the public for future audit use does not meet these rules.

Q9. What is the new date/time frame for the 4th GFE exam? The draft of Rev. 9 says that the GFE will be given 4 times starting in 2004, but we do not have that date yet! For planning purposes this date would be a big help.

Answer: The 1st Wednesday after the 1st Sunday of the last month of each quarter: March 10, 2004; June 9, 2004; September 8, 2004; and December 8, 2004.
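As a cross-check on the stated rule, the following minimal Python sketch (illustrative only, not part of the NRC guidance) computes the date for each quarter and reproduces the four 2004 dates above.

```python
# Illustrative sketch of the stated scheduling rule: the GFE falls on the
# first Wednesday after the first Sunday of the last month of each quarter.
import datetime

def gfe_date(year: int, quarter: int) -> datetime.date:
    d = datetime.date(year, quarter * 3, 1)  # last month of the quarter
    while d.weekday() != 6:                  # advance to the first Sunday
        d += datetime.timedelta(days=1)
    return d + datetime.timedelta(days=3)    # the Wednesday three days later

print([gfe_date(2004, q).isoformat() for q in (1, 2, 3, 4)])
# ['2004-03-10', '2004-06-09', '2004-09-08', '2004-12-08']
```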

Q10. Written exams may exceed 24 months as long as the operator takes a comprehensive requal written exam during each 24-month requal program.

I think this is trying to say the 24 month program may exceed 24 months but each operator must complete a comprehensive requal written exam before the end of the 24 month program. Is this correct?

Answer: As we understand the question, the answer appears to be no. Per 10 CFR 55.59(a)(1), each licensed operator must successfully complete a requalification program developed by the facility licensee and approved by the Commission. This program shall be conducted for a continuous period not to exceed 24 months in duration. However, for certain individual licensed operators, plant operating schedule adjustments may cause the testing interval between successive comprehensive requalification written examinations to exceed 24 months. This is acceptable, as long as each licensed operator takes a comprehensive requalification written examination during each 24-month requalification program. Therefore, an operator's interval between successive examination dates may exceed 24 months for requalification programs that are defined with a duration of 24 months. However, some Region I facilities have 12-month requalification programs, which means a comprehensive written examination needs to be completed within that program time frame, although an operator's anniversary test date may similarly exceed 12 months.
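A minimal sketch of the timing logic described above (our illustration, not from NUREG-1021): compliance is judged per defined requalification program period rather than per operator anniversary, so an operator's personal interval between successive exams may exceed 24 months. The period dates below are hypothetical.

```python
# Illustrative check: each operator must take one comprehensive written exam
# within each defined requalification program period, even if the interval
# between that operator's successive exams exceeds the nominal 24 months.
import datetime

def one_exam_per_program(program_periods, exam_dates) -> bool:
    return all(any(start <= d <= end for d in exam_dates)
               for start, end in program_periods)

# Hypothetical back-to-back 24-month program periods.
periods = [(datetime.date(2000, 1, 10), datetime.date(2002, 1, 4)),
           (datetime.date(2002, 1, 5), datetime.date(2003, 12, 31))]
# Exams roughly 25 months apart, but one falls in each period: acceptable.
exams = [datetime.date(2001, 11, 1), datetime.date(2003, 12, 1)]
print(one_exam_per_program(periods, exams))  # True
```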

Q11. Is there any requirement/guidance/expectation on the proximity of the biennial written exam and the annual operational exam? (e.g. Is it expected that an annual operational exam be conducted near the end of the biennial period?)

Answer: No. The annual operating test may be given any time in the calendar year to each licensed operator and senior operator. Notwithstanding the liberal definition of annual in Appendix F of NUREG-1021, we encourage facility licensees to conduct their annual operating tests at approximately 12-month intervals (i.e., at the midpoint and end of their 24-month requalification training cycles). Furthermore, the operating test (scenarios and JPMs) must be comprehensive and conducted in accordance with the facility licensee's approved, SAT (systematic approach to training) based training program. The comprehensive written examination should be given at the end of the requalification program in order to determine licensed operators' and senior operators' knowledge of subjects covered in the requalification program ... according to 10 CFR 55.59(c)(4)(ii).

Q12. Please disseminate a brief synopsis from the NRC's perspective as to what caused the recent event at Dresden, where NRC-licensed personnel were not comprehensively examined within the time requirements delineated in NUREG-1021. Additionally, please describe the corrective actions which were implemented and what type of NRC finding was issued.

Answer: On July 1, 2002, the licensee identified that 54 licensed operators did not meet the requalification examination requirements of 10 CFR 55.59. A comprehensive written examination for the 24-month requalification period defined by the licensee as January 10, 2000, through January 4, 2002, was not administered to the operators by the station training department personnel within the time frame required by 10 CFR 55.59, causing 54 licensed operators to not be in compliance with 10 CFR 55.53(h) on January 5, 2002. All licensed operators successfully completed a comprehensive written examination by July 17, 2002.

On July 31, 2002, the licensee identified that 28 licensed operators did not meet the requalification examination requirements of 10 CFR 55.59(a)(1) and (a)(2).

A comprehensive written examination for the 24-month requalification period defined by the licensee as January 30, 1998, through January 30, 2000, was not administered to the operators by the station training department personnel within the time frame required by 10 CFR 55.59(a)(1) and (a)(2), causing 28 licensed operators to not be in compliance with 10 CFR 55.53(h) on January 31, 2000.

All licensed operators successfully completed a comprehensive written examination by February 21, 2000.

On August 25, 2002, the licensee identified that 10 licensed operators did not meet the requalification examination requirements of 10 CFR 55.59(a)(1) and (a)(2). An annual operating test was not administered to 10 licensed operators during calendar year 2001. The 10 licensed operators successfully completed an operating test on January 4, 2002.

The failure of the licensee to ensure that the Dresden licensed operators maintained their licenses as required by USNRC regulations resulted in the NRC having to issue 47 Notices of Enforcement Discretion (NOED) letters to individual operators, as well as seven additional letters to individual operators who had not been in compliance with USNRC regulations but had since returned to compliance.

Additional information on the issue can be obtained by accessing USNRC Inspection Report 50-237/02-15(DRS); 50-249/02-15(DRS) (ADAMS #ML023090405), an associated letter to the licensee dated June 3, 2003 (ADAMS #ML031560417), and NRC Regulatory Issue Summary 2003-10.

Q13 (Comment). With the requirement for 15 LSRO JPMs, it will be nearly impossible to prevent overlap between the NRC exam and audit exams.

Response: This issue was included in the industry's formal comments on Draft Revision 9 and will be addressed in the final document.

Q14. Contractors are being used for fuel movement. Consider delineating what functions the SRO in charge of fuel handling is to perform. We may need to update the tasks the fueling SROs are required to perform, and hence the license process.

AND I have a refueling SRO program and task list based on what a refueling SRO at our station has to do. Manipulating dummy bundles or any other core components is not part of the refueling SRO tasks - those fall under the fuel handlers' quals.

AND Saying that all refueling SROs have to perform JPMs such as moving fuel bundles is contrary to the SAT Process.

Answer: An analogy is evident in the licensing of ROs and SROs, with a fundamental principle embodied in that analogy. The system portion of the walkthrough for RO and SRO is normally identical because we expect SROs to be able to perform the functions of the subordinate. The regulatory basis for this is embodied in 10 CFR 55.43 and 55.45. The license issued typically states that the operator is authorized to manipulate and supervise [underline added for emphasis] the manipulations of the reactor. Moreover, pursuant to 10 CFR 55.31, SROs are required to perform 5 significant control manipulations even though they will only be supervising those activities after they obtain their license.

In the LSRO case, we expect the supervising licensed operators to perform the functions of the equipment operator; we just don't license those operators. As to the functions to be performed by an LSRO, a good Job Task Analysis should support the functions to be tested, which should include the tasks of the associated equipment operator.

Q15. When will ES-701 be issued and will it be with Rev 9?

Answer: A draft of ES-701 was issued with the Revision 9 clarification of September 2003 and is available on the operator licensing website. The NRR operator licensing program office is reviewing the industry's comments on the draft, and it will be a part of Revision 9.

Q16. Do we plan on a pilot exam for the LSRO changes to ES-701?

Answer: No.

Q17. On the topic of the SRO(U) 25-question waiver criteria: Does an SRO upgrade candidate need to be active (on shift 12 of 24 months) to apply for the waiver? All SRO(U)s in the ICT [Initial Candidate Training ???] program will go inactive before the waiver is requested.

Answer: Per the clarifications of September 2003 on the Operator Licensing web page, they do not have to be active at the time they apply, but they do have to convince the NRC that they have had extensive actual operating experience ... within two years before the date of application, in order to satisfy the 10 CFR 55.47(a)(1) criterion for a waiver. Moreover, to qualify for an upgrade, an applicant has to have at least one year of experience as an RO - hence, the waiver expectation that the applicant be active for 12 of the previous 24 months. Keep in mind that an applicant can still apply for a waiver even if he/she does not meet the criteria posted on the web page (and in ES-204); those are standard waivers that the regions can approve. If the region does not have the authority to grant the waiver, it will forward the request to NRR for review on a case-by-case basis against the requirements in 55.47.

Q18. For an SRO upgrade waiver, does the individual need to maintain proficiency as an RO, i.e., stand 5x12 or 7x8 watches during the upgrade training program?

Answer: No, an acceptable condition is if the individual was active 12 of the 24 months before applying for the waiver. Moreover, SRO upgrade applicants who are not up-to-date in the RO requalification program CANNOT stand watch anyway.

Q19. Are there any time requirements for SRO(U) 25 question waiver application? When must it be submitted?

Answer: There are no different time requirements for this waiver request. The waiver request should be submitted first with the preliminary license applications, 30 days before the examination, and then later with the final applications, at least 14 days before the examination, according to NUREG-1021, ES-201 and ES-202.

Q20. Within NUREG-1021 or within a supplement to NUREG-1021, please clearly describe the steps/process required to transition an individual from being a currently NRC-licensed SRO to becoming a currently NRC-licensed LSRO and then dropping his SRO license.

Answer: The facility licensee should provide a letter requesting a license restriction to perform duties of SRO limited to fuel handling. The NRC will then restrict his license to LSRO duties, at which point the operator would shift from the regular requalification program to the LSRO program and be subject to the LSRO proficiency requirements. The expiration date of the license would not change, and the region would issue a new LSRO license when the license is renewed.

This process was used in the past and is acceptable to the NRC staff; it is similar to the process to be used for an operator and facility licensee requesting that an SRO license be permanently changed to an RO license.

Q21. In no instance should members of the actual test population itself be used for pre-validation purposes; exposure to the very items or similar items ... would ... compromise the integrity ... of the exam ...

The example given says the exam validation team can validate an exam and then take an exam that must not have duplication of any questions from the exams that were ... validated ...

The opening statement seems to prohibit similar items. How close can the questions on the exam be to what the crew has already seen? Is a significantly modified question different enough?

AND Validation of Exams - Requal: It is not clear how to validate an exam without exposing the crew to similar test items using a common sample plan.

Answer: If a requal crew validates another crew's exam and gets any one of the questions from the other crew's exam, that would be a clear violation of 10 CFR 55.49. A significantly modified question, if truly significantly modified, would not be considered a duplicate question. Facility licensees could also explore alternative methods of validation that do not involve actual crew members, e.g., training staff who were not involved in developing the requalification examination.

Q22. Written exam item repetition [for requalification exams]: If a utility gives an annual written exam (40-50 questions) and also gives weekly quizzes with 30 questions:

1. How does the repetition percentage apply, or does it?
2. Shouldn't some consideration be given to the number of items sampled over the two-year period?

Answer: 1. Repetition as used in the pending guidance for IP 71111, Attachment 11, applies only to the comprehensive written exam at the end of the requalification period and not to quizzes used in the interim of that requal period.


2. Yes, the sample plan guidelines of NUREG-1021 are a method acceptable to the NRC staff for sampling the material to be tested in requal periods.

Q23. Written exam validation process.

1. Exam developer puts exams together
2. The validating crew takes the first exam; if the exam is valid, then a pass/fail grade counts. If everyone passes, they can validate all future exams.
3. Essentially, the first exam is not validated in advance, but it can be a valid exam if the crew comments don't challenge its validity. Then their performance counts.

Answer: Based on exam test item repetition, this appears to be an acceptable practice.

However, the scenario proposed has an aspect that is unacceptable: "If everyone passes, they can validate all future exams." This implies the test is valid if everyone passes, and validity is not based on everyone passing. The operators need to know when they are taking a test with pass/fail consequences.

A good way to test an area missed would be to substantially modify the question dealing with a particular principle.

Q24. In regard to a requal failure on a written exam, why did we look at/go towards 0% repeat questions?

Looking at the same questions (overlapped), it seems to me that the individual who failed falls into one of two categories:

1. The individual got the question(s) correct, which shows he/she knew the material; or,
2. The individual missed the question(s) and was remediated, so he/she should now know the system/procedure/interlock.

So I may be missing a key thought process, but to understand whether the individual understands the area(s) where he/she failed, a small percentage of repeats should be OK.

AND Excessive Test Item Repetition: Biennial Written Exam Retake #2) No 50% overlap between the retake and other biennial exams given this requal period. Can there be 1 makeup exam that can be used any week and would meet #2? Personnel taking the makeup could sign a statement of not telling others what was on the test. The issue is that someone failing in week 1 could retake in week 3, but someone who failed in week 5 would also need a whole new exam. I'd rather make 6 exams with 1 being the retake exam.

AND Excessive Test Item Repetition - Example 2.1 allows no overlap of exam questions to be included in the makeup exam for an individual failure. This requires a new exam to be prepared vs. the current practice that allows the individual to be re-examined by taking a different written exam from another requal exam. Although these different exams meet the less-than-50% overlap, they may contain a few of the same questions that may be on the exam the individual failed. This new requirement will require a new exam to be written since 0 repeat questions will be allowable.

AND Comment Concerning Biennial Written Exam Retake: "There must be no duplication of questions from the failed exam." This seems to be overly burdensome. Normally, if someone failed the written exam during week one and was remediated and ready to retake during week 3, this individual would take week 3's exam. This exam may have repeat questions from week 1, but is greater than 50% different.

AND White Paper Example 2: Clarify the definition of requalification period. Example: If I have a failure in week 1 of the cycle and I wish to remediate and re-examine in week 3, the 50% overlap for this requalification period would only cover weeks 1 and 2. However, if the re-examination will not occur until week 6, then the requalification period would be the first 5 weeks?

AND The potpourri breakout session handout states that for retake exams, "There must be no duplication of questions from the failed exam." I'd like the NRC to consider changing this from "no ... questions" to allow 10% repeat questions. This would potentially allow another existing exam to be used as the remedial exam, significantly reducing the administrative burden, and yet still providing a retake exam that is significantly different from the original exam.

Answer: Forcing operators not to talk with one another from day to day when they are released from a facility is very difficult for facilities to control and enforce. Further, feedback directly from the industry, including licensed operators, suggests that signing these agreements and not being able to talk to fellow participants tends to add excessive stress to the process. The amount of duplication is only a trigger for further review to ensure exam security and integrity. A good way to test an area missed would be to substantially modify the question already used dealing with the particular principle tested, in order to avoid any duplication.

Regarding the 0% reuse from the failed examination: Reusing the same items (missed or correct) from the originally failed requalification examination on the retake examination is a flawed practice, since re-exposure to the exact items would falsely bias the examination results upward, inflating and distorting true retake performance. Furthermore, including any of the same items on the retake examination amounts to little more than a review - not an examination, as commonly defined.

However, for training purposes, it is always desirable for the applicant and the facility to review the specific examination items missed on the failed examination so as to remove knowledge deficiencies. Conversely, it is never good practice to include those same items in a retake examination because the same items would have no discriminatory value -- an essential component of test validity -- due to their recent exposure.
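A minimal sketch of the two retake constraints discussed above (our illustration, not NRC-issued), modeling each exam as a set of test-item identifiers: zero reuse from the failed exam, and less than 50 percent overlap with each other biennial written exam given this requal period.

```python
# Illustrative retake check: no repeats from the failed exam, and less than
# 50 percent overlap with each other written exam given this requal period.

def retake_acceptable(retake: set, failed: set, other_exams: list) -> bool:
    if retake & failed:                      # any repeat from the failed exam
        return False
    return all(len(retake & other) < 0.5 * len(retake)
               for other in other_exams)

failed_exam = {"Q01", "Q02", "Q03", "Q04"}   # hypothetical item identifiers
week3_exam = {"Q03", "Q10", "Q11", "Q12"}    # repeats Q03 from the failed exam
print(retake_acceptable(week3_exam, failed_exam, []))  # False
```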

Q25. The test item repetition draft white paper discusses practices which appear to be an NRC requirement, not just an inspection criterion. Question: Define "pre-validation purposes."

Answer: These are not requirements; this is only guidance. Again, the amount of duplication is only a trigger for further review so as to ensure exam security and integrity. However, based on the information, the NRC may more closely review for integrity a test that does not adhere to the guidelines of the requal inspection procedure, IP 71111, Attachment 11. "Pre-validation purposes" refers to any test not given for score but given as a part of the overall preparatory review process before exam administration for score.

Q26. Re-activation IAW 10 CFR 55.53(f)(2): Stand a watch under instruction ... how long is a watch? 8 hrs? 12 hrs? This should be consistent across the nation.

Answer: One of the requirements to re-activate a license according to 10 CFR 55.53(f)(2) is that the licensee complete a minimum of 40 hours of shift functions under the direction of an operator or senior operator, as appropriate, and in the position to which the individual will be assigned. The 40 hours must have included ... all required shift turnover procedures. Since 10 CFR 55.53(f)(2) requires the 40 hours to include all required shift turnover procedures, the 40-hour requirement should be satisfied with complete shifts until at least 40 hours are accumulated. However, 10 CFR 55.53(f)(2) does not require the shifts to be 8-hour or 12-hour as required in 10 CFR 55.53(e).
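The shift arithmetic implied by the answer, as a minimal sketch (illustrative only): complete shifts are accumulated until the 40-hour minimum is met, whatever the facility's shift length.

```python
# Illustrative arithmetic: complete shifts needed to satisfy the 40-hour
# reactivation minimum of 10 CFR 55.53(f)(2) for a given shift length.
import math

def complete_shifts_needed(shift_hours: int, required_hours: int = 40) -> int:
    return math.ceil(required_hours / shift_hours)

print(complete_shifts_needed(8))   # 5 complete 8-hour shifts  (40 hours)
print(complete_shifts_needed(12))  # 4 complete 12-hour shifts (48 hours)
```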

Q27. Since no licensed operators are permitted to be exposed to similar items which they may in turn be tested on via the validation process, what is the acceptable population to perform exam validation for requalification exams, written and dynamic?

Answer: No minimum or maximum is specified. The number of personnel should be minimized. For example, if six to eight crews are tested, then doubling up crews per written test (given on the same day) may result in only 3-4 tests being generated, each completely different, so that one crew can validate another crew's test.

Section 2: INITIAL EXAM OPERATING TEST BREAKOUT SESSION ISSUES

Q1. During exam development in 2002, there seemed to be concern by the Chief Examiners for overlap of test items between the operating test (i.e., specific events in a scenario) and written exam questions.

Is there a standard related to what is considered overlap between written test items and events in scenarios?

Example: A written question related to spraying the drywell during a LOCA and a scenario containing a leak in the drywell that required drywell spray - same for ATWS written questions and scenarios - other EOP-related events.

Answer: Form ES-201-2 section 4.d requires a check for duplication and overlap among exam sections, and section 4.e requires a check of the entire exam for balance of coverage. The above sections correlate to guidance in ES-401 section D.1.e and ES-301 D.1.h. Essentially, the guidance is to avoid duplication not only between the walk-through and simulator operating tests but also between the operating test and the written examination, to ensure a balance of coverage. The key word is duplication, so if there are multiple paths to perform an activity, it would be permissible, on occasion and as a result of the random selection process, to test different paths -- one on the written test and one on the operating test, or one on the walkthrough and one on the simulator test. Multiple examples of this situation on the same initial test involving both the operating test and written examination should be questioned by the chief examiner on the principle of balance of coverage.

Q2. Would soft skills requirements for enumerating non-critical errors be germane in the walk-through portion of the exam? (i.e., how will we handle lack of OPS standards implementation on a JPM?)

Answer: The failure to adhere to a soft skill requirement in the required facility administrative procedures would most likely not be a critical step, so the administrative topic/system JPM would most likely be evaluated as satisfactory. If the failure to adhere to such a requirement is noted by direct observation or by for-cause questioning, the examiner needs to make a comment on the individual examination report, since it reflects an operating test deficiency which is required to be documented by ES-303 for an informed licensing decision. In other words, significant applicant deficiencies noted during the performance of the JPM, or as a result of follow-up questions, could result in an unsatisfactory grade for the administrative topic/system even though the applicant successfully completed the JPM's task standard, i.e., critical step(s).

Q3. Soft skills are sometimes part of good industry standards. These may be written in documents that fall outside of the procedure system (process).

Although these actions are expectations of Operations Management, they are not proceduralized. Are these still counted?

AND Grading to standards of excellence in soft skill areas vs. grading that would discriminate only if there was a consequence to the missed soft skill (i.e., it resulted in mis-operation of a component or other adverse conditions).

AND Evaluating crew or individual performance using conduct of operations manuals (human performance standards) is potentially outside of the legal and license basis requirements for a station. Can an accumulation of competency errors on an exam in several (3 or more) human performance areas lead to a failure?

Answer: The Required Operator Actions listed on Form ES-D-2 for examiner use during the simulator operating test will list only those procedure steps and actions required by facility procedures and guidance applicable to the specific event(s). Any applicant errors noted with respect to the Form ES-D-2 required actions will be evaluated as non-critical errors unless the error was a critical task. Soft skills requirements/guidelines, such as 3-way communication, do not have to be listed in the prepared material for JPMs and scenarios. Examiners should be aware of such soft skills and, if they are facility requirements and applicant performance is observed to be deficient with respect to them, then the examiner will note the deficiency in the individual examination report as a non-critical error.

Multiple non-critical errors in the same rating factor (two non-critical errors with no correct performance, or three or more non-critical errors) can result in a rating factor score of 1. Similarly, one critical task error can also result in a rating factor score of 1. However, no applicant can receive a failing grade for any competency, according to Draft Revision 9 of NUREG-1021, due to only one rating factor scored as 1. The applicant would have to perform deficiently in at least one other rating factor in the same competency to receive an unsatisfactory grade on the simulator operating test.
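A minimal sketch of the draft Revision 9 grading logic as described above (our reading, not NRC-issued code): one function scores a single rating factor, and a second applies the rule that a competency cannot fail on only one rating factor scored 1.

```python
# Illustrative sketch of the simulator grading logic described above.

def rating_factor_score(noncritical_errors: int, any_correct_performance: bool,
                        critical_task_error: bool) -> int:
    """Return 1 (failing) under the described conditions, else a passing 2."""
    if critical_task_error:
        return 1
    if noncritical_errors >= 3:
        return 1
    if noncritical_errors == 2 and not any_correct_performance:
        return 1
    return 2

def competency_unsat(rating_factor_scores: list) -> bool:
    """At least two rating factors in the competency must score 1 to fail."""
    return rating_factor_scores.count(1) >= 2

print(rating_factor_score(2, False, False))  # 1: two errors, nothing correct
print(competency_unsat([1, 2, 2]))           # False: one RF of 1 is not enough
```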

Finally, in response to industry concerns regarding the grading of so-called soft skill non-critical errors, the NRC staff outlined a proposal at the November 25, 2003, public meeting with the industry focus group (FG) on operator licensing. The proposal would require any error related to the communications and crew interaction competency, i.e., soft skills, to have a material effect on the scenario's required operator actions according to Form ES-D-2 in order to justify a failing score on any one rating factor in the competency. The draft proposal was provided for industry review and comment and is available in ADAMS (ML033520304) or on the NRC Public Website via the Operator Licensing webpage.

Q4. Is failure to comply with an expectation (such as an operations standard) graded the same as failure to comply with a station procedure?

Answer: As we understand the question, the answer appears to be yes. If the operators fail to comply with required facility guidance (an operations standard), the observed deficiencies will be evaluated as critical or non-critical errors, as appropriate. See the discussion in the preceding question. For example, if 3-way communications are required by facility operations standards for certain situations and the guidance is not followed, then an error -- critical or non-critical -- has occurred and will be documented and evaluated accordingly per NUREG-1021.

Q5. Does a candidate have to repeat and take another GFE if he passed the exam under Rev. 8 over 2 years ago, but is now in a license class under Rev. 9? The class starts in November 2003 (e.g., his new 398 will show the GFE passed over 2 years ago; he is an RO candidate attending NLO requal).

Answer: Since he started his class in November 2003, his license examination will most likely be administered after Final Revision 9 becomes effective. Final Revision 9 will include provisions for waiving the GFE if the applicant passes a randomly selected prior GFE administered by the facility licensee within two years before the date of application.

Q6. Will human performance error tool mistakes be lumped together for an aggregate error, or will errors in different areas be scored in the different areas? Example: Will an EOP place-keeping error (didn't follow admin guidelines) be a (2) in Procedure Use and errors in 3-way communications a (2) in Communications?

Answer: If place keeping is a facility requirement, then it would most likely be counted as an error in the Procedure Use competency. The aggregation of noncritical errors that have no effect on the scenario is being considered only for the Communications and Crew Interaction competency, in response to a number of comments in this area. For further clarification regarding the proposal for this competency, reference the November 25, 2003, meeting summary discussed above. See ADAMS ML033520304.

Q7. What if the utility insists that a 4th person is available specifically to do peer checks? How do we deal with this? And would he be permitted to correct a potential error?

Answer: Crew staffing during the operating test is discussed in Section D.1.j of ES-302. Applicants are tested per Technical Specification requirements. The facility licensee could only insist on a 4th person, as a surrogate operator, if the TS require that complement for shift staffing. The peer checker is permitted to correct a potential error, but the corrected action will still count as an error if the applicant intended to operate the wrong component. Also, in another form of peer check, if the SRO ordered the wrong action but was corrected by the RO, an error would be counted for the SRO.

Q8. Peer Checks - If peer checks are required, will this be noted in the scenario outline so the examiner knows and can watch for a potential non-critical error (i.e., failure to ask for and receive the peer check)?

Answer: There is no simulator rating factor that specifically requires examiners to grade applicants on the performance of peer checks. If peer checking is an administrative procedure requirement, it is not expected to be listed in the scenario guideline. Examiners should become familiar with such administrative requirements in preparation for the examination. If peer checking is specifically listed in the operating or technical procedure used to operate or test the plant for a specific scenario event, then that action is to be listed on the scenario or JPM guideline - most likely not as a critical step.

Q9. If you didn't see someone perform an action in the simulator, required or not, how it's scored is obvious. My question is, if the person is in the wrong place at the right time, i.e., not in a position to perform any more of these actions for the remainder of his/her time on the control boards, do we have to run another scenario so that person can be seen?

Answer: Yes. We anticipate that, if an examiner recognizes that his/her applicant will exceed the Not Observed RF guidelines, then an additional scenario or part(s) of a scenario will need to be administered. However, the exam should be designed to cover all RFs for each applicant. Also, please note that the NRC staff revisited an unresolved issue related to the grading of single noncritical simulator errors, which would nominally result in a rating factor score of 2 even if the applicant did nothing correct to justify the passing score. The staff addressed the issue at the November 25, 2003, public meeting with the industry FG on operator licensing. The staff proposed that the solution would be to run another scenario that would provide the data required to support a passing or failing score on the subject rating factor. The FG acknowledged the concern and had no objections to the proposed solution.

Q10. Faulted JPM - Is going to the RNO [Response Not Obtained] column in an EOP a faulted JPM, or must you go to another procedure? Has the definition of a faulted JPM changed?

Answer: No, the definition has not changed. However, please note that the term faulted JPM is a misnomer. The correct terminology according to Appendix C of NUREG-1021 is alternate-path JPM. Alternate-path JPMs are JPMs in which malfunctions occur; they provide a methodology to evaluate whether an examinee has the skills and knowledge level needed to safely operate the system. Alternate-path JPMs require the examinees to use alternate methods to perform tasks. An RNO column response can be an appropriate example of an alternate-path JPM based on the detail and nature of the required actions vis-a-vis the alternate method to perform the task (is there a basis for observation and evaluation of operator action(s) in the RNO column?).

Q11. Clarify the predictability of using post-scenario EP classifications as part of the Admin JPM exam.

Answer: If post-scenario EP classifications were routinely used for coverage of the Emergency Plan topic of the Administrative Topics Walk-Through on every examination day, then the applicants would be able to predict that an EP classification task is going to be part of their exam. A proposed solution could be to examine emergency preparedness via an administrative JPM and administer it on a single day to all the applicants. Or, conduct an EP classification on day 1 and another Emergency Plan task(s) on the remaining operating examination days in the week; e.g., a protective action recommendation could be done on day 2. Please note that each situation involving potential predictability will be evaluated on a case-by-case basis. The evaluation will likely consider the test items involved and the potential for predictability not just during the current examination week but also from the last examination at the same facility.

Q12. Does the overlap limitation between JPMs/scenarios apply to knowledge? NUREG-1021 specifically speaks to a prohibition on overlap of events, actions, and operations, but I can't find a reference to knowledge.

Answer: ES-301 D.4 requires that "The selected tasks are in addition to and should be different from the events and evolutions conducted during the Simulator Operating Test. Some tasks similar to scenario events are permitted if the actions required to complete the task are significantly different from those required to complete the scenario event(s)." Thus, the guideline applies to tasks or paths, which are mostly related to abilities, but the knowledge or understanding behind a task could also be tested. Alternate-path JPMs have the advantage of testing both; and, when there is a question on the applicant's performance, the examiner must ask for-cause type questions to ensure ability and understanding.

Q13. Where do you draw the line between Admin and System JPMs?

Answer: 10 CFR 55.45 lists the various topics for the operating test. Items (9), (10), (11), (12), and (13) are generally considered administrative topics. Per ES-301, the administrative topics are divided into four general areas: conduct of operations, equipment control, radiation control, and emergency plan. Tasks that focus on those areas would be considered administrative in nature. It is possible for system-focused tasks to include administrative requirements. Accordingly, separate JPMs may be developed on the system and administrative portions of the walkthrough test in the same exam.

Q14. Operating exam - What constitutes a 30% repeat from the last NRC exam? (i.e., an RO exam has 4 admin JPMs; is 30% equal to no more than 1 JPM repeated?) Recommend specifying numbers, not percentages.

Answer: The interim clarification that was posted on the operator licensing web page in September noted that the 30% will be applied separately to systems and admin JPMs. This issue was addressed in the formal industry comments; final Revision 9 will use numbers instead of percentages.

Section 3: INITIAL EXAM WRITTEN TEST BREAKOUT SESSION ISSUES

Q1. Overlap Issues - If a particular event is evaluated on the operating exam to evaluate performance of a particular procedure leg, is it acceptable for a written question to exist which asks about related knowledge, such as the basis for a step in the procedure?

No duplication between operating exam areas makes sense ... Written versus operating seems extreme unless the question is directly related to the scenario/JPM performance rather than supporting knowledge.

Answer: The comment is acknowledged. It is acceptable for a written question to exist which asks about related knowledge, such as the basis for a step in the procedure, while performance of the task is tested on the operating test. This type of situation should not be overused, from a balance of coverage viewpoint.

In response to another question in the operating test area, the following was given as guidance.

Form ES-201-2 section 4.d requires a check for duplication and overlap among exam sections, and section 4.e requires a check of the entire exam for balance of coverage. The above sections correlate to guidance in ES-401 section D.1.e and ES-301 D.1.h. Essentially, the guidance is to avoid duplication not only between the walk-through and simulator operating tests but also between the operating test and the written examination, to ensure a balance of coverage. The key word is duplication, so if there are multiple paths to perform an activity, it would be permissible, on occasion and as a result of the random selection process, to test different paths -- one on the written test and one on the operating test, or one on the walkthrough and one on the simulator test. Multiple examples of this situation on the same initial test involving both the operating test and written examination should be questioned by the chief examiner on the principle of balance of coverage.

Q2. Is there any movement or momentum to alleviate the whole concept of a separate GFES training course and exam? This way of doing business appears to be facilitating a general degradation in the Nuclear Fundamentals (reactor theory, thermodynamics, heat transfer and fluid flow, etc.) knowledge base of our control room crew personnel: get academics out of the way, and then get to the real important stuff to get an NRC license. (The current methodology sends the unintended message that an academic knowledge foundation is of low importance.)

Answer: No, the Generic Fundamentals Examination (GFE) will continue to be used to meet testing requirements for basic knowledge items in accordance with 10 CFR 55. It should be noted that the industry requested the separate and early administration of the GFE so that the GFE would be administered as close as possible to the completion of GFE training. The reason was to better ensure that applicant test performance would be maximized by reducing the effect of time upon forgetting. However, early administration of the GFE need not result in any degradation of GFE knowledge. A good training program would integrate GFE concepts into the balance of its site-specific instructional program.

Q4. On Wednesday, we spent a lot of time discussing K/A issues and discrepancies. I would like to recommend that we do away with the K/A catalog and trust the SAT process. I believe this would eliminate a lot of issues and streamline the process. It would be much more efficient. The issue of 2-part K/As would be gone. Having to suppress/reject K/As would be gone. Lesson plans are written with SAT-approved objectives that match the facility task list. Eliminating the K/A catalog would allow the NRC to review exams that are custom for each facility. This eliminates an extra step in the process of determining if a K/A is met. It would also make it easier for the facility to write an exam with fewer issues for the NRC to have to deal with.

If the facility maintains its accreditation, its SAT process would keep its objectives in line with the K/A catalog. If the NRC certified the facility's objectives once up front, the follow-up would be easy to maintain in the 2-year review process. Only those objectives that have changed over the 2 years would have to be looked at.

I believe all this would save the NRC and the facilities time and money and produce a better quality exam with fewer discrepancies. Would the NRC consider supporting this?

Answer: The K/A catalogs are part of the process to ensure content validity of the initial exam. The listed knowledge and abilities are fundamental to ensuring that the NRC can measure the knowledge and abilities required to operate a nuclear station while working with so many different nuclear stations. Moreover, a fundamental requirement surrounding the NRC license examination is that it be developed under uniform conditions. The common K/A catalogs (one BWR and one PWR), developed jointly by industry and the NRC, ensure that license examinations are based upon an agreed-upon body of safety-significant knowledge and abilities.

Q5. Level of Difficulty (LOD) - Consider defining this by using desktop guides for each part of the exam standard - a kind of user's guide.

Answer: Appendices A and B of NUREG-1021 already discuss this and other examination concepts and principles. The NRR operator licensing program office prefers to maintain NUREG-1021 as the sole source document for generating operator licensing examinations.

Q6. The K/A catalog needs to be updated to support site-specific SAT programs. Sample from site objectives, for example.

What, if anything, is being done with this type of comment?

Answer: There is not currently a plan to review the K/A catalog; however, there appears to be a significant amount of feedback on this topic. This needs to be evaluated for priority via the NEI/NRC licensed operator task force.

Q7. How are exam questions that have been accepted by the lead examiner, but later determined to be UNSAT by an auditor, communicated to the utility?

Answer: The NRR program office review process provides feedback to the regional examiners regarding the overall examination products and their consistency with regard to NUREG-1021 guidance. However, the feedback to the Region does not declare a test item UNSAT. The feedback is for the Region's consideration and, in some instances, indicates test items that did not clearly meet NUREG-1021 criteria. Region I evaluates this feedback for lessons learned and routinely conveys the lessons learned to the industry in meetings such as the subject NRC-MANTG Operator Licensing Workshop.

Q8. How do we input/identify UNSAT questions in the INPO exam bank?

Are audit results input to correct questions?

AND (Comment) How do we input/identify UNSAT questions in the INPO exam bank?

(Preclude selecting bad questions)

Answer: Currently there is no systematic way of doing this. Not all questions coming out of the bank should be viewed as valid and ready for testing. The INPO bank is only a tool, and questions used from the INPO bank should be carefully reviewed before use. A number of factors require further review, such as applicability to the facility and the potential for a K/A mismatch in light of the recent renewed focus on this area.

This is the industry's bank, to be used as a tool or development aid for ideas related to K/A topics. NRC examiners have attested to that value in our own development efforts, but it is not an NRC bank, nor is it a required or the only part of the development process. Some have suggested that the older vintage questions (i.e., prior to 2000) should be purged from the bank. The INPO bank lead coordinator should be informed of such recommendations in order to improve this tool.

Q9. Written Exam - SRO only outline. Is it acceptable (preferable?) to only include A.2. and G categories for the random sampling and pre-screen out all other K/As?

Answer: Yes, the (K) and (A) columns have no required minimum coverage. As noted in the Revision 9 clarification that was posted on the operator licensing web page in September 2003, this issue will be clarified in final Revision 9.

Q10. IAW ES-501, D.2.e: (1) An SRO (U) who scores <80% on the RO portion (i.e., takes the 100-question exam) but still gets 80% overall on the exam may require remediation. How will this be administered by the Regions? This is a departure from the way we've done business in the past. What level of formal documentation would the licensee need to submit with the remediation? Recommendation: the expectations for this should be clearly delineated in our standards. (2) How about if an SRO (I) scores <80% on the RO portion but scores >80% overall on the exam?

Answer: This is not a departure from past practice; refer to ES-501, E.4.b, Revision 8, Supplement 1: The NRC Regional Office should also conduct a case-specific review of the SRO upgrade applicants to determine if the applicant failed as a result of significant deficiencies in the knowledge or abilities.... If the SRO upgrade applicant's deficiencies pose such a threat, the NRC may require the facility licensee to provide remedial training.... Thus, a review of RO capabilities would be warranted if the overall failure was due to the RO section. For an instant or upgrade applicant taking the 100-point test, if an 80% is achieved overall, it is impossible to score less than 70% on the RO portion. As with any SAT process, weaknesses demonstrated on the test (for the RO, between 70% and 80%), even though passing, should be remediated before the performance of licensed duties. NRC staff has confidence in the SAT process to accomplish this.
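To illustrate the arithmetic behind that statement, here is a minimal sketch assuming the customary split of 75 RO-level questions and 25 SRO-only questions on the 100-question exam (that split is an assumption here and should be verified against the applicable exam outline):

\[ \text{minimum RO-portion score} = \frac{80 - 25}{75} = \frac{55}{75} \approx 73.3\% > 70\% \]

That is, even if an applicant with 80 correct answers overall answered all 25 SRO-only questions correctly, at least 55 of the 75 RO-level questions must have been answered correctly, so the RO-portion score cannot fall below roughly 73%.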

Q11. What are the NRC criteria for level of difficulty (i.e., what is a 1, 2, 3, etc.) for written exam questions?

AND (Comment) Still no written guidance for LOD 1, 2, 3, 4, or 5. AND While LOD is a criterion for acceptability, there is no written criterion for evaluating LOD on a 1-5 scale.

Answer: A 1" is too easy and a 5" is too hard and is therefore unacceptable. A score of 2 to 4 is acceptable albeit subjective.

While level of difficulty (LOD) is quantified on a 5-point scale, the underlying basis for quantification nonetheless resides in subject matter expertise and judgment. Thus, the judgment that results in an item being determined either too easy or too hard is based upon examiner experience of how applicants have performed on similar items on past examinations and, consequently, how they are likely to perform on the item in question in the future. More guidance that better addresses LOK and LOD will appear in Revision 9.

Q12. How/where do you draw the line on what is considered an NLO question if the K/A match is high for ROs?

Answer: NRC staff has been accepting some of these questions if the K/A was truly generated from a random process. Similarly, system purpose, setpoint, and power supply questions have recently been accepted in light of the random generation process. However, from a balance of coverage viewpoint, chief examiners may challenge excessive use of the same type of question on the same initial exam.

Q13. Comment - The K/A catalog needs to be revisited to clarify the specific wording of K/As that contain the conjunction AND. The current wording makes it difficult to sort out what to ask.

Generally, the K/A catalog was not designed for exact K/A matches with respect to exam question writing.

AND (Comment) K/A Catalog needs significant revisions:

- GET, GFE, NCO, Setpoints, etc...

- Review selection criteria

- K/As not originally designed for exact K/A match (i.e., "AND" K/As?)

- SRO only K/As reviewed

- K/As and SAT process objectives not linked (Alternative: Randomly sample training objectives)

AND The K/A catalog needs to be updated to support site-specific SAT programs. Sample from site objectives, for example.

What, if anything, is being done with this type of comment?

AND The K/A catalog needs significant revisions. The basis for this is that a large number of K/As are either only tangentially related or do not make sense. The usual response is that we work with the chief examiner to replace those K/As. The point is that this is a workaround.

AND From the conversation on the written exam (good and bad examples), it becomes obvious that, to meet the standards for K/A match, LOD >1, pure random sampling, and SRO-only content per 10 CFR, the K/A Manual needs a revision very soon.

Dart board analogy -

- 5 years ago, a K/A match had to hit the dart board.
- 2 years ago, a K/A match had to hit a 5 of 10.
- Now, only a bull's-eye is acceptable, and the K/A Manual does not support a bull's-eye for 100 questions.

Answer: There is not currently a plan to revise the K/A catalog; however, there appears to be a significant amount of feedback on this topic, and the NRC staff will review recommendations. This area should be evaluated for priority via the NEI/NRC licensed operator task force.

It is important to note that the NRC staff does not agree with the dart board analogy. As early as 1998, in Information Notice 98-28, the NRC identified several instances in which written examination sample plans had not been developed systematically. The NRC also noted, prior to implementation of Supplement 1 to Revision 8 of NUREG-1021, that some sample plans were not randomly selecting K/As to the specific K/A statement level, e.g., K1.03 or A2.11.

These deficiencies contributed, in part, to some imprecision between the question and its referenced K/A statement. In other words, the problem was not the guidance but the failure of the authors and reviewers to adhere to the guidance. In fact, a 100-question written examination can be properly developed consistent with a random and systematic sample plan, with the K/A catalog as the reference, if NUREG-1021 guidance is properly followed. Recall that ES-401 allows for adjustments to the sample plan by systematically and randomly selecting replacement K/A statements.
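As a purely illustrative sketch (in Python) of the random, systematic selection principle described above -- the catalog excerpt, topic names, and K/A statement IDs below are hypothetical placeholders, not the actual catalog or the ES-401 sample plan process:

    import random

    # Hypothetical excerpt of a K/A catalog; real catalogs organize K/As
    # by tiers, groups, and categories. These IDs are placeholders only.
    KA_CATALOG = {
        "System A (hypothetical)": ["K1.01", "K1.03", "A2.11", "A4.02"],
        "System B (hypothetical)": ["K2.02", "K3.05", "A1.04"],
        "Generic (hypothetical)": ["2.1.7", "2.1.20", "2.1.33"],
    }

    def draw_sample_plan(catalog, n_items, seed=None):
        """Randomly select K/A statements down to the specific statement
        level (e.g., K1.03), without replacement, so the sample plan is
        not biased by the exam author's preferences."""
        rng = random.Random(seed)
        pool = [(topic, ka) for topic, kas in catalog.items() for ka in kas]
        return rng.sample(pool, n_items)

    def replace_statement(catalog, plan, index, seed=None):
        """Replace an unsuitable draw with another randomly selected
        statement, in the spirit of the ES-401 adjustment allowance."""
        rng = random.Random(seed)
        pool = [(topic, ka) for topic, kas in catalog.items() for ka in kas]
        remaining = [p for p in pool if p not in plan]
        plan[index] = rng.choice(remaining)
        return plan

    plan = draw_sample_plan(KA_CATALOG, n_items=4, seed=2004)
    for topic, ka in plan:
        print(topic, ka)

The point of the sketch is only that replacement draws remain random rather than hand-picked; it does not reproduce the actual tier and group weighting of the real sample plan.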

Q14. I recommend retaining the site-specific priorities option for the written exam from Rev. 8.

I believe this is important since we have significant equipment (SBO Diesel) and procedures (Rapid Down Power, Power Recovery with SBO Diesel, Severe Weather) that may not be in the K/A catalog and yet need to be eligible for testing.

If the random process is the concern here, we could simply add the topics to the ES-401 Outline before drawing tokens, including the topics in the random search.

AND (Comment) Recommend site specific priorities be restored in some capacity.

AND (Comment) K/A elimination guidance: In ES-401, Attachment 2.2, why can't invalid SRO-only K/As be pre-screened out, rather than being selected and then de-selected?

Answer: The plant-specific option was deleted to ensure the random selection process was used to the maximum extent possible and to ensure the process was not biased in any way. Now some want to use the option as a repository for good questions at the higher cognitive level that have K/A mismatches. This is contrary to proper random generation principles and results in examination bias. When the deletion was initially proposed, there was no objection from the NEI task force.

The option to add topics for random selection is viable and the details should be reviewed by the applicable regional office as if you were making a proposal for permanent de-selection of not applicable K/As.

The option to eliminate invalid SRO-only K/As by pre-screening is viable and the details should be reviewed by the applicable regional office as if you were making a proposal for permanent de-selection of not applicable K/As.

Q15. Suggestion - Have a question or two that has been re-written to a satisfactory level.

Answer: Agreed; we will attempt to do this at the Lancaster conference in June 2004.

Session 4 SIMULATOR BREAKOUT SESSION

This is a summary of the discussion issues at the session. The questions do not have answers, in order to allow the NRC staff to more fully develop the issues reflected in the questions. An additional interaction with the MANTG operations and simulator working group subcommittees is planned for February 12, 2004, and the answers will be published as a report on that session, along with additional questions from the follow-up conference.

Simulator Rule Summary - General Session - Questions for NRC Staff Consideration

Q1. Why is scenario-based testing of the simulator's performance a challenge?

Q2. What impact do computer upgrades and re-hosting have on performance tests?

Q3. Are simulator design specifications required to be updated?

Q4. Is the NRC rethinking how simulator performance and testing are being conducted?

Q5. What type of plant reference data is used when designing a plant-referenced simulator?

(i.e., is it acceptable to use plant procedures, as-built instrument and electrical prints, Licensee Event Reports, Technical Specifications, and the Final Safety Analysis Report?)

Simulator Rule Breakout Session - Questions for NRC Staff Consideration

Q6. What is actually required when documenting an SBT (scenario-based test)?

Q7. What is the periodicity for SBT? (i.e., how recently must the data have been verified?)

Q8. The Operator Requal Human Performance SDP (Significance Determination Process) makes no mention of the implementation of modifications. ANS-3.5 allows time for reference plant modifications to be simulated based on a training-needs analysis. What is the staff's position with regard to installing modifications on the simulator before they are installed in the reference plant?

Q9. How is simulator performance validated?

Q10. Will the staff determine whether or not a particular model is correct?

Q11. We have replaced some models with new models; what if a new model shows a different response than the old model (with regard to malfunctions, such as LOCAs and transients, for which there are no plant data)?

Q12. We are on the 1998 standard; how often do I need to validate the simulator's response?

Question for ANS-3.5 Working Group

Q13. What performance testing is needed if I change models only? Change platforms only?

Simulator Breakout Session - Group Task Questions With Responses

Breakout Group 1(2)(3) Task 1a: How did you qualify (initially and recurrently) your plant-referenced simulator's performance to support its intended use for training? For examining?

And, for providing experience for applicants?

Group Response: In general, participants discussed how their specific plant-referenced simulator was determined to be acceptable for its intended use. Participants recalled how the simulator's performance, for the most part, was checked out during the simulator manufacturer's factory acceptance testing (FAT) off-site, prior to delivery of the simulator to the customer. After delivery to the simulation facility on-site, the simulator was briefly retested by performing a limited number of mutually agreed upon factory acceptance tests to confirm the same test results as those obtained in the factory. Following the successful performance of these tests, along with the successful resolution of any agreed upon simulator corrective actions (i.e., hardware and software discrepancy reports, etc.), the simulator was determined to be "Ready for Training (RFT)." The primary reason for conducting these tests on-site prior to RFT was to ensure that no damage had occurred to the simulator as a result of shipping from the factory to the training site.

Participants explained that the simulator is routinely performance tested, or "qualified," on a periodic basis as prescribed by ANS-3.5, "Nuclear Power Plant Simulators for Use in Operator Training and Examination" (i.e., the 1998, 1993, or 1985 revision of the standard). In general, the group discussed how the standard provides specific requirements that a plant-referenced simulator must meet in order to possess a sufficient degree of scope (completeness) and fidelity (accuracy) to meet the needs of the industry and the requirements of the NRC as described in 10 CFR Part 55, "Operators' Licenses." 10 CFR 55 prescribes simulator requirements which allow the plant-referenced simulator to be used for meeting experience eligibility (55.31), for operating tests (55.45), and for licensed operator requalification training (55.59).

Breakout Group 1(2)(3) Task 1b: Does the simulator support evolutions and/or control manipulations identified in FSAR/USAR Chapters 14 and 15, plant Technical Specifications, and 10 CFR Part 55 with respect to training, examination, and applicant eligibility experience?

Group Response: The group participants discussed and recalled numerous examples of how their plant-referenced simulator supported normal evolutions, including control manipulations that affect reactivity or power level, that are described and discussed at length in a licensee's reference plant data materials such as the FSAR/USAR and technical specification bases.

Participants were not aware of any simulator that could not support training and the administration of operating tests. Whether a simulator could or could not support an applicant's experience eligibility with regard to control manipulations was discussed; this remained to be determined based on appropriate documentation of performance tests confirming that the nuclear and thermal-hydraulic characteristics replicate the most recent core load of the reference plant. Most simulation facilities are not using their plant-referenced simulator for meeting NRC applicant eligibility experience requirements at the present time but plan to do so when they have confidence in the simulator's core performance.

Breakout Group 1(2)(3) Task 2: How do you determine sufficiency of scope and fidelity for performance testing?

Group Response: Participants discussed various ways or methods for determining sufficiency of scope and fidelity for performance testing. In general, the group believed that as long as the plant-referenced simulator could support the conduct of plant procedures such as normal operations, abnormal operations, emergency procedures, and expected or unexpected plant transients, the simulator's scope and fidelity were sufficient and could be adequately performance tested. Additionally, the group acknowledged that use of actual plant data, as well as data obtained from equipment or system technical manuals, piping and instrument diagrams, wiring diagrams, and best-estimate engineering data, is considered when performance testing the simulator. The group discussed how important it is that there be good agreement between the actual reference plant's performance and the plant-referenced simulator's performance. The group also discussed the merits of having the same performance acceptance criteria applied to the actual reference plant applied to the simulator, except as limited by design (i.e., simulator simplifications and assumptions in its final design specifications).

The Group pointed out how the ANS-3.5 standards govern, for the most part, the degree of performance testing as well as the adequacy of scope and fidelity to ensure support for the use of the simulator for operator training and examinations. The Group also discussed how a simulation facility's corrective action program contributes to determining sufficiency of scope and fidelity, in that resolutions are often bounded by performance testing of the corrective actions.

The Group discussed how the regulations in 10 CFR 55 establish a minimum threshold for scope and fidelity, so that a plant-referenced simulator must demonstrate expected plant response to operator input and to normal, transient, and accident conditions to which the simulator has been designed to respond, and must allow for the conduct of evolutions described in 55.45 (operating tests), 55.59 (requalification program), etc. Finally, the Group talked about the use of feedback from operators, instructors, and subject matter experts (such as a simulator oversight committee) to help identify scope and fidelity issues.

Breakout Group 1(2)(3) Task 3: What constitutes an acceptable performance test and acceptable test results? (Consider the following attributes: test objective, technical content, acceptance criteria, repeatability, comparisons to the plant, test periodicity, test results, and documentation.)

Group Response: The Group participants discussed at length various approaches to acceptable performance tests and how test results are evaluated. The Group discussed some of the attributes mentioned in the stem of the question. For example, the Group talked about using the same criteria that are in the ANS-3.5 standard, such as comparing the simulator with the reference plant during plant events. The Group also discussed how what is considered acceptable performance test documentation for one licensee may or may not be acceptable to another licensee. In general, the Group acknowledged that more dialog is needed to better understand what is expected for acceptable performance tests and test results. It was pointed out that plant performance procedures have been used successfully for many years and that similar approaches for simulator performance test procedures may be worth checking into, because actual plant procedures have very well established protocols for documenting performance and evaluating results -- for example, step-by-step criteria that must be performed or observed, values recorded, and test results evaluated as satisfactory or not. The Group agreed that simulator performance tests should be repeatable from one test to the next. The Group discussed scenario-based performance testing, with no consensus being reached on what constitutes a good performance test. The Group requested that the NRC provide more guidance in this area.
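As a purely illustrative sketch (in Python) of the step-by-step record-and-evaluate protocol mentioned above -- the test steps, parameters, values, and acceptance bands are invented placeholders, not actual plant or simulator data:

    from dataclasses import dataclass

    @dataclass
    class TestStep:
        """One step of a hypothetical simulator performance test: a
        parameter to observe, the recorded value, and an acceptance band,
        mirroring how plant performance procedures document results."""
        description: str
        parameter: str
        recorded_value: float
        low_limit: float
        high_limit: float

        def satisfactory(self) -> bool:
            # A step is SAT when the recorded value falls within the band.
            return self.low_limit <= self.recorded_value <= self.high_limit

    # Invented example steps; values and limits are placeholders.
    steps = [
        TestStep("Trip main feed pump; observe SG level response",
                 "SG level (%)", 42.0, 38.0, 46.0),
        TestStep("Verify pressurizer pressure after the transient",
                 "PZR pressure (psig)", 2230.0, 2200.0, 2260.0),
    ]

    for step in steps:
        result = "SAT" if step.satisfactory() else "UNSAT"
        print(step.description, "->", result)

Because each step carries explicit acceptance criteria, re-running the same steps from the same initial conditions yields a repeatable, documentable test, which is the attribute the Group identified as desirable.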

Breakout Group 1(2)(3) Task 4: What makes an effective simulator corrective action program?

(Consider the following attributes: discrepancy identification, prioritization, and resolution.)

Group Response: In general, the group participants discussed the attributes mentioned in the stem of the question. Issues such as how simulator discrepancies are identified, prioritized, and resolved were discussed. The group pointed out that most discrepancies are identified through operator feedback from training received on the simulator, as well as by subject matter experts such as simulator instructors. The group also pointed out that the management and configuration process seems to be doing a good job in the tracking and ultimate resolution of known problems. Several members of the group mentioned that resources are, for the most part, limited when it comes to putting in enhancements or modifications that are in the reference plant but not yet implemented on the simulator. One problem area noted by the group is that there appear to be different thresholds among simulator users for initiating discrepancies. For example, it was noted that most discrepancies are identified by proactive operators who have extensive plant experience, can readily pick up on any differences between the plant and the simulator, and take the time to write up the deviation report(s). Overall, the group believed the simulator corrective action programs are adequate and serve their intended purpose.