| ML20234D068 | |
|---|---|
| Issue date: | 12/31/1987 |
| From: | Gore B, Laughery K, Plott C, Wachtel J, Office of Nuclear Reactor Regulation |
| To: | |
| References: | NUREG-1258, NUDOCS 8801060431 |
| Download: | ML20234D068 (141) |
Evaluation Procedure for Simulation Facilities Certified Under 10 CFR 55

Final Report

U.S. Nuclear Regulatory Commission
Office of Nuclear Reactor Regulation

J. Wachtel, C. Plott, K. R. Laughery, B. F. Gore
NOTICE

Availability of Reference Materials Cited in NRC Publications

Most documents cited in NRC publications will be available from one of the following sources:

1. The NRC Public Document Room, 1717 H Street, N.W., Washington, DC 20555

2. The Superintendent of Documents, U.S. Government Printing Office, Post Office Box 37082, Washington, DC 20013-7082

3. The National Technical Information Service, Springfield, VA 22161

Although the listing that follows represents the majority of documents cited in NRC publications, it is not intended to be exhaustive.

Referenced documents available for inspection and copying for a fee from the NRC Public Document Room include NRC correspondence and internal NRC memoranda; NRC Office of Inspection and Enforcement bulletins, circulars, information notices, inspection and investigation notices; Licensee Event Reports; vendor reports and correspondence; Commission papers; and applicant and licensee documents and correspondence.

The following documents in the NUREG series are available for purchase from the GPO Sales Program: formal NRC staff and contractor reports, NRC-sponsored conference proceedings, and NRC booklets and brochures. Also available are Regulatory Guides, NRC regulations in the Code of Federal Regulations, and Nuclear Regulatory Commission Issuances.

Documents available from the National Technical Information Service include NUREG series reports and technical reports prepared by other federal agencies and reports prepared by the Atomic Energy Commission, forerunner agency to the Nuclear Regulatory Commission.

Documents available from public and special technical libraries include all open literature items, such as books, journal and periodical articles, and transactions. Federal Register notices, federal and state legislation, and congressional reports can usually be obtained from these libraries.

Documents such as theses, dissertations, foreign reports and translations, and non-NRC conference proceedings are available for purchase from the organization sponsoring the publication cited.

Single copies of NRC draft reports are available free, to the extent of supply, upon written request to the Division of Information Support Services, Distribution Section, U.S. Nuclear Regulatory Commission, Washington, DC 20555.

Copies of industry codes and standards used in a substantive manner in the NRC regulatory process are maintained at the NRC Library, 7920 Norfolk Avenue, Bethesda, Maryland, and are available there for reference use by the public. Codes and standards are usually copyrighted and may be purchased from the originating organization or, if they are American National Standards, from the American National Standards Institute, 1430 Broadway, New York, NY 10018.
Evaluation Procedure for Simulation Facilities Certified Under 10 CFR 55

Final Report

Manuscript Completed: December 1987
Date Published: December 1987

J. Wachtel, C. Plott*, K. R. Laughery*, B. F. Gore**

*Micro Analysis & Design
**Pacific Northwest Laboratory

Division of Licensee Performance and Quality Evaluation
Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission
Washington, DC 20555
ABSTRACT

This document describes the procedure to be followed by the NRC for the inspection of simulation facilities certified by facility licensees under 10 CFR 55. Inspections are divided into four major areas based on the types of evaluations conducted: 1) performance testing; 2) physical fidelity/human factors; 3) control capabilities; and 4) design, updating, modification, and testing. NRC staff representing several disciplines, including license examiners, operations specialists, and human factors experts, under the direction of a team leader, will perform these inspections.

A simulation facility inspection may include off-site and on-site phases. The off-site phase will consist of an examination of the simulation facility documentation and an identification of those operations and procedures which may be considered for use in performance testing during the on-site phase. In the on-site review, the staff will work closely with the facility licensee to conduct a sound and fair inspection and to evaluate the results of tests that are conducted.

Inspection findings will be based upon the staff's judgment of the simulation facility's compliance with 10 CFR 55.45. Findings may range from "no adverse impact on the conduct of operating tests" through degrees of "adverse impact requiring correction" to "adverse impacts so serious that the simulation facility may not be used in the conduct of operating tests until the discrepancies are corrected and the simulation facility is recertified to the NRC."
TABLE OF CONTENTS

ABSTRACT
FOREWORD
1 INTRODUCTION
  1.1 Purpose
  1.2 Background
  1.3 Scope of Inspection
    1.3.1 Performance Testing
    1.3.2 Physical Fidelity/Human Factors
    1.3.3 Control Capabilities
    1.3.4 Design, Updating, Modification, and Testing
  1.4 Overview of the Procedure
2 THE ROLE OF THE NRC STAFF IN THE CONDUCT OF SIMULATION FACILITY INSPECTIONS
3 THE OFF-SITE PHASE
  3.1 Review of Available Data
    3.1.1 Performance Testing
      3.1.1.1 The Standard
      3.1.1.2 NRC Form 474
      3.1.1.3 Operator Licensing Examination Guidance
      3.1.1.4 Reference Plant and Similar Plant Operating History
      3.1.1.5 Operating Procedures
      3.1.1.6 Cognizant Individuals
      3.1.1.7 Steps for Identifying Operations for Performance Testing
      3.1.1.8 Additional Considerations
      3.1.1.9 Data To Be Requested
      3.1.1.10 Products
    3.1.2 Physical Fidelity/Human Factors
      3.1.2.1 Panel Simulation
      3.1.2.2 Instrument and Control Configuration
      3.1.2.3 Ambient Environment
      3.1.2.4 Products
    3.1.3 Control Capabilities
      3.1.3.1 Products
    3.1.4 Design, Updating, Modification, and Testing
      3.1.4.1 Products
  3.2 Request Information From the Facility Licensee
  3.3 Review of the Data Obtained From the Facility Licensee
  3.4 Determine if the On-Site Review Will Be Conducted
  3.5 Minimizing the Burden on the Facility Licensee
4 THE ON-SITE REVIEW
  4.1 Preparation for the On-Site Review
    4.1.1 Performance Testing
      4.1.1.1 Steps for Developing the Performance Tests
    4.1.2 Physical Fidelity/Human Factors
      4.1.2.1 Panel Simulation
      4.1.2.2 Instrument and Control Configuration
      4.1.2.3 Ambient Environment
    4.1.3 Control Capabilities
    4.1.4 Design, Updating, Modification, and Testing
    4.1.5 Products
  4.2 General Course of Events for the On-Site Review
    4.2.1 Activities for Each Day
    4.2.2 Analysis of Results and Staff Response
  4.3 Data Collection
    4.3.1 Performance Testing
    4.3.2 Physical Fidelity/Human Factors
      4.3.2.1 Panel Simulation
      4.3.2.2 Instrument and Control Configuration
      4.3.2.3 Ambient Environment
    4.3.3 Control Capabilities
    4.3.4 Design, Updating, Modification, and Testing
5 EVALUATION CRITERIA
  5.1 Performance Testing
    5.1.1 Evaluation of Parameters Measured
    5.1.2 Evaluation of the General Performance of the Simulation Facility
  5.2 Physical Fidelity/Human Factors
    5.2.1 Panel Simulation Evaluation
    5.2.2 Instrument and Control Configuration Evaluation
    5.2.3 Ambient Environment Evaluation
  5.3 Control Capabilities
  5.4 Design, Updating, Modification, and Testing
  5.5 Known Discrepancies
  5.6 Assessing the Results of the Evaluations
REFERENCES
APPENDIX A - SIMULATION FACILITY PERFORMANCE TEST FORM
APPENDIX B - I&C DATA COLLECTION FORMS
APPENDIX C - TEST OF THE METHODOLOGY FOR THE SIMULATION FACILITY EVALUATION PLAN
GLOSSARY
INDEX
FOREWORD

On February 12, 1987, the Commission approved major revisions to 10 CFR Part 55, "Operators' Licenses," and Regulatory Guide 1.149, "Nuclear Power Plant Simulation Facilities for Use in Operator Licensing Examinations." The regulation, which became effective May 26, 1987, specifies two acceptable types of "simulation facilities" for the conduct of operating tests:

- 1) a plant-referenced simulator that meets the requirements of ANSI/ANS 3.5-1985 as endorsed by Regulatory Guide 1.149, and
- 2) a simulation facility, other than a plant-referenced simulator, which is approved by the NRC after application for approval has been submitted by the facility licensee.

Facility licensees who intend to use a plant-referenced simulator have up to 46 months from the effective date of the regulation to certify this plant-referenced simulator to the NRC. Facility licensees who intend to seek approval for other than a plant-referenced simulator are required to submit, within one year after the effective date of the regulation, a plan for the development of this simulation facility.

The Statement of Considerations published as background information to the revisions to 10 CFR Part 55 states that the guidance to be used by the Commission in its inspection of simulation facilities would be made publicly available six months prior to use. That commitment was met with the issuance, in draft form for comment, of NUREG-1258, "Evaluation Procedure for Simulation Facilities Certified Under 10 CFR 55," on April 18, 1987. The final Simulation Facility Evaluation Procedure (SFEP) contained herein has been revised as a result of comments received, and implementation will begin immediately.

As stated within the document, these procedures are applicable to certified simulation facilities. The inspection procedures for simulation facilities not certified under 10 CFR 55.45(b)(5) will be developed on a case-by-case basis, but will rely on this document to the extent possible.
1 INTRODUCTION

1.1 Purpose

This document describes the procedure to be used by the NRC for the inspection of simulation facilities which have been certified by facility licensees under 10 CFR 55.45(b). NRC's objective is to communicate to the U.S. commercial nuclear power industry the methods which will be employed and the data which may be examined by the NRC to ensure a simulation facility's adequacy for use in the conduct of operating tests.

1.2 Background

Paragraph 55.45, "Operating Tests," of 10 CFR Part 55, "Operators' Licenses" (the Rule), requires that an applicant for an operator or senior operator license demonstrate both an understanding of and the ability to perform certain essential job tasks. It specifies that this demonstration will be done through the administration of operating tests in a plant walkthrough and in a simulation facility. The simulation facility may be one which consists solely of a plant-referenced simulator certified to the Commission by the facility licensee, or it may be one which has been approved by the Commission after application for such approval has been made by the facility licensee.

NRC Regulatory Guide 1.149 supplements 10 CFR 55 by identifying one acceptable method for a facility licensee to use in certifying or applying for approval of a simulation facility. Regulatory Guide 1.149 endorses ANSI/ANS 3.5-1985 (the Standard) with certain exceptions. By following the guidance presented in Regulatory Guide 1.149 and the Standard, a facility licensee will be able to certify a simulation facility against the requirements of 10 CFR 55.

Once a certification or an application for approval has been made by the facility licensee, the Commission may, at any time, inspect the simulation facility and/or the documents and records associated with it or with its certification or application for approval. The Simulation Facility Evaluation Procedure contained herein applies to those simulation facilities which have been certified to the NRC by the facility licensee under 10 CFR 55.45(b)(5) as plant-referenced simulators. Procedures for the inspection of simulation facilities for which application for approval has been made to the NRC by the facility licensee under 10 CFR 55.45(b)(4) will be handled on a case-by-case basis and, to the extent applicable, will follow this procedure.
1.3 Scope of Inspection

As discussed above, the NRC will conduct simulation facility inspections against the requirements of 10 CFR Part 55, using the criteria provided in ANSI/ANS 3.5-1985 as endorsed by Regulatory Guide 1.149. This procedure is divided into four major areas based on the type of inspection to be conducted. These are:

- Performance Testing
- Physical Fidelity/Human Factors
- Control Capabilities
- Design, Updating, Modification, and Testing

The scope of each of these areas is described below. This procedure imposes no requirements, acceptance criteria, or staff positions. Rather, it cites the appropriate source documents for all regulatory requirements.
1.3.1 Performance Testing

The purpose of performance testing is to verify that the dynamic behavior of the simulation facility adequately represents that of the reference plant. To do this, a variety of normal, abnormal, and emergency operations may be tested. Since only limited time will be available for the conduct of performance tests, only a small number of such tests will actually be run. To the extent possible, operations will be selected which: 1) are appropriate to the simulation facility and its reference plant, 2) are relevant to operator licensing examinations, and 3) offer actual reference plant data as a baseline for simulation facility comparison.

The following steps will be taken to narrow the scope of the operations which may be selected for use in performance testing.

First, those operations not required for certification will be eliminated. The normal operations and malfunctions given in Sections 3.1.1 and 3.1.2 of the Standard delineate the scope of performance tests required for certification of the simulation facility. In addition, NRC Form 474 requires the facility licensee to identify the subset of operations and malfunctions for which the simulation facility has been certified. This narrows the scope of performance testing to those operations and malfunctions which are appropriate for the given simulation facility.
Second, the focus will be on the simulation facility's acceptability as a tool for conducting operator licensing examinations. This will limit performance testing to those operations which may be included in an operator licensing examination. Examiner Standards ES-301 and ES-302 give the procedures for developing and administering operator licensing examinations. These documents provide guidance for determining the number and variety of operations to be included in a licensing examination. This guidance will be used for the selection of operations for performance testing which are representative of those included in licensing examinations.
Third, to the extent possible, operations and events will be selected which have actually occurred at the reference plant or in similar plants. This will help to ensure that the operations selected are legitimate candidates for operator licensing examinations, and that actual reference plant data for these operations are available.

The operations and malfunctions to be selected based on their occurrence at the reference plant or a similar plant will be identified, in part, through the use of Licensee Event Reports (LERs) for those plants. LERs give a brief description of unusual or unexpected events which have occurred in the plants, and describe the state of the plant prior to the event's occurrence. While not all LERs will be relevant, there will be many which are appropriate for use in performance tests of the simulation facility.
The operations and malfunctions identified from the review of the LERs, while useful, will probably not cover all areas which must be considered in conducting an operator licensing examination. For example, while it is likely that LERs will address operations which require the use of normal and abnormal operating procedures (including component and instrument failures), it is less likely that operations which require the use of emergency procedures will be addressed. It is essential, for the conduct of a license examination, that the simulation facility permit a candidate to mitigate the consequences of an event using the reference plant's Emergency Operating Procedures (EOPs). It must also be possible for the candidate to employ the reference plant's normal and abnormal operating procedures as required. Indeed, use of reference plant procedures is part of the definition of a "plant-referenced simulator" in the Rule. Given this and the limitations of the LERs mentioned above, reference plant EOPs and other operating procedures may be used as a basis for the selection of operations and malfunctions for performance testing. In addition, the guidance given by the Standard may be used for the selection of these higher risk, low probability operations and malfunctions.

In summary, the scope of the performance tests to be conducted will be narrowed based on:

- the requirements of the Standard
- the limitations indicated by NRC Form 474
- the needs of an operator licensing examination
- the operations identified from the reference plant and similar plant operating histories including LERs, and
- the reference plant operating procedures

1.3.2 Physical Fidelity/Human Factors

Physical fidelity/human factors addresses the comparability of the simulation facility and the reference plant in the areas of panel simulation, instrument and control configuration, and the ambient operating environment. The criteria given in the Standard for these areas indicate that, in general, there must be no differences between the reference plant's control room and that of the simulation facility. Deviations may exist if they do not detract from an examiner's ability to evaluate a candidate's performance during a licensing examination.

The Standard gives little further guidance or criteria for the evaluation of these concerns. Thus, to ensure that NRC inspections are performed in a consistent and comprehensive manner, questions have been developed for use by the NRC staff (the staff). They are presented in Section 5.2 of this procedure. These are objective ("yes" or "no") questions, enabling the inspection to be straightforward. A "no" response to a question does not necessarily indicate a discrepancy. Such responses will be evaluated, singly and as a group, for their impact on the conduct of an operator licensing examination. This systematic process ensures that the scope of the physical fidelity/human factors evaluation is within the limits established by the Standard.
1.3.3 Control Capabilities

Control capabilities are those features which allow the simulation facility operator to direct and monitor the operation of the simulation facility. This includes the ability to:

- produce a variety of initial conditions and malfunctions
- freeze the simulation
- represent the actions of auxiliary or remote operators
- detect or determine when the simulation has gone beyond plant or simulation facility design limits, and
- monitor and record critical parameters

The Standard specifies requirements for these features, and these specifications have been directly incorporated into this procedure.
1.3.4 Design, Updating, Modification, and Testing

Review of the simulation facility's design data, data updating, modification, and testing is done to ensure that the configuration of the simulation facility is kept current with that of the reference plant. This is done by inspecting selected reference plant and simulation facility design, modification, and testing records to confirm that the simulation facility is being kept up to date. The nature of these reviews and the schedules for conducting them are given in the Standard and in the Rule, and are simply restated for this procedure.
1.4 Overview of the Procedure

The wide variations in concept, design, and operation of simulation facilities make it impossible to delineate a precise approach applicable in all cases. The procedure described herein is intended to be applied as appropriate for the collection of the information necessary to judge a simulation facility's acceptability in accordance with the Rule.
There are three circumstances which may lead to an NRC inspection of a simulation facility.

1. The NRC may conduct periodic inspections on a random basis.

2. During the preparation for or the conduct of operator licensing examinations, an NRC examiner may learn about or encounter unexpected behavior from a simulation facility. Once a simulation facility has been certified, or an application for its approval has been submitted, examiners will report any apparent deficiencies and the context in which they were encountered. Such examiner reports may lead to a simulation facility inspection.

3. Incomplete or questionable data submitted by the facility licensee in support of its Form 474 certification could lead to an inspection.
A simulation facility inspection may consist of off-site and on-site phases. The off-site phase involves a review of performance test and other documentation for the simulation facility. Additionally, the off-site phase will serve to identify those operations and procedures which may be considered for use in on-site performance testing, and features of other areas to be evaluated during the on-site phase.

Upon completion of the off-site phase, the inspection may be concluded. If the on-site phase is to be performed, data appropriate for the development of the performance tests may be requested from the facility licensee. Identification of the appropriate data is best accomplished on site prior to the actual conduct of the inspection. All or part of the four areas of review described in Section 1.3 may be included in the on-site phase at the discretion of the NRC staff.

Once on site, the staff will work closely with the facility licensee to make any necessary modifications to the performance tests. This will help to ensure that the performance tests to be conducted are both sound and fair. The other areas to be inspected may also be modified as needed. By incorporating facility licensee input into the performance tests prior to the actual data collection and evaluation, any potential weaknesses in the methodology and baseline data used for the review will be minimized. This will allow the results of the review to stand on their own merit.

Operation of the simulation facility or demonstrations of reference plant functions (e.g., annunciator test, auditory signals) will be performed by facility licensee personnel. The staff will be present to direct and observe the test, and to record and analyze data.

When data collection is complete, the staff will prepare a preliminary evaluation of the results. The staff will then discuss its preliminary findings with the facility licensee, who will be given the opportunity to provide additional information related to these findings.

The final results will be documented in an NRC inspection report.
2 THE ROLE OF THE NRC STAFF IN THE CONDUCT OF SIMULATION FACILITY INSPECTIONS

The staff will conduct all simulation facility inspections and will make findings with respect to a simulation facility's compliance with the regulations. Staff members who perform such inspections will represent an interdisciplinary group of technical areas, including license examination, operations, and human factors.

A lead staff member will be responsible for overall coordination of simulation facility inspections including: serving as liaison between the NRC and facility licensees; informing the parties involved in an inspection of their individual responsibilities, and coordinating their efforts; scheduling all aspects of the inspection; identifying and resolving any problems encountered during the course of an inspection; reporting the progress and findings of the inspection to the facility licensee and the NRC; and providing information about the staff's findings for any needed follow-on activities.

The license examiner will support all phases of this procedure by providing expertise in the areas of the licensing examination process, the job of the license candidate, and the use of control room procedures.

The operations specialist will provide plant operations expertise to the procedure. Responsibilities will include: review of the performance tests performed by the facility licensee; development and evaluation of the performance tests performed by the staff; data collection and evaluation for the simulation facility control capabilities, design, updating, modification, and testing; and participation in other areas of the procedure as required.

The human factors specialist will support the procedure by collecting data and evaluating the simulation facility environment, degree of panel simulation, and instrument and control characteristics, and he or she will participate in all other areas of the procedure as required.

An observer from another facility and/or the Institute of Nuclear Power Operations (INPO) may participate in the staff inspections of simulation facilities, provided that the facility licensee whose simulation facility is being evaluated does not object. The staff believes that an observer would facilitate communication about the nature and process of inspections. The observer will work with the staff, under the direction of the lead staff member, in the collection and analysis of simulation facility data, but will not perform evaluations or participate in staff decision-making regarding the findings of inspections.
3 THE OFF-SITE PHASE

Once the NRC decides to conduct an inspection of a simulation facility, the staff members who will perform the inspection will be selected and the off-site review will begin.

3.1 Review of Available Data

The first step in the off-site review is to examine the information about the simulation facility which the staff has at its disposal as a result of a Form 474 certification, examiner feedback, or other sources. The type of information to be examined and its usefulness to the review are described here for each of the four major areas of evaluation.

3.1.1 Performance Testing

Performance testing is more suitably evaluated during the on-site review. Some aspects of simulation facility performance can be evaluated during the off-site review, however. The staff will use the off-site review phase to begin the selection of the operations and malfunctions to be included in the performance tests during the on-site review. As discussed in Section 1.3, "Scope of Inspection," performance testing will emphasize those operations which:

- are required for certification as set forth in the Rule and on NRC Form 474
- are relevant to licensing examinations
- have actually occurred in the reference plant or similar plants (where feasible), and
- make use of the reference plant's operating procedures.

The guidance for performing the off-site evaluation and for selecting operations within these limitations is given below.

3.1.1.1 The Standard

Using the listing of normal operations given in Section 3.1.1 of the Standard, the listing of plant malfunctions given in Section 3.1.2 of the Standard, and the guidance given in Appendix B of the Standard, the staff will identify those operations and malfunctions which are appropriate to the reference plant under consideration.
.l 3.1.1.2 NRC Form 474 The facility licensee's submittal on NRC Form 474 will be examined to ensure that it meets the requirements for j
certification of a simulation facility consisting solely of j
a plant-referenced simulator in accordance with 10 CFR Part 1
55.45.
The following will be confirmed:
1.
That the list of normal operations, steady-state 1
operations, transient operations, and malfunctions for which the simulation facility has been certified encompasses all of the functions for a plant-referenced simulator, for the applicable plant type, required by the Rule and identified in Sections 3.1.1 and 3.1.2, and Appendiv.B of the Standard.
2.
That the listed operations and malfunctions have been incorporated into the performance and 1
operability test cycle as required by 10 CFR 55.45(b)(5) in accordance with the guidance in Regulatory Guide 1.149 (unless the facility licensee has proposed an alternate method to that given in Regulatory Guide 1.149).
l The staff will identify any missing or unscheduled operations and malfunctions, and may request additional information about them from the facility licensee.
The facility licensee's submitted Form 474 chould contain a listing of the performance tests conducted in support of its certification.
While these must address all of the requirements for certification, it is possible that they will not reflect all possible combinations of operations, malfunctions and conditions which may actually occur in the i
reference plant and of which the simulation facility may be capable.
The staff will use the information given in NRC Form 474 as an indication of the types of operations, malfunctions and testing conditions which would be reasonable to include in its inspection.
By cross referencing the operations and malfunctions identified as appropriate by the staff based on the Rule and the Standard against those identified in NRC Form 474, the scope of operations and malfunctions available for testing will be identified.
Although the staff is not limited to evaluating only those tests conducted for certification by the facility licensee as given on NRC Form 474, some of these tests might be reviewed to give an indication of the scope and quality of the facility licensee's testing program.
For example, by reviewing the facility licensee's performance test documentation, the staff will be able to assess the basic test procedure, the parameters monitored, the data used as a baseline, and the criteria used for determining performance acceptability.
The selection of a performance test (or tests) to be reviewed will be based on staff judgment, and the facility licensee may be requested to provide the details for a "representative test" that it has conducted.
3.1.1.3 Operator Licensing Examination Guidance

The guidance contained in Examiner Standard ES-302 for developing operator licensing examinations will be used to determine the approximate number and relative proportions of operations to be included in the performance tests.
Table 1 presents the number of each type of operation and malfunction for which information will be requested during the off-site review, and for which tests will be conducted during the on-site review.

Table 1
Type and Number of Operations To Be Selected for Performance Testing

                                     Maximum Number       Maximum Number
Type of                              Selected for the     Selected for the
Operation                            Off-site Review      On-site Review

Normal operations                           3                    2
Abnormal operations                         9                    6
  (including instrument and
  component failures)
Emergency operations                        3                    2

Total                                      15                   10

Consideration will also be given to selecting a range of operations within each operation type, as well as a variety of operating conditions for the performance testing.
For example, operations which affect reactor power control, the condensate system, the turbine, and electrical distribution may be selected to obtain breadth. Tests of the same system at two or three different power levels may be selected to satisfy testing under a variety of plant conditions.
3.1.1.4 Reference Plant and Similar Plant Operating History

Because the staff will be able to observe performance tests for only a very limited number of operations and malfunctions, it is most practical to select those malfunctions and operations which are capable of being directly evaluated against actual plant data when possible, and which fall within the scope of operator licensing examinations.
Reviewing the operating histories for the reference plant and similar plants will allow the staff to identify operations and malfunctions which have actually occurred and for which plant data should be available.
A primary source of these data will be the Licensee Event Reports (LERs) for the reference plant and similar plants.
LERs describe abnormal and emergency operating events which have occurred at the plant.
They provide a summary of the nature of the event, the components and instruments involved, and the general plant conditions at the time the event occurred.
There are often more extensive data collected after an LER-reportable event (e.g., data collected for post-trip reviews in accordance with Generic Letter 83-28, " Required Actions Based on Generic Implications of Salem ATWS Events," July 8, 1983).
In addition, summaries of LERs are available to the staff.
Although many LERs will be good sources of operating event data, most of them will not be relevant to evaluation of simulation facilities.
Therefore, the staff will use the guidance given elsewhere in this section when reviewing them.
As many of the operations and malfunctions as possible will be selected from the reference plant LERs.
Similar-plant LERs will be used when there is insufficient operating history for the reference plant, or when a similar-plant LER is judged to be especially relevant.
When LERs from similar plants are being considered, the primary bases for determining an LER's applicability will be the significance of the test and the similarity between the systems involved in the similar plant and those in the reference plant.
3.1.1.5 Operating Procedures

To conduct a licensing examination, examiners must be able to evaluate a license candidate's use of the reference plant's operating procedures.
This is particularly true for emergency operations since the Emergency Operating Procedures (EOPs) may be the operator's only guidance.
Further, since it is unlikely that the candidate will have had actual experience with many plant emergency conditions, the candidate's ability to use plant procedures will be an important basis for evaluation during the examination. Thus, it is important that the simulation facility allow these procedures to be used as they would be used in the reference plant.
As discussed above, the review of the reference plant's operating history probably will not provide all of the operations required for the performance tests. The operating procedures provide the logical complement for completing the set of operations to be evaluated.
Other plant procedures in addition to EOPs will also be considered.
For example, surveillance procedures make good performance tests.
They are straightforward and reference plant data are available for comparison.
Startup test procedures will also be considered. These procedures have reference plant data and acceptance criteria associated with them, and they are cited in the Standard for use in evaluating transient data (Section 4.2.1).
Other procedures may be used as appropriate.
3.1.1.6 Cognizant Individuals

Individuals available to the staff who are knowledgeable about the operating characteristics of the reference plant and/or the simulation facility can identify operations and malfunctions to be considered for use in performance tests. These individuals may include license examiners who have administered examinations at the simulation facility. Their guidance may be useful in identifying problem areas at the simulation facility; ensuring that the operations selected are relevant to operator licensing examinations; and identifying operations for which reference plant data are available.
3.1.1.7 Steps for Identifying Operations for Performance Testing

The result of the activities described above will be a candidate set of operations and malfunctions for use in performance testing. The basic steps for making this selection are summarized below. This is not a strict sequence of events, and many of the steps may be conducted in parallel:

1. Identify the operations and malfunctions given in the Rule and the Standard which are applicable to the reference plant.

2. Review the facility licensee's Form 474 for completeness.

3. Consider the potential simulation facility limitations indicated by the operations and malfunctions given on Form 474.

4. Review the LERs for the reference plant and for similar plants for appropriate operations and malfunctions.

5. Identify operations and events for the performance tests which are based on the operating procedures.

6. Contact cognizant individuals regarding the selection of operations and malfunctions and potential sources of reference plant data.

7. Apply the guidance related to operator licensing examinations for selecting the number and variety of operations to include.

8. Select the subset of operations and malfunctions for which information will be requested from the facility licensee.
3.1.1.8 Additional Considerations

In the course of determining which operations and malfunctions are candidates for performance testing, the following additional issues will be considered.

1. Due to the limited on-site time available for conduct of the performance tests, exceptionally time-consuming operations will be avoided. Parts of such operations may be useful, however.

2. Those operations and events which involve actions and/or decisions on the part of the operator(s) are more likely to be candidates for licensing examinations, and therefore will receive greater emphasis.

3. Operations or malfunctions associated with any reported inappropriate or unexpected behavior from the simulation facility will be considered for inclusion in the performance test.
3.1.1.9 Data To Be Requested

Once the operations and malfunctions to be considered for performance testing have been selected, the staff will identify the data associated with them, and will request those data from the facility licensee if such data are not already available to the NRC. The staff will use the data to develop performance tests for the on-site review. An on-site visit by one or more members of the staff may be required for the collection of the data.
The following data sources are candidates:

1. Control room normal, abnormal, and emergency operating procedures.

2. Listings of differences, if any, between the procedures used in the reference plant and those used in the simulation facility.

3. Descriptions of selected performance tests conducted by the facility licensee in support of certification of the simulation facility.

4. Summaries of reference plant data obtained as a result of events associated with LERs.

5. Procedures and data from selected reference plant startup tests.

6. Relevant simulator exercise guides.

7. Relevant simulator acceptance test procedures.
3.1.1.10 Products

The products of this portion of the off-site review will be:

1. A listing of any relevant operations and malfunctions which were not addressed by NRC Form 474.

2. A listing of candidate operations and malfunctions being considered for performance testing.

3. A listing of the data to be requested from the facility licensee.
3.1.2 Physical Fidelity/Human Factors

As discussed in Section 1.3, "Scope of Inspection," physical fidelity/human factors addresses the comparability of the simulation facility and reference plant in the areas of panel simulation, instrument and control configuration, and ambient environment.
Each of these areas was addressed in detail for the reference plant during the Control Room Design Review (CRDR). This review addressed a variety of control room features including: control room and panel layout, displays, controls, all aspects of the control room environment, and differences between the physical configuration of the simulation facility and of the reference plant. Human Engineering Discrepancies (HEDs) were prepared as required and submitted to the NRC in a CRDR Final Report.
These HEDs may be useful for identifying problem areas or changes that were made to the reference plant's control room.
In conducting the CRDR, it was necessary for the utility to collect data on lighting, the auditory environment, and signal coding. These data may also have been included in the CRDR Final Report and can be of use to the staff.
It should be noted that the CRDR will become less useful over time since the simulation facility will undergo modifications to maintain its fidelity to the reference plant.
The staff may meet with cognizant individuals who can provide guidance for the physical fidelity/human factors review. These individuals may include license examiners who have conducted operator licensing examinations at the simulation facility, and those staff members who evaluated the CRDR data. Other data sources will be considered as necessary.
3.1.2.1 Panel Simulation

The area of panel simulation concerns the layout of panels within the control room, the layout of instruments and controls on the panels, and the use and application of information and localization aids. Information and localization schemes used in the reference plant control room may include background shading, demarcation, mimics, hierarchical labeling, color coding, and shape coding.
From the review of the available data, the staff may ask the facility licensee to provide drawings or photographs of portions of the control room and its panels.
3.1.2.2 Instrument and Control Configuration

For the purpose of a simulation facility inspection, instrument and control (I&C) configuration is concerned with the physical appearance and operation of the displays and controls on the boards. Examples include the range and units displayed on meters, and the switch positions on controls.
Information that the staff may request might include photographs of specific I&Cs, the application of zone banding schemes for displays, or various cathode-ray-tube (CRT) displays.
3.1.2.3 Ambient Environment

The ambient environment includes the lighting, auditory environment, alarms and auditory signals, and communications systems.
The staff will identify ambient environment data that the facility licensee will be requested to provide for the review.
3.1.2.4 Products

The products of the physical fidelity/human factors off-site review may include the following:

1. Photographs or drawings illustrating the known physical differences between the reference plant and the simulation facility.

2. Photographs or drawings of the control room and/or selected panels for both the reference plant and the simulation facility.

3. Descriptions and/or photographs of selected coding schemes used in the control room of the reference plant and in the simulation facility.

4. Photographs of selected I&Cs for both the reference plant and the simulation facility.

5. Data for relevant characteristics of the ambient environment for the reference plant and the simulation facility control rooms.
3.1.3 Control Capabilities

Control capabilities are those features of the simulation facility which allow the simulator operator to control and monitor the simulation facility while it is being used for operator licensing examinations. The staff may not have data about these capabilities at its disposal. Therefore, requests will be made based on the requirements of the Standard. This information may be requested not only for review but also as information which might be helpful in developing the performance test.

Documentation for the following simulation facility control and monitoring capabilities will be considered for request.
1. The number and variety of initial conditions available on the simulation facility.

2. The capability for inserting and terminating malfunctions.*

3. The capability of simulating simultaneous and/or sequential malfunctions.

4. The capability of incorporating new malfunctions into the system.

5. The capabilities of freeze, fast time, slow time, backtrack, and snapshots.*

6. The capability for allowing the simulation facility operator to perform the functions of auxiliary or remote operators.

7. Administrative controls or other means for alerting the simulation facility operator when the simulation has gone beyond plant or simulator design limits.

8. The capabilities for monitoring and recording critical parameters.

*These capabilities are concerned more with performance test development than with evaluation of the simulation facility.
3.1.3.1 Products

A listing of the control capabilities for which data will be requested.
3.1.4 Design, Updating, Modification, and Testing

The staff may have little information at its disposal (with the exception of the testing schedule provided on Form 474) regarding design, updating, modification, and testing of the simulation facility. As a result, requests for data in these areas will be made based on the criteria of the Standard. The testing schedule will be reviewed and additional information requested if needed.
3.1.4.1 Products

Documentation for the following areas will be considered for request.

1. Records of reference plant modifications for specific time periods. A time period may be selected at random, or based on information available to the staff such as NRC-imposed plant or simulation facility modifications.

2. The status of the modifications identified above with respect to their incorporation into the simulation facility (e.g., pending assessment, modification completed, not incorporated).
Requests for justifications for not incorporating reference plant modifications into the simulation facility may also be considered.
3.2 Request the Information From the Facility Licensee
After the staff has reviewed the available information about the reference plant and the simulation facility, it will develop a listing of the data to be requested from the facility licensee in each of the four major areas.
The staff will prepare a letter to the facility licensee explaining that an inspection of its simulation facility is to be conducted.
This letter will explain:
- the nature of the inspection
- the personnel who will conduct it
- the tentative schedule for the inspection
- the information required from the facility licensee
- the degree of cooperation requested from the facility licensee
As an alternative, one or more members of the staff may visit the facility licensee to obtain and review this information. Such a visit would permit the staff to directly and efficiently determine the availability and utility of the data needed for the inspection. It would also reduce the amount of data that would otherwise have to be transmitted between the facility licensee and the staff.
3.3 Review of the Data Obtained From the Facility Licensee

Once the data requested from the facility licensee have been obtained, the staff will review the data to determine if everything requested was received. Any missing or incomplete data will be listed so that, if necessary, they may be requested again.
The data will be evaluated using the applicable evaluation criteria given in Section 5 of this procedure. A list will be made of any items not in compliance with the evaluation criteria. The staff will review this listing to determine the impact of such discrepancies on the simulation facility's acceptability for the conduct of operator licensing examinations.
3.4 Determine if the On-Site Review Will Be Conducted

Once the data obtained from the facility licensee have been assessed, the staff will determine if the on-site review will be conducted. This decision will be based on the results of the evaluation of the data received. This may also include information obtained during the review of available data (e.g., reports from cognizant individuals such as license examiners; findings in HEDs or LERs; recent events at the reference plant or simulation facility which indicate potential problems with the simulation facility).
If the staff decides to conduct the on-site review, the facility licensee will be notified of:

- the schedule for the inspection
- personnel, facilities, and data that the staff will need to have available to them, and
- any additional data that are needed by the staff in advance of the on-site review, and a date by which such data are needed.
3.5 Minimizing the Burden on the Facility Licensee

The staff will work with the facility licensee, to the extent possible commensurate with its responsibility for the conduct of the inspection, to minimize the resources required from the facility licensee. This will include consideration of data to be requested, personnel needed, testing to be conducted, and the schedules for correspondence and for the on-site review.
4 THE ON-SITE REVIEW

During the on-site review, the staff will evaluate the simulation facility in detail. This may include conducting performance tests, visiting the reference plant control room to verify the physical fidelity of the simulation facility, or performing further investigation into any other characteristics of the simulation facility identified during the off-site review.
This section has been divided into three major parts. The first addresses the preparation for the review prior to the arrival of the staff on site. The second addresses the course of events for the on-site review. The third addresses the conduct of the review itself.
4.1 Preparation for the On-Site Review

The purpose of this preparation phase is to identify, in detail, the data to be collected on site and the methods for the data collection. Because the staff will spend less than one week on site at the simulation facility, it will be necessary to devote as much of that time as possible to actual data collection. To do this effectively, all of the data to be collected should be clearly identified, the data collection procedures should be in place and understood, and all data collection forms should be prepared prior to arrival at the site. The subsections for each of the major areas of data collection describe this preparation process.
During this preparation phase, the staff will communicate with the facility licensee as required.
This will be done to clarify any questions the staff may have about the data received from the facility licensee or about the actual operation of the reference plant and simulation facility.
It may be desirable for one or more staff members to visit the site prior to the on-site review to: clarify data availability, format, and access; plan the logistics of the on-site review; and establish contacts at the working level with the facility licensee.
To the extent possible, the several areas of the review will be coordinated with the performance tests being conducted.
For example, the instruments and controls to be used in the I&C configuration evaluation and the control capabilities to be evaluated will be selected from those associated with the performance tests being performed.
4.1.1 Performance Testing

The purpose of this part of the preparation is to make the final determination as to which operations and malfunctions are to be included in the performance tests, and then to develop these tests. The selection of such operations and malfunctions will be based on the results of the off-site review.
The process for developing the performance tests is a combination of those processes used for developing simulator acceptance test procedures (ATPs), operator licensing examinations, and simulator training and exercise guides developed by the facility licensee. The process described in this procedure adopts a format similar to that used by ATPs, and combines with it the approach to testing given in the examiner standards and in the exercise and training guides. The result is a performance test that focuses on the behavior of the simulation facility in the context of operations which are relevant to operator licensing examinations.
Other than the required level of detail, the only significant difference between this process and an operating test is the focus on the behavior of the simulation facility as opposed to that of the operator license candidate. Thus, "perfect operator response" will be needed for the conduct of performance tests, except when the requirements of a particular performance test (e.g., replication of an LER) dictate that specific operator responses be recreated (e.g., see paragraph A.5 below).
The steps described in this procedure for developing the performance tests are examples of the content and format to be considered when conducting a review. They will of necessity be varied to meet the needs of specific circumstances. If a facility licensee's performance test is to be repeated, if startup test procedures are to be run, or if a surveillance test is to be conducted, it may be possible to utilize these directly without a formal development process. As a general rule, however, performance tests will be developed to a level of detail and in such a format that if they were repeated, the same results could be expected.
4.1.1.1 Steps for Developing the Performance Tests

The preparation for the performance tests to be conducted at the simulation facility will include the following steps:
A. The normal, abnormal, and emergency operations to be included in the performance test will be selected from those identified in the off-site review. The following guidance will be used when making this selection:

1. A broad spectrum of events will be chosen which exercises as many plant systems as possible. This may include the limiting cases of the evolutions selected.
2. Known simulation facility and reference plant operational limitations will be considered.

3. Events or operations which are suspected of not meeting the requirements of the Rule or the Standard will be considered.

4. If a particular potential problem is to be investigated, performance tests may be selected which approach it from various directions. This will help to indicate the magnitude of the discrepancy, if it exists.

5. Misoperations on the control boards which initiate malfunctions, such as those reported in an LER, will be considered.

6. Information from the facility licensee that it may not be possible to run certain performance tests on the simulation facility will be considered. This may occur, for example, if particular malfunctions or events are not simulated.

7. Scenario characteristics such as initial conditions and the timing of events will be specified.

8. A brief outline describing the performance test will be developed at this point. It will be used as the basis for the more detailed development of the test in later steps.
B. Once the operations, events, and scenarios for the performance test have been determined, the staff will identify the appropriate plant/simulation facility procedures to be used.

C. Working through the appropriate plant/simulation facility procedures, a step-by-step outline (similar to the "Simulator Scenario Form" in Examiner Standard ES-302) will be developed. This first draft will be primarily concerned with determining the general sequence of events.
D. After the procedural steps have been delineated, the staff will:

1. Identify the critical parameters for each performance test.

2. Identify all critical alarms and automatic actions which would be expected to occur in the reference plant.
3. Identify control actions which would result in an expected change in any of the critical parameters. This may include misoperations used to initiate events.

4. For each of the critical parameters identified, determine the start time, duration, and required frequency of recording. This will be based on the need to capture sufficient information at the proper resolution for evaluation of the simulation facility.

5. For the alarms, automatic actions, and control functions defined above, determine the approximate times, setpoints, and sequences at/in which they would be expected to occur.
6. Identify simulation facility control capabilities required for conducting the performance test. The points in the performance test at which they are to occur will also be determined. The simulation facility control capabilities to be considered include:

a. Freezing the simulation.
b. Simulation of auxiliary operator functions performed outside the control room.
c. The means for alerting the instructor when the simulation approaches simulation facility or plant design limits.
d. Initialization conditions.
e. Insertion of malfunctions.
f. Adjustable rates for malfunctions.
g. Simulation of simultaneous or sequential malfunctions.
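The recording requirements from items 4 and 5 above lend themselves to a simple per-parameter record. A minimal sketch under assumed field names; the parameter name and numbers are illustrative only, not drawn from any actual test:

```python
from dataclasses import dataclass

@dataclass
class RecordingSpec:
    """Recording requirements for one critical parameter (item 4 above)."""
    parameter: str          # name of the monitored parameter (hypothetical)
    start_time_s: float     # recording start, relative to scenario start
    duration_s: float       # how long the parameter must be recorded
    sample_period_s: float  # required recording interval (resolution)

    def sample_count(self) -> int:
        """Samples implied by the duration and interval, endpoints included."""
        return int(self.duration_s / self.sample_period_s) + 1

# Illustrative example: record a parameter for 10 minutes at 2-second intervals.
spec = RecordingSpec("pressurizer pressure", 0.0, 600.0, 2.0)
print(spec.sample_count())  # 301
```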
E. All baseline data which will be used for comparison in the performance test evaluation will be identified. A list will be made of any needed baseline data that were not obtained during the off-site review.

F. For each of the identified critical parameters, as well as the annunciators, automatic actions, and response to control functions, the means of data collection will be determined. These means may include:
1. Direct recording by the simulation facility's computers.

2. Strip chart recordings or other data logging devices built into the system.

3. Manual observation and recording by the staff.

The specific means used will be based on availability in the order of preference listed above. A list of the available means, including the points at which they are to be used in the performance test, will be made.
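Selecting the recording means "based on availability in the order of preference listed above" is a first-match rule; a minimal sketch, where the means labels are paraphrases of the three list items and availability is hypothetical:

```python
# Preference order taken from the list above; labels are paraphrases.
PREFERENCE = [
    "simulation facility computer",   # 1. direct computer recording
    "strip chart / data logger",      # 2. built-in recording devices
    "manual observation",             # 3. recording by the staff
]

def choose_means(available):
    """Return the first means in the preference order that is available."""
    for means in PREFERENCE:
        if means in available:
            return means
    return None  # no recording means available for this parameter

print(choose_means({"strip chart / data logger", "manual observation"}))
# strip chart / data logger
```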
G. Any differences between the simulation facility procedures used for the performance test and the equivalent procedures used in the reference plant should be noted.
H. When the draft of the performance test is completed, it will be more formally developed. The Simulation Facility Performance Test Form given in Appendix A should be used for this purpose. Any additional information that may be required for the development of the performance tests will be obtained from the facility licensee.
This process will result in a completely developed performance test.

Once the performance tests are developed, they will be prioritized for their importance in the evaluation of the simulation facility. This will help to ensure that those tests which are expected to provide the most information will be run. At this time, a tentative schedule for the conduct of the tests will be developed. A typical performance test schedule might start with a normal operation, followed by a few higher priority abnormal operations, then the highest priority emergency operations, and conclude with the lower priority normal and abnormal operations.
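The scheduling pattern described here can be sketched as a small ordering routine. The test names, priorities, and the high-priority threshold are all assumptions for illustration, not part of the procedure:

```python
# Each test is (name, type, priority); higher priority = more informative.
tests = [
    ("T1", "normal", 1),
    ("T2", "abnormal", 3),
    ("T3", "abnormal", 1),
    ("T4", "emergency", 3),
    ("T5", "normal", 2),
    ("T6", "abnormal", 2),
]

HIGH = 3  # assumed threshold separating higher- from lower-priority tests

def schedule(tests):
    """Start with one normal operation, then high-priority abnormal and
    emergency operations, and finish with the remaining lower-priority tests."""
    by_prio = sorted(tests, key=lambda t: -t[2])  # stable, descending priority
    first_normal = next(t for t in by_prio if t[1] == "normal")
    rest = [t for t in by_prio if t is not first_normal]
    high_abnormal = [t for t in rest if t[1] == "abnormal" and t[2] >= HIGH]
    high_emergency = [t for t in rest if t[1] == "emergency" and t[2] >= HIGH]
    placed = set(id(t) for t in high_abnormal + high_emergency)
    tail = [t for t in rest if id(t) not in placed]
    return [first_normal] + high_abnormal + high_emergency + tail

print([name for name, _, _ in schedule(tests)])
# ['T5', 'T2', 'T4', 'T6', 'T1', 'T3']
```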
4.1.2 Physical Fidelity/Human Factors

Most of the data needed for the on-site review of physical fidelity/human factors will already be available to the staff, or will be readily obtainable from the facility licensee. In some cases, it may be necessary to review original documentation for HEDs, the CRDR, or other data sources. This may require that the cognizant staff member visit the location where these data are stored prior to the on-site review.
4.1.2.1 Panel Simulation

A. The selection of the control room layout features and the panels to be evaluated should be based on the following:

1. Features and panels associated with the performance tests (if conducted).

2. Features and panels specifically cited in the LERs reviewed.

3. Features and panels identified in the HEDs or other data reviewed.

B. Reference plant and simulation facility documentation, listings, drawings, and photographs will be selected and used for:

1. Verification of the location of panels and systems.

2. Verification of the layout of I&Cs on panels.

3. Verification of the consistent use of any of the following information and localization schemes:

a. Background shading.
b. Demarcation.
c. Mimics.
d. Hierarchical labeling.
e. Color coding.
f. Shape coding.

C. Listings will be made for each of the above features selected for evaluation.
4.1.2.2 Instrument and Control Configuration

A. If a performance test is to be conducted, the I&Cs to be evaluated will be selected from the critical parameters, annunciators, and controls identified during the preparation for the performance test. If a performance test is not conducted as part of the review, the following sources will be used to identify I&Cs for evaluation (these sources may also be used in addition to performance test I&C identification):

1. I&Cs specifically cited in the LERs reviewed.

2. I&Cs identified in the HEDs or other data reviewed.

3. I&Cs associated with systems or panels for which the staff has requested data for the panel simulation portion of the review.

4. I&Cs identified as Type A, Category 1 in accordance with Regulatory Guide 1.97. In accordance with this regulatory guide, these I&Cs are considered to be the most critical for monitoring plant conditions during emergency operations.

B. Once all of the candidate components are identified, a maximum of 100 will be selected for evaluation. The available information for each I&C will be gathered photographically, or through use of the appropriate form given in Appendix B.
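Capping the candidate I&C list at 100 components can be sketched as pooling candidates from each source in the order reviewed, dropping duplicates, and truncating. The source names and tag identifiers below are hypothetical:

```python
def select_for_evaluation(candidate_sources, limit=100):
    """Pool I&C candidates from several review sources, de-duplicate while
    preserving review order, and cap the selection at the given limit."""
    seen, selected = set(), []
    for source in candidate_sources:
        for tag in source:
            if tag not in seen:
                seen.add(tag)
                selected.append(tag)
            if len(selected) == limit:
                return selected
    return selected

# Hypothetical tag IDs from three of the sources listed above.
from_lers = ["PI-402", "TI-117"]
from_heds = ["TI-117", "LI-009"]
from_rg197 = ["PI-402", "NI-041"]
print(select_for_evaluation([from_lers, from_heds, from_rg197], limit=3))
# ['PI-402', 'TI-117', 'LI-009']
```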
4.1.2.3 Ambient Environment A.
The staff may require the facility' licensee ~to identify-and provide justifications for any differences between the reference' plant control room and the simulation facility environment, in accordance wl'5 Section 3.2.3 of the Standard, for the'following:
1. Ambient auditory environment.

2. Alarms and auditory signals, including coding of the alarms and signals.

3. Lighting.

4.

5. Availability and operability of communications systems.
B. Any additional information that may be required for the evaluation of the ambient environment will be obtained from the facility licensee.
4.1.3 Control Capabilities

If performance tests are to be conducted, simulation facility control capabilities may be incorporated into them and tested during the course of the performance tests. These tests of the control capabilities will be cited on the Simulation Facility Performance Test Form.
If performance tests are not conducted, the staff may make arrangements with the facility licensee for demonstration of selected capabilities.
4.1.4 Design, Updating, Modification, and Testing

If investigations of the data and the means used for simulation facility design, updating, modification, and testing are to be conducted during the on-site review, they will be identified. Specific questions and additional information to be addressed should be delineated as well.
4.1.5 Products

When the preparation for the on-site review is complete, the staff will have the following products (if all areas of the review are to be conducted):
1. The fully developed performance test on the Simulation Facility Performance Test Form or some comparable form.

2. A listing of the panel simulation features and environmental data to be evaluated.

3. A listing of I&Cs to be evaluated.

4. A listing of the control capabilities and the areas of design, updating, modification, and testing to be evaluated.
4.2 General Course of Events for the On-Site Review

Data for the on-site review will normally be collected within a period of one week at the simulation facility. Access to the reference plant control room will be required for some members of the staff for approximately one day. The time spent on site will be used to collect and analyze data, perform a preliminary evaluation of the results, and discuss these preliminary findings with the facility licensee.
After leaving the site, the final results will be developed and findings will be documented in an NRC inspection report.
The paragraphs that follow give an overview of the activities to be performed on each of the days of the on-site review, as well as the activities to be performed by the staff after the completion of the on-site visit.
4.2.1 Activities for Each Day

Day 1

Upon arrival at the simulation facility, the staff will meet with the facility licensee to outline the activities to be performed during the inspection.
Once the meeting has been completed, the staff will begin the inspection with the assistance of the appropriate simulation facility personnel as required.
Before beginning data collection, the staff will review the performance tests to be run with the facility licensee.
Based on this review, modifications will be made to the tests if needed, to ensure that they provide a sound and fair test of the simulation facility.
After the discussion of the tests to be conducted is finished, any needed preparation for the data collection will begin (as time permits).
Days 2 and 3

The actual data collection will begin on day 1 of the on-site inspection and will continue into day 3. These days will be primarily devoted to conducting the simulation facility performance testing, which may require the participation of all members of the staff at times.
On a non-competing schedule, the staff will conduct the other aspects of the inspection.
Day 4

The activities for day 4 will primarily involve the evaluation of the data collected using the guidance given in this document. Data collection will continue as needed.
Day 5

Any of the evaluations not completed on day 4 will be finished on day 5. Then, the staff will present the preliminary findings to the facility licensee, who will be given the opportunity to provide additional information related to the findings. Upon conclusion of this meeting, the staff will leave the site.
4.2.2 Analysis of Results and Staff Response

After returning from the on-site review, the staff will evaluate the data collected in greater detail if necessary, and prepare a report of its findings.
These final results will be documented in an NRC inspection report.
4.3 Data Collection

The subsections that follow give the procedures for collecting the data for each of the data groupings.
They should be applied as appropriate for the specific reviews which the staff has chosen to conduct.
4.3.1 Performance Testing

A. The following steps will be completed in final preparation for performance testing:
1. The simulation facility operator will be briefed on the performance test to be conducted. The following areas will be emphasized in order to ensure their inclusion and proper function in the performance test:
   a. The operations and malfunctions to be simulated.

   b. The instrument and component failures to be simulated.

   c. The scenario conditions, including the initial conditions and the timing and sequence of events.

   d. The simulation facility control functions to be tested.

   e. The parameters on which data will be collected, and the methods for this data collection.
2. Potential problem areas will be identified and resolved. Any problem areas that would result in changes to the performance test or in the inability to collect the desired data will be noted, and changes will be made accordingly.
3. The performance test will be set up on the simulation facility by the facility licensee.
4. Individual operations and malfunctions will be pretested as required. The specific activities to be pretested will be left to the judgment of the staff and the simulation facility operator.
5. All automatic data collection mechanisms will be tested as necessary.
6. All manual data collection activities will be rehearsed.
7. The operating crew which will be performing the manual portion of the performance test will be briefed. This briefing will include the following:
   a. The entire performance test will be reviewed in detail with emphasis on the activities that the crew is to perform.

   b. It will be emphasized that simulation facility performance, not operator performance, is being evaluated. Even though "perfect operator response" may be needed for certain tests, it will be made clear that the operators will not be penalized in any way if an operator error is made.

   c. Any intentional misoperations that are required in conducting the performance test (e.g., in order to replicate an LER) will be described.

   d. Any questions about the operating crew's role in the conduct of the performance test will be answered.
It should be noted that this operating crew need not be made up of licensed operators, although this may be desirable. The makeup of the operating crew will be left to the discretion of the facility licensee.
8. The role that each individual member of the staff will play in conducting the performance test will be determined and explained. The specific role for each staff member may vary from one performance test to another. In general, however, the following assignments may be expected:
   a. The lead staff member will generally be the test director or will assign another member of the staff to that role. He or she will ensure that the test is completed in a timely manner, that all test conditions are met, and that all required data are collected. The test director will also approve any modifications to the test that are required before or during the course of the test.
   b. The license examiner will be responsible for monitoring that the test proceeds as planned, and for tracking adherence to plant procedures. The license examiner will note any deviations in the performance of the test as well as any suspect behavior of the simulation facility.
   c. The operations specialist's duties are likely to vary. He or she will, however, assist the license examiner; monitor specific systems, operations, or instruments of interest; and perform other functions as required.
   d. The human factors specialist will monitor the performance test for any human interface problems which may be encountered, and will perform other functions as required.
At this point everything should be in place for conducting the performance test of the simulation facility.
B. The simulation facility performance test will be conducted using the following procedure:
1. The test director will oversee the performance tests to ensure they are conducted as planned. Facility licensee personnel will actually operate the simulation facility, while the staff will be present as observers.
2. The scenarios, operations, and events for each phase of a test will be briefly reviewed again by all participants prior to commencement of each exercise.
3. The test will begin, and will follow the activities listed on the Simulation Facility Performance Test Form as closely as possible.
4. During the course of the test, the operating crew and the simulation facility operator will call out the actions they are performing as they do them. This will help to keep all parties involved in the performance test aware of what is happening at all times during the test.
5. Members of the staff will follow the course of the test using the Simulation Facility Performance Test Form. They will annotate the form and make other notes as required to indicate both appropriate and inappropriate performance of the simulation facility as the test progresses.
6. A member of the staff will periodically check all active data recording mechanisms to ensure that they are functioning properly. Simulation facility personnel should be available for any needed modifications or repairs to the equipment.
7. If unusual simulation facility behavior is encountered during the performance test, it may be desirable to freeze the simulation so that the details of what has occurred can be understood and documented if necessary. This decision will be left to the discretion of the staff and the simulation facility personnel.
8. Upon completion of each test, all participants, including the operating crew and the simulation facility operator, should participate in a short debriefing. This will help to ensure that all relevant information and observations about the test have been noted.
9. During each performance test, the staff will spot check the procedures being used to ensure that those procedures are the same as those used in the reference plant control room. Only those differences previously identified by the facility licensee should be present. Any differences found, whether previously identified or not, should be evaluated for their impact on the conduct of a licensing examination.
4.3.2 Physical Fidelity/Human Factors

The human factors specialist will perform the physical fidelity/human factors evaluation with assistance from other members of the staff and facility licensee personnel as needed.
Since visual observation and comparison is an important factor in these reviews, it may be advantageous to use "instant" photographs for the data collection. Any information which would help to clarify the content of the photographs should be written on the back of them. These photographs may be used in lieu of the other means of data collection described in this document.
Due to the great amount of detail in the data collection for this portion of the review, there is a possibility of error.
Thus, the human factors specialist should confirm any discrepancies found.
4.3.2.1 Panel Simulation

The data collection for the panel simulation will be conducted using the following steps:
A. The selected drawings or photographs for the control room layouts for both the reference plant and simulation facility will be verified. They will then be compared and the differences will be noted.
B. The drawings or photographs for the panels selected for review for both the reference plant and simulation facility will be verified. They will then be compared for the location of both systems and components on the panels, and the differences will be noted.
C. Drawings and/or photographs for the informational and localization schemes selected for review for both the reference plant and simulation facility will be verified. They will then be compared and the differences will be noted.
4.3.2.2 Instrument and Control Configuration

The I&C data will be collected from the reference plant and the simulation facility control rooms. The data will be collected using photographs or the appropriate forms given in Appendix B. The instructions for filling out these forms are contained in Appendix B.
4.3.2.3 Ambient Environment

The human factors specialist will evaluate the environmental data that were requested from the facility licensee. He or she will visit the reference plant and simulation facility control rooms to verify the accuracy of these data, if required. If necessary, the human factors specialist will request that certain functions (e.g., alarms, annunciator test) be demonstrated in the simulation facility and/or the reference plant. Cognizant facility licensee personnel should perform these demonstrations.
4.3.3 Control Capabilities

Any simulation facility control capability data that will be collected independently of a performance test will be collected using the following steps:
A. The staff will discuss the data to be collected with the simulation facility operator.
B. Any data collection mechanisms required will be set up.
C. The functions will then be performed by the simulation facility operator. The staff will monitor the testing of the simulation facility control functions.
4.3.4 Design, Updating, Modification, and Testing

Any simulation facility design, updating, modification, and testing data to be reviewed will be collected by the staff with the assistance of facility licensee personnel as required.
5 EVALUATION CRITERIA

This section specifies the evaluation criteria to be used for the review. The criteria are organized into the four major areas of review used throughout this procedure. The last part of this section addresses the assessment of the results of these evaluations.
5.1 Performance Test

The performance test data will be reviewed and the following evaluations will be made. The parenthetical elements refer to the applicable sections of the Standard.

5.1.1 Evaluation of Parameters Measured

Each of the parameters tested will be evaluated using the following criteria. The decision-tree analysis shown in Figure 1 will be used. Each of the paths in the tree should be followed to its conclusion. A determination will be made of the impact of any noncompliances on the acceptability of the simulation facility for conduct of a licensing examination.
A. General

1. Are expected relationships between this parameter and other parameters, according to the baseline data, reflected over the course of the performance test? (3.1.1, 3.1.2, and A3.1)
B. Alarms and Automatic Actions

1. Do all of the alarms and automatic actions occur that would have occurred in the reference plant? (4.2.1(c))

2. Do any alarms or automatic actions occur that would not have occurred in the reference plant? (4.2.1(c))
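Criteria B.1 and B.2 together amount to a two-way comparison between the set of alarms and automatic actions observed on the simulation facility and the set expected from the reference plant baseline. As a minimal illustrative sketch only (the alarm identifiers below are hypothetical, and this procedure prescribes no such software), the comparison can be expressed with set differences:

```python
# Two-way check of simulator alarms/automatic actions against the
# reference plant baseline (criteria B.1 and B.2).
# Alarm identifiers are hypothetical examples, not plant data.
expected = {"PZR HI PRESS", "SG LO LEVEL", "RX TRIP"}    # reference plant
observed = {"PZR HI PRESS", "RX TRIP", "CHG PUMP TRIP"}  # simulation facility

missing = expected - observed   # B.1: alarms that failed to occur
spurious = observed - expected  # B.2: alarms that should not have occurred

print("Missing:", sorted(missing))
print("Spurious:", sorted(spurious))
```

Both sets should be empty for a fully compliant test; any nonempty result would be evaluated for its impact on a licensing examination.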
C. Transient Operations

1. If applicable reference plant start-up test procedure acceptance criteria exist, does the value represented by the parameter fall within these criteria? (4.2.1(a))

2. Does the observable change in the parameter violate the physical laws of nature? (4.2.1(b) and 4.2.2)
[Figure 1. Decision tree for the evaluation of parameters measured - graphic not reproduced.]
3. Is the observable change in the parameter in the same direction as that expected from the baseline data? (4.2.1(b) and 4.2.2)
D. Steady-State Operations

1. If it is a critical parameter, does it fall within ±2% of its reference value? (4.1(3))

2. If it is a noncritical parameter, does it fall within 10% of its reference value? (4.1(3))

3. Has the accuracy of the computed values been determined for a minimum of three points over the power range? (4.1)

4. For a 60-minute test, does the value of the parameter vary no more than ±2% over the 60-minute period? (4.1(2) and A3.2(1))
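Criteria D.1, D.2, and D.4 are simple numerical tolerance checks. The following sketch illustrates them under stated assumptions (the parameter values are illustrative only, and measuring drift against the initial sample is this sketch's choice, not a requirement of the Standard):

```python
# Steady-state checks sketched from criteria D.1, D.2, and D.4.
# Values are illustrative, not reference plant data.

def within_tolerance(value, reference, critical):
    """D.1/D.2: critical parameters must fall within +/-2% of the
    reference value; noncritical parameters within 10%."""
    limit = 0.02 if critical else 0.10
    return abs(value - reference) <= limit * abs(reference)

def stable_over_test(samples):
    """D.4: over a 60-minute test the parameter should not vary more
    than +/-2% (drift measured here against the first sample)."""
    first = samples[0]
    return all(abs(s - first) <= 0.02 * abs(first) for s in samples)

# Example: pressurizer pressure samples (psig, hypothetical)
samples = [2235, 2238, 2233, 2240]
print(within_tolerance(samples[0], 2250, critical=True))
print(stable_over_test(samples))
```

A parameter failing either check would then be carried through the Figure 1 decision tree to judge its impact on the examination.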
5.1.2 Evaluation of the General Performance of the Simulation Facility

A. The general performance of the simulation facility will be evaluated using the following criteria. A determination will be made of the impact of any noncompliance on the acceptability of the simulation facility for the conduct of a licensing examination.
1. Where applicable to the malfunctions tested, does the simulation facility provide the operator the capability of taking action to recover the plant, mitigate the consequences, or both? (3.1.2)
2. For the performance tests conducted, is the simulation capable of continuing until such a time that a stable, controllable, and safe condition is attained which can be continued to cold shutdown conditions, or until the simulation facility operating limits are reached? (3.1.2)

3. Does the simulation facility provide the appropriate response to operator errors, if any were tested? (4.1(3), 4.1(4))
4. Does the simulation facility respond inappropriately to any correct operator actions? (4.1(3), 4.1(4))
5. Are there any differences identified between the procedures used in the simulation facility and controlled copies of reference plant procedures? (A1.4)

6. When tested by the staff, is simulation facility instrument error no greater than that of the comparable meter, transducer, or related instrument system of the reference plant? (4.1(1))
5.2 Physical Fidelity/Human Factors

5.2.1 Panel Simulation Evaluation

The following criteria will be used in evaluating the panel simulation.
A. Control Room Layout

1. Does the simulation facility contain sufficient operational panels to provide controls, instrumentation, alarms, and other man-machine interfaces to conduct the normal evolutions and to respond to the malfunctions required by the Standard? (3.2.1 and A1.2(1), A1.2(2))
2. Do differences from the reference plant in the relative locations of panels to each other detract from the ability to conduct a licensing examination? (3.2.1 and A1.2(1), A1.2(2))
3. If panels not in the main operating area, such as back panels and remote shutdown panels, are not included, is there adequate simulation of the information obtained from them or the control functions performed on them to conduct a licensing examination? (3.3.2)
B. Panel Layout

1. Are systems on the same panels as in the reference plant? (3.2.1 and A1.2(3))
2. Are systems in the same relative locations to each other within and across panels as they are in the reference plant? (3.2.1 and A1.2(3))
3. Is the general layout of components within a system or on a panel the same as in the reference plant? (3.2.2 and A1.2(3))

5.2.2 Instrument and Control Configuration Evaluation

All of the differences found between the I&Cs reviewed for the simulation facility and the reference plant will be evaluated in order to determine their impact on the performance of a licensing examination. The requirements of Sections 3.2.1, 3.2.2, and A1.2(2) of the Standard will be used for guidance in making these decisions.
5.2.3 Ambient Environment Evaluation

The criteria given below will be used in evaluating the ambient environment. All of the criteria given are based on the requirements and guidance given in Sections 3.2.3 and A1.2(4) of the Standard. Since the Standard is not as specific as the criteria given here, judgments of discrepancies will be based on available human factors data or good human factors practice.
A. Normal and Emergency Control Room Lighting

1. Both should be simulated.

2. The significance of any effects on the ability to conduct a licensing examination due to differences in the illumination levels or location of the lighting will be determined. The following factors will be considered in making these judgments:
   a. Do differences in illumination levels affect the readability of any displays?

   b. Do differences in lighting fixture locations result in variations in glare or illumination which could affect readability of displays?
B. Alarms, Signals, and Incidental Noise

1. All audible alarms should be simulated to the extent that they may be used in conducting a licensing examination.

2. Other signals and incidental noise should be simulated to the extent that they may be used in conducting a licensing examination.
3. If auditory coding is used, it should be identical for the reference plant and the simulation facility.
C. Communications Systems

1. All communications systems that are expected to be used for communicating with auxiliary operators (or examiners acting as auxiliary operators) during an examination should be available and operational.
D. Operator Cuing and Information Aids

1. Documentation for operator cuing and information aids, including panel drawings and photographs, will be reviewed for each of the following when applicable:
   a. Background shading.
   b. Mimics.
   c. Demarcation.
   d. Coding schemes.
   e. Labeling schemes.
2. These aids should be applied in the same instances and in the same manner in the simulation facility as they are in the reference plant. Judgments about any deviations in the use of such aids will be made using the guidance given in Sections 3.2.1 and A1.2(2) of the Standard.
5.3 Control Capabilities

The following criteria will be used for the evaluation of the simulation facility control capabilities.
A. Control

1. Does the simulation facility possess a minimum capability for storage of 20 initialization conditions? For simulation facilities which have commenced operations within the last 18 months, or are referenced to plants which have commenced operations within the last 18 months, are at least ten of the conditions operational? (3.4.1, A1.3(1), and 5.2)

2. Do the initialization conditions include a variety of plant operating conditions, fission-product poison concentrations, and times in core life? (3.4.1 and A1.3(1))
3. Does the simulation facility have the capability of freezing the simulation? (3.4.3 and A1.3(2))
4. Is it possible to conveniently insert and terminate each of the malfunctions being evaluated? (3.4.2 and A1.3(2))
5. Is the simulation facility capable of simulating simultaneous or sequential malfunctions, or both, if these malfunctions can be expected by design or operational experience? (3.4.2 and A1.3(2))
6. Where operator actions are a function of the degree of severity of a malfunction, does the simulation facility have adjustable rates over such a range as to represent the plant malfunction conditions? (3.1.2 and A1.3(2))
7. Are there any cues to the operator, other than those that would occur in the reference plant, that a malfunction has been introduced into the simulation? (3.4.2 and A1.3(2))
8. Are provisions (administrative or other) in place for incorporating additional malfunctions identified from operational experience? (3.4.2 and A1.3(2))
B. Instructor Interface

1. For simulated actions performed outside the control room, does the capability exist for the simulation facility operator to perform the actions of an auxiliary operator? (3.4.4, A1.3(3), and A1.3(4))
2. Are provisions made for alerting the simulation facility operator when any aspect of the simulation approaches the simulation facility or plant design limits? (4.3)
C. Monitoring

1. Are the critical parameters identified for performance testing obtainable in hardcopy form as either plots or printouts? (4.4)
2. Is the parametric and time resolution of the hardcopy data for the parameters sufficient to determine compliance with the performance test criteria? (4.4)

5.4 Design, Updating, Modification, and Testing

The criteria given in this section are based on the requirements of Sections 5 and A2(4) of the Standard.
A. Design Data

1. Do baseline data exist for all parameters tested? (3.1.2, 5.1, A2, and A3.3)

2. If multiple sources of baseline data are available, are they used in the following order unless otherwise justified?

   a. Reference plant operational data - data collected directly from the reference plant.

   b. Analytical or design data - data generated through engineering analyses with a sound theoretical basis.

   c. Similar plant data - data collected from a plant which is similar in design and operation to the reference plant.

   d. Other data - data, such as subject matter expert estimates, which do not come from any of the above sources.

   (3.1.2, 5.1, A2, and A3.3)

3. If the reference plant has been in commercial operation for 18 months, have plant data been included in the data base? (5.1)
B. Updating and Modification

1. If (1) the reference plant has been in commercial operation for at least 18 months, and (2) it has been at least 18 months since the simulation facility's operational date, does the update design data base include actual plant data? (5.2)
2. Is there an annual review of reference plant modifications? (5.2)

3. Has the first such review been undertaken within one year of the simulation facility certification? (Regulatory Guide 1.149, Section C, Item 4)

4. Have the simulation facility update design data been revised as appropriate, based on an engineering, training value, and licensing examination assessment of the reference plant modifications identified in the annual review described in item 2 above? (5.2)

5. Is there a means of incorporating student feedback on the simulation facility into the updating and modification process? (5.2)

6. Have all modifications to the simulation facility required as a result of the assessment performed in item 4 above been made within 12 months of their identification? (5.3)
C. Testing

1. Are data from simulation facility performance tests which were performed after completion of initial construction and after any configuration or performance modifications available for review? (5.4.1 and A2(4))
2. Are data from the annual operability testing available for review? (5.4.2 and A2(4))
5.5 Known Discrepancies

Any discrepancies identified during the course of the review which were previously known to the facility licensee, and for which resolutions or justifications were provided, will be reviewed. The staff will determine:
1. If any of the discrepancies could have a significant adverse effect on the conduct of a licensing examination.

2. If there are any facility licensee resolutions or responses with which the staff does not agree.
5.6 Assessing the Results of the Inspection Findings

Upon completing an inspection, the staff will review any of the discrepant items with respect to their impact, if any, on the ability to use the simulation facility to conduct a licensing examination.
This review will be conducted for discrepant items individually and in combination.
If the discrepancies found are judged to have little or no adverse effect on the conduct of a licensing examination, the staff will recommend only that the facility licensee correct them or document a basis for accepting them as is.
If the discrepancies are found to have a minor but definite impact on the ability to conduct a licensing examination, the staff will require that the facility licensee correct the discrepancies as part of its ongoing simulation facility update program as required by the Standard. Discrepancies which constitute a "minor but definite" impact include those whose impact can be easily accounted for and overcome by a license examiner.
If the discrepancies are found to adversely affect the ability to conduct an examination on a given procedure, system, or event, the staff will require that the facility licensee correct these discrepancies on an accelerated schedule (i.e., less than the time permitted by the Standard).
Examinations will not be conducted using the procedure, system, or event until the correction is made.
If the discrepancies are found to greatly hinder or limit the ability to conduct an examination on the simulation facility, such that the requirements of 10 CFR 55.45(b) cannot be met, then operating examinations shall not be conducted until the facility licensee has corrected the discrepancies and recertified the simulation facility.
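The four outcomes described in this section form a simple severity ladder. Purely as an illustrative summary (the severity labels and function are this sketch's shorthand, not terminology from the procedure), the mapping can be written as:

```python
# Map a discrepancy severity finding to the staff response described in
# Section 5.6. Severity labels are shorthand for this sketch only.
RESPONSES = {
    "little_or_none": "recommend correction or a documented basis for acceptance",
    "minor_but_definite": "require correction via the ongoing update program",
    "affects_exam_scope": "require correction on an accelerated schedule; "
                          "no exams on the affected procedure, system, or event",
    "55.45(b)_not_met": "no operating examinations until corrected",
}

def staff_response(severity):
    """Return the Section 5.6 staff response for a severity finding."""
    return RESPONSES[severity]

print(staff_response("minor_but_definite"))
```

Note that the review applies this judgment to discrepant items both individually and in combination, so a collection of individually minor items could still move up the ladder.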
REFERENCES

American National Standard for Nuclear Power Plant Simulators for Use in Operator Training, ANSI/ANS-3.5-1985. American Nuclear Society, La Grange Park, IL.

Oak Ridge National Laboratory, "Licensee Event Report (LER) Compilation." NUREG/CR-2000, ORNL/NSIC-200. Oak Ridge, Tennessee 37831.

U.S. Nuclear Regulatory Commission, "10 CFR Parts 50 and 55, Operator's Licenses and Conforming Amendments, Final Rule." 52 FR 9453, Federal Register, March 25, 1987. Government Printing Office, Washington, D.C.

U.S. Nuclear Regulatory Commission, "Operator Licensing Examiner Standards." NUREG-1021, Revision 3, September 1986. Available for purchase from National Technical Information Service, Springfield, Virginia 22161.

U.S. Nuclear Regulatory Commission, "Required Actions Based on Generic Implications of Salem ATWS Events." Generic Letter 83-28, July 8, 1983. Copies are available from the Office of Management and Budget, Reports Management, Room 3208, New Executive Office Building, Washington, D.C. 20503.

U.S. Nuclear Regulatory Commission, Regulatory Guide 1.97, "Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant and Environs Conditions During and Following an Accident." Copies are available from U.S. Government Printing Office, Washington, D.C. 20402, ATTN: Regulatory Guide Account.

U.S. Nuclear Regulatory Commission, Regulatory Guide 1.149, "Nuclear Power Plant Simulation Facilities for Use in Operator License Examinations." Copies are available from U.S. Government Printing Office, Washington, D.C. 20402, ATTN: Regulatory Guide Account.
APPENDIX A

SIMULATION FACILITY PERFORMANCE TEST FORM

This appendix contains a sample "Simulation Facility Performance Test Form" which may be used for developing and conducting the performance test for the simulation facility.
SIMULATION FACILITY PERFORMANCE TEST FORM

Simulation Facility: XYZ Simulation Facility
Date: 00/00/00
Reference Plant: Plant XYZ
Performance Test: NATURAL CIRCULATION - LER 85-019

I. Initial Condition: 100% power, middle of life

II. Data Collection Method

Identified Critical Parameters:
Analog: Mark the recorders TO for Test 9.

Digital: Ensure data are recorded as follows:
   Every 15 seconds until manual trip.
   Every 5 seconds for one subsequent minute.
   Every 60 seconds until RCP start.
   Every 15 seconds until completion.
    PZR pressure
    PZR PORV status
    PZR level
    VCT level
    Charging flow
    Letdown flow
    PZR spray line T
    N-45 (Nuclear Power)
    D rod bank position
    Turbine first stage p
    Tref
    Tave - auct hi
    Tcold - loop wide range - A,B,C
    Thot - loop wide range - A,B,C
    Incore temperature
    Subcooling margin
    RCS loop flow - A,B,C
  - RCS pump radial bearing temperature
    Main steam header p
    SG pressure - A,B,C
    SG feed flow - A,B,C
    SG steam flow - A,B,C
    SG level - narrow range - A,B,C
    SG level - wide range - A,B,C
    SG PORV status - A,B,C
    SG safety status - A,B,C
  - Spotcheck of components

  - Denotes data not logged. Ensure data recording mechanism is in place.
  - Denotes data not expected to be critical for this test.
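For a reviewer who scripts the data logging, the staged recording schedule above can be captured as a small lookup table. The following Python sketch is illustrative only; the phase labels are hypothetical names for the periods listed on the form:

```python
def sampling_interval(phase):
    """Digital recording interval, in seconds, for each phase of the
    natural circulation test. Phase labels are hypothetical; the
    intervals are those stated on the form."""
    schedule = {
        "until_manual_trip": 15,  # every 15 seconds until manual trip
        "minute_after_trip": 5,   # every 5 seconds for one subsequent minute
        "until_rcp_start": 60,    # every 60 seconds until RCP start
        "until_completion": 15,   # every 15 seconds until completion
    }
    return schedule[phase]
```

A logger driven by such a table would switch rates at the manual trip, one minute later, and again at RCP restart.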
III. Procedure

1. Simulation facility instructor fails 14J-4 (480 VAC) open, then puts simulation facility in "freeze". Verify the following:

   A. CCW valves (106A,B,C) to RCPs closed.
   B. Spotcheck of other equipment deenergization from attached load list.

2. Resume simulation. Panel operators commence power reduction at 4%/min. Freeze simulation facility at 87% power. Verify the following:

   A. Elevated RCP bearing temperature (should be between 195F and 216F).
   B. Increasing Tave, PZR pressure and level.

3. Resume simulation. Panel operators perform manual reactor trip immediately and perform expected actions (secure RCPs about two minutes later upon entry into ES-0.1). Observe:

   Natural circulation indications develop and stabilize.
   RC flows decrease to 10% in 30 sec, then decrease more slowly.
   Tc decreases slightly, approaching SG Tsat.
   Th increases to 30-50F above Tc, then stabilizes.
   Incore thermocouples track Th (560-570F).
   SG pressure stable, with steaming indicated.

4. Restart RCP A 30 min after Rx trip, after component cooling is reestablished. Observe:

   Th, Tc converge (all 3 loops).
   SG A pressure increases; no increase on SG B,C.
   Possible PZR pressure decrease.
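Several of the expected responses in step 3 are quantitative and could be checked automatically against the logged data. The Python sketch below illustrates the idea; the function name is an assumption, and only the thresholds stated in the procedure are used:

```python
def check_step3(th_f, tc_f, rc_flow_pct, seconds_after_trip):
    """Check logged values against the step 3 expected responses.
    th_f, tc_f: hot/cold leg temperatures (deg F); rc_flow_pct: RCS
    loop flow as a percent of full flow; seconds_after_trip: elapsed
    time since the manual reactor trip."""
    return {
        # RC flows decrease to 10% within 30 seconds of the trip
        "flow_decayed": seconds_after_trip < 30 or rc_flow_pct <= 10.0,
        # Th stabilizes 30-50F above Tc
        "delta_t_ok": 30.0 <= (th_f - tc_f) <= 50.0,
        # incore thermocouples track Th in the 560-570F band
        "th_band_ok": 560.0 <= th_f <= 570.0,
    }
```

Each flag corresponds to one "Observe" item; a False flag marks a discrepancy to be resolved with the plant content expert under Section V.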
IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.

3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulation facility data with plant data or expected response and note any discrepancies.

2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended followup action to this procedure.
APPENDIX B

I&C DATA COLLECTION FORMS

This appendix contains the directions and forms for collecting the I&C data for the simulation facility inspection.

Data collection may begin in either the reference plant control room or the simulation facility control room. Regardless of the location of this first data collection, all of the relevant information about the instrument or control should be gathered. It is expected that not all of the fields on the forms will always be applicable. In these cases they should be marked as not applicable (N/A). It is also possible that the forms are not completely comprehensive. In this case, additional fields should be added as needed and additional relevant information should be collected. Relevant notations and comments on the part of the data collector are also encouraged. These may help in clarification and understanding of the data during later analysis.
Once the data are collected at the first location (reference plant or simulation facility), the data sheets should be taken to the second location and the data verified. If the data for a field are identical, a check should be placed in the corresponding field for the second location. If the data are different, the field for the second location should be filled in with the differing data.
Separate data collection forms for annunciators, meters and digital displays, recorders, CRTs, and controls are included here. Explanations for the data fields particular to each individual form are included with the forms.
The following data fields are common to all of the I&C Data Collection Forms.

Simulation Facility - The name of the simulation facility being evaluated.

Reference Plant - The name, unit, and docket number of the reference plant.

Date - The date the data are being collected.

Performance Test Reference - The number and/or name of the performance test in which the component is referenced.

Procedure Reference - The procedure name, number, and step in which the component is referenced.

(These two references permit the component to be put back into operational context for the evaluation of the significance of any discrepancies found.)

Label Name - The exact wording on the label of the component.

Identifier - Any numeric or alphanumeric codes which will aid in the unique identification of the component. Examples include equipment piece numbers and dimensional location measures.

Panel - The panel number and/or name on which the component is located.

Location Within Panel - May include specific measurements or references to appropriate board prints or photographs.
ANNUNCIATOR DATA COLLECTION FORM

Simulation Facility ____________________    Date ____________
Reference Plant ____________________
Performance Test Reference ____________________
Procedure Reference ____________________

Feature                   Reference Plant        Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Color
Annunciator Data Field Definitions

Type - The type of annunciator: standard, fire panel, emergency safeguard system.

Color - The color of the annunciator when illuminated.
METER AND DIGITAL DISPLAY DATA COLLECTION FORM

Simulation Facility ____________________    Date ____________
Reference Plant ____________________
Performance Test Reference ____________________
Procedure Reference ____________________

Feature                   Reference Plant        Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Parameter Measured
Units
Range
Divisions
Zone Banding and Setpoint Indication
Meter and Digital Display Data Field Definitions

Type - The type of meter (edgewise, rotary) or digital display (electronic counters, LEDs, LCDs, drum counters, printers).

Parameter Measured - The parameter that the display reflects.

Units - The units in which the parameter is reflected on the display.

Range - The range that is displayed (meters and recorders) or that can be displayed (digital displays and CRTs).

Divisions - The divisions in which the parameter is reflected. (In some cases there may be several different divisions used for scaling purposes. In these cases the range over which each division is used should be included with the division.)

Zone Banding and Setpoint Indication - The ranges/points and colors used for zone banding and setpoints displayed.
RECORDER DATA COLLECTION FORM

Simulation Facility ____________________    Date ____________
Reference Plant ____________________
Performance Test Reference ____________________
Procedure Reference ____________________

Feature                   Reference Plant        Simulation Facility

Label Name
Identifier
Pen or Point Identifier
Panel
Location Within Panel
Type
Parameter Measured
Units
Range
Divisions
Zone Banding and Setpoint Indication
Pen Color
Recorder Data Field Definitions

Type - The type of recorder: single pen, dual pen, multipen, multipoint.

Pen or Point Identifier - The number or code for the pen or point used.

Parameter Measured - The parameter that the display reflects.

Units - The units in which the parameter is reflected on the display.

Range - The range that is displayed (meters and recorders) or that can be displayed (digital displays and CRTs).

Divisions - The divisions in which the parameter is reflected. (In some cases there may be several different divisions used for scaling purposes. In these cases the range over which each division is used should be included with the division.)

Zone Banding and Setpoint Indication - The ranges/points and colors used for zone banding and setpoints displayed.

Pen Color - The color of the pen used.
INDICATING LIGHT DATA COLLECTION FORM

Simulation Facility ____________________    Date ____________
Reference Plant ____________________
Performance Test Reference ____________________
Procedure Reference ____________________

Feature                   Reference Plant        Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Color
Light Legends or Symbols
Indicating Light Data Field Definitions

Type - The type of indicating light: legend, non-legend.

Color - The color of the light when illuminated.

Light Legends or Symbols - The legends or symbols used on indicating lights.
CONTROL DATA COLLECTION FORM

Simulation Facility ____________________    Date ____________
Reference Plant ____________________
Performance Test Reference ____________________
Procedure Reference ____________________

Feature                   Reference Plant        Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Function
Switch Positions or Control Scales
Shape or Color Coding
Control Data Field Definitions

Type - The type of control: discrete rotary (including J-handle, T-handle, and star-handle), continuous rotary, toggle switches, thumbwheels, pushbuttons (lighting and non-lighting, legend and non-legend).

Function - The function (valve control - throttle, seal in; breaker, controller) and/or type of activation (as is, spring return, pull to lock) for controls.

Switch Positions or Control Scales - The names of switch positions (discrete controls), ranges for scales (continuous controls), or legends on push buttons.

Shape or Color Coding - The shape or color code used for that control, if coding is used.
COMPUTER EVALUATION

Because computers are so complex, a form is not appropriate. The following criteria should be met, along with the appropriate criteria from the forms for displays and controls cited.

Input

1. Is the same input device used (e.g., keyboard, control panel)?

2. Are the command sequences the same for calling up given displays, requesting information, running programs, and performing calculations?

3. Are the same names used for given displays, points, programs, and calculations?

4. Based on the type of input device, are the general requirements for controls met (see the controls data collection form)?

Output

1. Are the same displays available (e.g., CRTs, printers, meters, recorders)?

2. Is the display format the same?

3. Based on the nature of the information displayed and the type of display, are the requirements for meters, recorders, and indicating lights met (see the appropriate data collection form)?
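The input and output criteria above amount to a fixed yes/no checklist, which a reviewer's tooling could carry as plain data. This Python sketch is only an illustration; the item wording is abbreviated from the text:

```python
def computer_evaluation_checklist():
    """Return the computer evaluation criteria as yes/no checklist
    items, grouped as in the text (wording abbreviated)."""
    return {
        "input": [
            "same input device (e.g., keyboard, control panel)",
            "same command sequences for displays, information, programs, calculations",
            "same names for displays, points, programs, and calculations",
            "general control requirements met for the input device type",
        ],
        "output": [
            "same displays available (e.g., CRTs, printers, meters, recorders)",
            "same display format",
            "meter, recorder, and indicating light requirements met",
        ],
    }
```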
APPENDIX C

TEST OF THE METHODOLOGY FOR THE SIMULATION FACILITY EVALUATION PLAN
TABLE OF CONTENTS

1 INTRODUCTION ..............................................  69
  1.1 Assumptions Made in the Conduct of the Pilot Test ....  69
  1.2 Deviations From the Draft SFEP .......................  70
  1.3 The ISFET ............................................  71
  1.4 Kick-Off and Exit Meetings ...........................  72
2 PERFORMANCE TESTING ......................................   76
  2.1 Performance Test Selection ...........................  76
    2.1.1 LER Events at North Anna .........................  76
    2.1.2 Emergency Events Not Represented in LERs .........  78
    2.1.3 LER Events From Similar Plants ...................  79
    2.1.4 Common Transients Not Represented by LERs ........  79
  2.2 Performance Test Development .........................  81
    2.2.1 Initial Development ..............................  81
    2.2.2 Revisions to Initial Test Procedures .............  82
  2.3 Conduct of the Performance Tests .....................  85
    2.3.1 Test Panel Makeup ................................  86
    2.3.2 Conduct of Individual Tests ......................  87
    2.3.3 Test Sequencing and Duration .....................  88
  2.4 Evaluation of the Performance Tests ..................  88
  2.5 Results of the Performance Tests .....................  89
    2.5.1 Normal Operations ................................  89
    2.5.2 Abnormal Operations ..............................  89
    2.5.3 Emergency Operations .............................  91
3 PHYSICAL FIDELITY/HUMAN FACTORS ..........................   93
  3.1 Off-Site Review and Data Collection ..................  93
  3.2 On-Site Review and Data Collection ...................  96
  3.3 Results of the Review ................................  96
4 CONTROL CAPABILITIES .....................................  100
5 DESIGN, UPDATING, MODIFICATION, AND TESTING ..............  101
6 CONCLUSIONS, OBSERVATIONS, AND RECOMMENDATIONS ...........  102
  6.1 The ISFET ............................................ 102
  6.2 Performance Testing .................................. 102
  6.3 Physical Fidelity/Human Factors ...................... 105
  6.4 Design, Updating, Modification, and Testing .......... 105
  6.5 General .............................................. 105
7 SUPPORTING DOCUMENTATION ................................. 106
1 INTRODUCTION

This document discusses the pilot test of the methodology for the draft Simulation Facility Evaluation Plan (SFEP) as of November 1, 1986. The purpose of this test was to determine the usefulness of the methodology as a tool for evaluating the acceptability of simulation facilities consisting solely of plant-referenced simulators in accordance with proposed 10 CFR Part 55 and ANSI/ANS 3.5, 1985, as endorsed by proposed Regulatory Guide 1.149.

The test was conducted at the simulation facility located at the North Anna training center, using North Anna Unit 1 as the reference plant. Both of these facilities are owned and operated by the Virginia Power Company. The test was conducted during the week of November 17, 1986. On-site preparation and data collection for the test were performed the previous week by members of the Interim Simulation Facility Evaluation Team (ISFET).

Due to time and manpower constraints, the actual pilot test deviated, in part, from the methodology identified in the draft SFEP, and certain assumptions were made in order to conduct the pilot test. The primary differences were in the conduct of the Off-Site Review. These deviations and assumptions are described later in this document.
1.1 Assumptions Made in the Conduct of the Pilot Test

Certain assumptions were made in order to perform a reasonably representative pilot test. They were:

1. That proposed 10 CFR 55 (the Rule) and Regulatory Guide 1.149 had been published.

2. That the Rule had been in effect for at least four years.

3. That Virginia Power (the facility licensee) had completed performance testing for the North Anna simulation facility.

4. That the facility licensee had certified the North Anna simulation facility to the NRC on NRC Form 474.

5. That this was a random audit; that is, that there had been no reports from license examiners or anonymous plant personnel of any problems with the simulation facility.

6. That all of the steps that were to be completed prior to the on-site review had been done in accordance with the (draft) SFEP.
1.2 Deviations From the Draft SFEP

Time constraints and logistical problems made it necessary to deviate from certain parts of the methodology given in the draft SFEP. These deviations were primarily made to expedite the off-site data collection (SFEP Section 3.0), and the preparation (SFEP Section 4.1) and evaluation (SFEP Sections 5.1.1 and 5.1.2) portions of the plan. The major effect of these deviations was that the methodology for performing the off-site review was tested less thoroughly than that for the on-site review. None of these deviations, however, prevented the ISFET from conducting the test or from achieving the intended results of the test.

The following listing describes the deviations from the draft SFEP that were made during the pilot test.

1. The time scale for the activities to be conducted prior to the on-site review was compressed. Members of the ISFET arrived on site the week before the pilot test, collected the data, and prepared for the following week's review. They worked closely with facility licensee personnel during this period, and communication was personal and informal rather than the formal correspondence identified in the draft SFEP.

2. As a matter of expediency, and with the full support of the facility licensee's staff, members of the ISFET collected some of the data which the draft SFEP indicates should be collected by the facility licensee and provided to the ISFET.

3. An off-site review was not conducted. Since the Rule had not yet been published, much of the data that would have been requested for the off-site review had not been collected or compiled by the facility licensee. Given the experience gained as a result of this exercise, however, the methodology for the off-site review given in the plan seems reasonable.

4. An intended deviation included in the pilot test was a performance test based on a generic LER which exercised the Emergency Operating Procedures. This was done to determine if this type of performance test should be incorporated as part of the SFEP, which, in draft form, recognizes the use of only reference plant and similar plant LERs for use in performance testing.

5. Personnel schedules made it necessary to have two different individuals serve as the License Examiner on the ISFET. The first person filled the role during the early planning stages of the review. The second person filled the role during the final planning stages and the actual conduct of the on-site review.

6. Four days were spent at the simulation facility for the on-site review instead of the three prescribed by the SFEP. The extra day was used to perform the initial analysis of the data.

7. The two Operations Specialists did not completely meet the position descriptions given in the draft SFEP. The first one was actually an employee of the facility licensee. The second, a peer evaluator from another facility licensee, had extensive operations, training, and simulator testing experience, albeit with boiling water reactors.

8. The performance tests were not developed and documented to the fine level of detail described in the draft SFEP. Due to time constraints, the ISFET adopted a rule of thumb which said: "Develop them to the point at which another ISFET could come in and run them and get the same results." Despite these differences, the basic content of the performance tests was the same as that given in the draft SFEP, and the ISFET was able to successfully conduct and evaluate the tests.

9. The performance tests were not developed to a level of detail which identified the specific I&Cs used. As a result, the Human Factors Specialist asked the reference plant Operations Specialist to identify the I&Cs associated with the critical parameters. These were then used for the I&C Inventory. The Human Factors Specialist verified that these I&Cs were, in fact, used during the course of the performance tests.
1.3 The ISFET

The ISFET was made up of the following individuals. Except for the cases mentioned in the Deviations section, they met the requirements of the draft SFEP.
Name            Affiliation                    Role

Ron Laughery    Micro Analysis and Design      Team Leader
Bryan Gore      Battelle - Pacific Northwest   License Examiner (Planning)
                Laboratories
Bob Gruel       Battelle - Pacific Northwest   License Examiner (Pilot Test)
                Laboratories
Dave Roessner   Iowa Electric                  Operations Specialist
                                               (Peer Evaluator)
Alan Kozak      Virginia Power                 Operations Specialist
                                               (Reference Plant Operations)
Chris Plott     Micro Analysis and Design      Human Factors Specialist

In addition to the members of the ISFET, there were also two NRC observers present during the test of the methodology. They were John Hannon and Jerry Wachtel of the Division of Human Factors Technology, Operator Licensing Branch.
1.4 Kick-Off and Exit Meetings

Briefings were held with members of the ISFET, NRC representatives, and facility licensee representatives, both before and after the conduct of the pilot test. In the pre-briefing the content and intent of the pilot test were discussed. In the post-briefing the results of the pilot test and their impact on the SFEP were discussed. The attendees for both of these meetings are listed on the following pages.
KICK-OFF MEETING ATTENDEES

Name             Affiliation                    Position

Ben DeLamorton   Virginia Power                 Supervisor - Training (Simulator)
Larry Edmonds    Virginia Power                 Superintendent - Nuclear Training
Anil K. Jain     Virginia Power                 Senior Simulator Specialist
Dave Cruden      Virginia Power                 Manager
Terry Williams   Virginia Power                 Manager - Power Training Services
Allan Kozak      Virginia Power                 Senior Instructor - Nuclear (Simulator)
David Roessner   Iowa Electric                  Senior Simulator Engineer
Jerry Wachtel    NRC/DHFT/OLB                   Training and Assessment Specialist
John Hannon      NRC/DHFT/OLB                   Exam Development
Ron Laughery     MA&D                           President
Chris Plott      MA&D                           Human Factors Engineer
Bob Gruel        Battelle - Pacific Northwest   Westinghouse Licensing Examiner
                 Laboratories
EXIT MEETING ATTENDEES

Name                 Affiliation                    Position

Ben DeLamorton       Virginia Power                 Supervisor - Training (Simulator)
Larry Edmonds        Virginia Power                 Superintendent - Nuclear Training
Anil K. Jain         Virginia Power                 Senior Simulator Specialist
Dave Cruden          Virginia Power                 Manager
Terry Williams       Virginia Power                 Manager - Power Training Services
Allan Kozak          Virginia Power                 Senior Instructor - Nuclear (Simulator)
L. Richard Buck      Virginia Power                 Supervisor of Training - Operations
Robert Soderholm     Virginia Power                 Instructor - Operations (Surry Simulator)
E. R. Smith, Jr.     Virginia Power                 Assistant Station Manager - North Anna
Larry Gardner        Virginia Power                 SPS - Training
David Roessner       Iowa Electric                  Senior Simulator Engineer
Bill Russel          NRC/DHFT/OLB                   Director
Jerry Wachtel        NRC/DHFT/OLB                   Training and Assessment Specialist
John Hannon          NRC/DHFT/OLB                   Exam Development
Ron Laughery         MA&D                           President
Chris Plott          MA&D                           Human Factors Engineer
Bob Gruel            Battelle - Pacific Northwest   Westinghouse Licensing Examiner
                     Laboratories
Bryan Gore           Battelle - Pacific Northwest   Project Manager
                     Laboratories
Mike Wyatt           INPO                           Senior Program Manager - Simulators
Jean-Pierre Sursock  EPRI                           Program Manager
Bill Gardner         Combustion Engineering         Program Manager - Simulators
                                                    (SFEP Steering Committee representative)
2 PERFORMANCE TESTING

2.1 Performance Test Selection

At the time of this pilot test the facility had not done its own performance testing, and had not certified the performance of the simulator. Consequently, the performance tests were selected and planned without benefit of a Phase 1 review. Such a review would have facilitated test planning by providing information on tests completed and their results, data available for evaluation of tests developed by the ISFET, and possible deficiencies in simulation facility performance observed during operator licensing exams.

In the absence of information from a Phase 1 review, several factors were used to guide the selection of performance tests. First, it was desired to evaluate as broad a range of simulation capability as possible. Second, it was desired to base as many tests as possible on events for which the facility licensee had plant data to compare against the simulated values of important parameters. This would minimize potential uncertainties in expected simulation facility performance, allowing unambiguous evaluation of test results. Third, it was desired to evaluate the ability of the simulator to support application of the reference plant's emergency operating procedures. Evaluation of a license candidate's use of EOPs is an important part of the license examination process.

Performance tests were selected from the following sources: events reported in recent North Anna LERs, emergency events not expected to be represented in LERs for most plants, LER events from "similar" plants, and commonly expected transients not represented by LERs.
2.1.1 LER Events at North Anna

Events reported in LERs were the prime candidates for selection as performance tests, since it was expected that plant data would be available from the facility licensee against which to compare simulation facility results. All North Anna LERs listed in NUREG/CR-2000 between January 1984 and August 1986 (N = 155) were reviewed for applicability. Several events, all from Unit 1, were selected for use in performance testing. These were:

LER 84-014

Reactor trip from 20% power caused by low steam generator level during manual feedwater control by the operator. This system is sensitive at low power levels because small changes in valve position can cause large changes in feedwater flow. Valve leakage can also complicate control.

This event tests the modeling of the control sensitivity of the main feedwater system.

LER 85-017

Dropped control rods at 16% power, not causing reactor trip.

This event tests reactivity modeling of the reactor core, including changes in flux level and profile.

LER 86-002

Reactor trip from 100% power caused by a malfunction of the turbine control. Rapid governor valve closure caused "shrink" in all steam generators, resulting in a steam generator low-low level trip.

This event tests steam generator modeling. It also tests modeling and operation of all normal post-trip control actions, alarms, and interlocks.

LER 84-019

Reactor trip from 100% power caused by loss of a 125-V AC vital bus. Reactor trip was due to false indication of loss of a reactor coolant pump. Various components tripped, and various indications and controls were lost.

This event tests modeling of the 125-V AC vital electrical system, and its interactions with instruments and control systems.

LER 85-019

Natural circulation following a manual trip of all RC pumps. Loss of a 4160-V AC emergency bus at 100% power caused loss of component cooling water to the RC pumps, forcing the operators to trip the reactor, and then trip the RC pumps.

This event tests thermohydraulic modeling of reactor coolant system (RCS) flow in the absence of forced flow. It also tests modeling of the 4160-V AC emergency electrical system and its interactions with plant equipment.

The draft SFEP specifies that performance tests should address two normal evolutions, six abnormal evolutions, and two emergencies. Of the five events listed above, the first was considered to represent a normal evolution because feedwater control prior to the trip was the point of interest. The middle three were considered to represent abnormal events since they represented specific plant malfunctions, but in no case were RCS inventory control, pressure control, or the transport of heat from the reactor core jeopardized. The last event was considered to represent an emergency event, due to interruption of forced reactor coolant flow.

2.1.2 Emergency Events Not Represented in LERs

As was expected, review of the North Anna LERs yielded no major emergency events: e.g., small-break (SB) loss-of-coolant accident (LOCA) exceeding makeup capability, unisolable steam line break, total loss of main and auxiliary feedwater, or steam generator tube rupture. Due to the importance of these events to operator licensing examinations, the ISFET decided to select one as a performance test, while cognizant of possible difficulties in evaluating test results.

For several reasons, a steam generator tube rupture was selected for the performance test. The most important reason is that it addresses the greatest variety of physical phenomena, and thus exercises the broadest range of simulation facility performance. This transient involves the initiation of safety injection, the inflow of primary coolant into the secondary system (steam generator), RC pump trip and the establishment of natural circulation, the formation of a steam bubble in the reactor vessel head, and subsequent collapse of the steam bubble on restart of an RC pump. It is the most complex of the SBLOCA events in that it involves mass transfer between the primary and secondary cooling systems. With regard to the other emergencies, a total loss of feedwater event (main and auxiliary) is dealt with procedurally by opening the power operated relief valve (PORV) and using high-head safety injection (HHSI) cooling, thus turning the transient into a SBLOCA. A main steam line rupture is important, from the standpoint of simulator modeling, in respect to recriticality caused by reactivity addition from the moderator temperature coefficient (MTC) of reactivity. However, MTC effects can be verified in other, less dramatic transients. Thus, the escalating steam generator tube rupture (SGTR) event is a logical choice for evaluating the simulation facility's performance on emergency events.

One additional reason supports the choice of a SGTR event to represent the major emergencies. On January 25, 1982, a major SGTR event occurred at the Westinghouse-built Ginna reactor, which progressed through the entire spectrum of phenomena described above. This event is thus clearly relevant to the Westinghouse North Anna plant, even though North Anna has three RCS loops compared to two at Ginna. Even though detailed comparison of parameter values is probably neither possible nor necessary, the detailed chronology of the Ginna event identified important trends, alarms, and automatic actions which allowed a qualitative evaluation of the North Anna simulation facility's ability to reproduce a corresponding event.
2.1.3 LER Events From Similar Plants The two-unit Westinghouse plants at Surry and Turkey Point are all of the three-loop design used at North Anna.
LERs published between January and August 1986 for Surry (N = 56) and Turkey Point (N = 62) were reviewed to identify additional events of relevance.
As was expected, several j
events similar to North Anna events were identified, including reactor trips due to problems with feedwater control, turbine control, dropped control rods, and electrical power supply.
One of these was unique in that it resulted in a trip on high RCS pressure.
It was, therefore, selected for the pilot test.
Turkey Point Unit 4 LER 85-017 l
Loss of a 125-V AC vital power invertor failed nuclear instrumentation and pressurizer level indication.
Pressurizer spray failed, pressurizer heaters j
interlocked off, and letdown isolated.
One PORV had been previously blocked due to leakage, and the other failed to automatically open on high RCS pressure.
The reactor tripped on high pressure due to failure of either PORV to operate.
2.1.4 Common Transients Not Represented by LERs Three events were selected for performance testing as representatives of significant transients which might be survivable withe'ut reactor trip, and hence might not be found in LERs.
These events were:
Load rejecticn at maximum rate (200%/ min) from 100% to 50% power.
Trip of one main feedwater pump at 100% power.
Trip of one RC pump at 28% power (just below reactor trip setpoint).
i These events were selected because they introduce significant perturbations into the primary and secondary coolant systems.
Thus, they test the modeling of various instrumentation and control systems which respond, bringing the systems back into balance so that trip setpoints are not exceeded.
As discussed above, these tests were selected to address as broad a range of simulator modeling as possibit, since 79 APPENDIX C l
l
i l
l results of simulation facility testing were not available.
l The tests were chosen to utilize actual plant data for evaluation wherever possible.
These tests were believed to be representative of those which would normally be performed by the facility licensee in c"aluating and certifying the i
performance of the simulation facility.
I The load rejection event was initially. classified as a normal event, since Westinghouse data'show that Westinghouse plants are designed to survive it without tripping.
(It was later reclassified as an abnormal event upon the recommendation of the facility licensee.)
The other events were classified as abnormal events.
With these events, then, the desired complement of two normal, six abnormal, and two emergency events was achieved for performance testing.
The tests initially selected for performance testing and their classifications were:
NORMAL OPERATIONS

1. LOAD REJECTION - 200% PER MINUTE
2. SG LEVEL CONTROL 15-25% POWER LER 84-014

ABNORMAL OPERATIONS

3. MAIN FEED PUMP TRIP - 100% POWER
4. DROPPED CONTROL RODS - 16% POWER LER 85-017
5. RCP TRIP - POWER < REACTOR TRIP SETPOINT
6. TURBINE CONTROL MALFUNCTION LER 86-002
7. INVERTOR LOSS, HIGH P TRIP TURKEY POINT LER 85-017
8. 125-V AC LOSS LER 84-019

EMERGENCY OPERATIONS

9. NATURAL CIRCULATION LER 85-019
10. LARGE STEAM GENERATOR TUBE RUPTURE
During test development, two of these tests were dropped and replaced, as discussed in Section 2.2.
These decisions were made primarily on the basis of data availability for test evaluation.
In retrospect, the logic of this test selection process was reasonable and appropriate.
2.2 Performance Test Development

Initial conditions for the tests, malfunction input information, and required operator actions were specified based upon LER information, where available, and otherwise upon the conditions desired for the test.
Qualitative expectations of parameter changes and the automatic actions of plant control systems were specified based upon plant design and knowledge of system interactions.
Tests were developed in two phases.
The first phase was carried out before travel to the facility.
The initial selection of performance tests was done two and one-half weeks before testing was scheduled to begin, via a conference telephone conversation between personnel at PNL, MA&D, and NRC.
During the next week and one-half, PNL license examiners developed preliminary test procedure content for eight of the initially selected tests.
The second phase of test development was carried out at the North Anna site during the week before the actual pilot test. There, PNL personnel joined the ISFET members from North Anna, Iowa Electric, and MA&D to complete the test procedures.
Eight test procedures were finalized prior to initiation of pilot testing.
2.2.1 Initial Development

The initial development of the test procedures was based primarily on generic knowledge of Westinghouse systems and control system design, since timing precluded shipment and/or study of plant-specific information.
Draft emergency operating procedures from North Anna were available at PNL from work on a different project, however, and they provided information useful in the planning process.
Development of the SGTR test scenario was based on information from the extensive analysis of the Ginna event contained in NUREG-0909 (1982).
The initial test procedure development process focused on specifying not only scenario initial conditions, malfunction inputs, and operator actions, but also the best estimates of the expected responses of the simulation facility.
Knowledge of Westinghouse operating and control system designs was used to identify expected trends in critical parameters and the resulting automatic control and interlock actions.
The test procedures initially developed thus included lists of expected observations to be verified during test performance.
This information proved to be very helpful during the actual pilot test.
It facilitated understanding of the evolution of the transient and helped confirm that the test was proceeding as expected.
It also allowed real-time initial evaluation of overall simulation facility performance.
The incorporation of lists of expected observations into the test procedures greatly facilitated the evaluation process.
All findings reported from these tests were initially identified during test performance by the use of these lists.
Thus, careful pre-test development of expected results can minimize the time required for post-test performance evaluation.
2.2.2 Revisions to Initial Test Procedures

Upon arrival at the North Anna site, the initial test procedures were reviewed with a Senior Reactor Operator (SRO) licensed member of the training staff who was included in the ISFET. His input was very important in confirming expected trends, correcting misinterpretations, and adding plant-specific information.
At this time, existing plant data for events reported in the LERs, as well as for non-LER events selected for performance tests, were gathered and reviewed.
As discussed below, several changes were made in testing plans due to inadequacies in the available data.
It was clear that, as called for in the SFEP, data should be acquired from the facility licensee prior to starting test procedure development.
Personal interactions between the ISFET and facility personnel were quite helpful to communications in both directions.
They helped inform the facility licensee just what information was needed, and they showed the ISFET what was available.
One lesson learned from this effort was that a personal visit by a member of the ISFET to the facility licensee should accompany the request for plant data, both to ensure effective communications, and to minimize false starts and wasted effort by both ISFET and facility licensee personnel.
Several important changes were made during planning work at North Anna.
1. The attempt to develop a test of manual feedwater flow control sensitivity at low power levels, LER 84-014 (Section 2.1.1.1), was abandoned. There were several reasons for this decision, but the primary one was the inability to develop a reproducible test.
The LER resulted from the difficulty of manually controlling feedwater flow, and it was felt impractical to define feedwater control manipulation directions leading to repeatable actions.
In addition, there was no information in the LER packet at the plant from which to determine manual actions associated with the LER.
Finally, there were very few data in the LER packet against which to correlate plant response.
2. The test involving the trip of one main feedwater pump at 100% power (Section 2.1.4.2) was deleted.
It was felt that the maximum rate runback test adequately covered most of the modeling addressed by this test.
An additional consideration was that no plant data existed for evaluation of the results of such a test, whereas plant data did exist for the runback test.
3. A performance test in the normal operations category was selected to replace the deleted feedwater flow control test.
This test was a straightforward performance of the surveillance procedure 1-PT-71.1, Steam Driven Auxiliary Feedwater Pump (1-FW-P-2) and Valve Test, to be evaluated by comparison of simulation facility surveillance results with recent plant surveillance data.
No " procedure" was developed forithis test other than the actual plant surveillance procedure.
4. A performance test in the abnormal operations category was developed to replace the deleted main feedwater pump trip test.
This test was developed from a recent plant LER for which considerable data were available in the plant LER package.
LER 86-006: Reactor trip from 100% power caused by closure of the B Main Steam Trip Valve (MSIV).
Safety injection was automatically initiated due to high steam flow coincident with low steam pressure in the unaffected lines.
This event tests steam system modeling, steam pressure control, and safety injection initiation logic.
5. The test based on the Turkey Point LER (Section 2.1.3.1) was determined to be of marginal utility for evaluation of the North Anna simulator.
Little correlation was found between the instruments and controls failed by the invertor loss at Turkey Point and the effects of invertor loss at North Anna.
Although a test could be constructed based on loss of the same equipment at North Anna, the Turkey Point LER data packet was not available to the ISFET, so no data were available for evaluating the test results.
The ISFET agreed to leave this test for last, because it was felt that not all tests might be completed in the 16-hour time period allotted.
No formal test procedure was developed, but it was planned to replicate the equipment failures identified in the LER and verify that RCS pressure increased to the reactor trip setpoint.
Ultimately, time constraints prevented this test from being performed.
6. The SGTR performance test was modified to introduce stepwise escalation of the leak to allow assessment of flow balances for leaks within the capacity of the charging system with and without letdown.
7. A list of critical parameters was developed from the review of the performance tests which had been developed. This list contained approximately 30 parameters, most of which were either plotted on chart recorders or could be printed out by the simulation facility. It was decided to record all of these data for all performance tests, since it was simpler to acquire more data than necessary than to alter the data-logging programming between tests. In addition to this list, any information to be logged by observers was included for the test when necessary.
With the incorporation of these seven changes, draft test procedures were formalized and written up into proper procedure format. This required a significant amount of work which could not be delegated to secretarial staff, because considerable development work was done in the process and because test development had been carried out using a personal computer word-processing program.
This allowed preliminary drafts brought from PNL to be finalized at the facility.
As revisions and extensions were made, they were recorded directly into the evolving test procedure.
Formalization of the test procedures prior to the actual pilot testing proved to be very desirable.
It allowed all members of the ISFET as well as the simulator operators to work from uniform, clean, hard copy.
This helped minimize confusion and ambiguity in the test performance phase.
Copies of these test procedures in the format used by the ISFET are presented in Appendix A.
Although not editorially perfect, they functioned well.
In addition, when used as a check-off sheet during the testing phase, they provided an immediate record of actions taken and observations recorded.
As has been noted, essentially all of the findings reported from these tests were initially identified during test performance. Thus, this pre-test development was quite important to successful, efficient test performance and evaluation.

The performance tests which were developed are listed below.
Performance Tests Developed For The North Anna Simulation Facility

NORMAL OPERATIONS

1. SURVEILLANCE OF STEAM DRIVEN AUXILIARY FEEDWATER PUMP

ABNORMAL OPERATIONS

2. 50% LOAD REJECTION - 200% PER MINUTE
3. MSIV CLOSURE - 100% POWER LER 86-006
4. DROPPED CONTROL RODS - 16% POWER LER 85-017
5. RCP TRIP - 28% POWER
6. TURBINE CONTROL MALFUNCTION LER 86-002
7. 125-V AC LOSS, STUCK MFW VALVES LER 84-019
8. INVERTOR LOSS, HIGH PRESSURE TRIP TURKEY POINT LER 85-017

EMERGENCY OPERATIONS

9. NATURAL CIRCULATION LER 85-019
10. STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE

2.3 Conduct of the Performance Tests

Nine performance tests were run sequentially during two 8-hour shifts.
Only the test based on the Turkey Point LER was omitted.
Test performance took the entire time allotted, with essentially no breaks except for simulator set-up, and pre-test review of procedures and expectations.
2.3.1 Test Panel Makeup

The tests were conducted by members of the ISFET and facility staff.
ISFET members and their contributions were:
1. An Operator Licensing Examiner knowledgeable in Westinghouse systems and procedures. He functioned as Performance Test Director (as appointed by the Team Leader), maintained real-time cognizance of the progress of the test, verified and checked off expected simulation facility performance as indicated in the test procedures, and determined when test procedure steps had been satisfied so that continuation to succeeding steps was warranted.

2. An Operations Specialist with specific knowledge of facility systems and procedures, and of the simulation facility capabilities and operations. He was an SRO-licensed member of the facility licensee's training staff. He worked with the Test Director to ensure correct performance of the test, and provided verification of test progress and results. He also functioned as a Panel Operator, performing operator actions as required by test or operating procedure. He also provided direction to the two other facility licensee personnel assisting in the test performance: a Simulator Operator and a second Panel Operator.

3. An Operations Specialist experienced in simulator operation and testing, and in BWR operations and training. He contributed to data gathering and analysis, and to resolution of questions addressing the interface between the simulation facility and the tests.

4. Two Human Factors Specialists, one of whom was the Team Leader for the project. They contributed to the gathering of data which could not be automatically recorded by the simulator (and, of course, to non-performance-test parts of the simulation facility pilot test).

5. A second Operator Licensing Examiner who had developed the initial drafts of the performance tests, and who had contributed to their completion at the facility, made additional contributions to test tracking and data gathering.

6. Facility licensee personnel who participated in test performance were the trainer/member of the ISFET, a Panel Operator from a similar plant owned by the same utility, and a Simulator Operator who was employed by Virginia Power.
This test panel makeup proved adequate for all phases of the performance tests.
2.3.2 Conduct of Individual Tests

The performance tests were conducted sequentially. Before each test, the ISFET Test Director reviewed the test procedure and its objectives with the other members of the ISFET.
As was discussed above, an inclusive list of critical parameters had been developed during test planning, and data were automatically charted or printed out for all parameters during each test.
This minimized set-up time between tests.
However, not all data of interest could be printed out (e.g., pressurizer heater status). As was appropriate to each test, ISFET members were assigned to log needed data. In addition, observation plans were made so that any deviations of simulator performance from that expected would be recognized at the time and the need for additional testing evaluated. These observations proved to be of central importance to the ISFET's ability to provide meaningful evaluations of the test results on site.
While the ISFET was conducting the pre-test briefing, the simulator operators input the relevant initialization conditions and marked recorder charts. During the test, data-logging intervals were changed as required to ensure adequate data collection during rapidly changing conditions, yet minimize data collection during slow-moving evolutions.
As each test was performed, the ISFET members verified the general performance of the simulation facility against the expected results, which were listed in the test procedure. At times, the simulation was frozen to ensure understanding of developments, and also to allow checking of load lists following electrical failures.
Many questions were raised and discussed, but not all could be answered before the ISFET left the site.
This was particularly the case during tests which were not replications of LER events.
Overall, this process proved quite successful in providing the ISFET with confidence that the simulation facility was performing generally as expected.
It also demonstrated the effectiveness of the testing procedure by identifying a few discrepancies between expected results and the results of simulation, as will be discussed in Section 2.5.
After each test, recorder charts were marked and data logging printouts were collected.
Data and notes taken by the observers were also collected.
2.3.3 Test Sequencing and Duration

The test sequence was selected to begin with the more straightforward tests, yet also to ensure that the more complex and significant tests were performed. For this reason, the two emergency operations tests were scheduled at the beginning of the second day of testing.
The test sequence scheduled for the first day was (numbers are from the listing of tests developed in Section 2.2.2) 2, 4, 5, 6, 1, and 3. All tests were completed.
The tests scheduled for the second day were 9, 10, 7, and 8.
The first three of these were completed.
On the basis of these results the ISFET concluded that the 2-day period of testing is reasonable for the performance of 10 meaningful tests.
This assumes some improvement in efficiency on the part of the team as additional simulation facilities are tested.
2.4 Evaluation of the Performance Tests

The most significant evaluation of simulation facility performance took place during test performance itself. As has been discussed, the test procedures developed prior to testing identified important responses expected from the simulation facility, including automatic actions and critical parameter trends caused by inter-system interactions.
These actions and trends were verified as much as possible during test performance. Electrical modeling was also verified during testing, by freezing simulation after electrical supply loss and comparing load lists with indications of failed equipment.
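The load-list comparison described above is essentially a set difference: the loads the reference plant load list says should de-energize are checked against the loads the simulator actually failed. A minimal sketch, using hypothetical load names rather than the actual plant load list:

```python
# Audit of electrical modeling after a simulated supply loss.
# Load names below are hypothetical, not the actual bus load list.

expected_failed = {
    "C RCP flow indication",
    "Pressurizer level channel",
    "Boric acid transfer pump fast-speed transfer",
}
indicated_failed = {
    "C RCP flow indication",
    "Pressurizer level channel",
}

# Discrepancies in either direction flag a modeling deficiency.
missed = expected_failed - indicated_failed    # should have failed but did not
spurious = indicated_failed - expected_failed  # failed but should not have

print(sorted(missed))    # ['Boric acid transfer pump fast-speed transfer']
print(sorted(spurious))  # []
```

A "missed" entry of this kind corresponds to the boric acid transfer pump deficiency reported in Section 2.5.2.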
All reported deficiencies identified in the performance of the North Anna simulation facility were discovered during performance testing, as opposed to during post-test data review.
The incorporation of all available information on expected simulation facility response ensured that clearly successful and unsuccessful performance would be identified during testing.
The 1-day review of the test data provided confirmation of the observations made during testing.
It also allowed verification that the traces of critical parameters obtained from the simulation facility were consistent with those from actual plant data, where available.
However, detailed plant data for critical parameters were available for only three events, even though five of the events were based on North Anna LERs.
Potential discrepancies between expected simulation facility performance and observed responses were noted.
However, no additional conclusions were substantiated.
A more detailed data review may have helped resolve some of them.
For others, additional, or repeated, simulation might have been required.
For yet others, perhaps no answers short of comparison with plant data or engineering calculations would have been found.
From these results, it is clear that careful pre-test development of test procedures, including listings of expected simulation facility responses, is a significant factor in evaluating the performance of a simulation facility. In addition to expediting the evaluation of such performance, it also ensures the separation of expectations and results.
2.5 Results of the Performance Tests
This section presents a brief description of successes and deficiencies identified for each of the tests which was performed.
It should be noted that a result of "none found" in the deficiency category does not necessarily guarantee that no deficiencies exist. It only indicates that no deficiencies were identified based on the data available and the criteria developed.
2.5.1 Normal Operations

SURVEILLANCE OF TURBINE-DRIVEN AUXILIARY FEEDWATER PUMP

SUCCESS: All steps of this test were conducted in accordance with the reference plant procedure.
The AFW system was lined up, started, and operated.
All valves procedurally designated for operation from the control room were operated. Pump output flow and pressure were as expected.
DEFICIENCY: Two valves failed to stroke within 10% of the stroke time measured in the reference plant (required in accordance with ANSI/ANS 3.5, 1985, Section 4.1). One was fast, and one was slow. However, neither was outside of the performance band allowed for the reference plant system by procedure criteria. The facility licensee was unaware of the problem.
2.5.2 Abnormal Operations

50% LOAD REJECTION - 200% PER MINUTE

SUCCESS: Steam pressure was appropriately controlled by steam dump valves. Steam generator level remained above the trip setpoint. Pressurizer level and pressure were controlled by heaters and spray within appropriate values.
The reactor did not trip.
DEFICIENCY: Runback of control rods did not follow the Tave-Tref program. Runback decreased from maximum speed sooner than expected.
The facility licensee was aware of the problem and had plans to remedy it.
MSIV CLOSURE - 100% POWER LER 86-006

SUCCESS: Reactor and turbine tripped as expected. Post-trip response of all parameters was appropriate.
DEFICIENCY: Safety injection was not initiated, as had occurred in the LER.
Steam pressure failed to drop sufficiently.
The facility licensee was aware of the problem and had plans to remedy it.
DROPPED CONTROL RODS - 16% POWER LER 85-017

SUCCESS: Rod bottom lights lit and rod position indicators indicated that the selected rods had dropped. As in the LER, the reactor did not trip on negative flux rate, and reactor power decreased 2%.
As was expected (due to control rod positions), imbalance became less negative (by 0.5%), although no LER data were available for comparison.
DEFICIENCY: None found.
RCP TRIP - 28% POWER

SUCCESS: All changes were as expected.
Flow decrease in C loop and flow increases in A and B loops were appropriate in speed and magnitude. Thot and Tave in A and B loops increased appropriately. Tcold in C loop agreed with Tcold in the other loops, and C loop Thot dropped below Tcold while remaining above Tsat in the C steam generator, indicating appropriate reverse flow in C loop. Steam flows from the A and B SGs increased, and flow from the C SG decreased.
Steam header pressure decreased, with steam pressure higher in the A and B SGs than in the C SG.
DEFICIENCY: None found.
TURBINE CONTROL MALFUNCTION LER 86-002

SUCCESS: As in the LER, the reactor tripped on steam generator low-low level. Post-trip behavior of all parameters agreed extremely well with LER data.
DEFICIENCY: None found.
125-V AC BUS LOSS LER 84-019

SUCCESS: As in the LER, loss of power to the components and instruments served by bus 1-III resulted in a reactor trip on an indicated (false) loss of the C RC pump.
An audit of the loads listed in the LER showed correct indications of power loss to all but one load.
DEFICIENCY: Boric acid transfer pump 2A could be transferred from slow to fast speed, although the load list for bus 1-III indicated that this transfer should be disabled by bus loss.
The facility licensee was unaware of this problem.
INVERTOR LOSS, HIGH PRESSURE TRIP TURKEY POINT LER 85-017

This scenario was not run due to lack of time.
2.5.3 Emergency Operations

NATURAL CIRCULATION LER 85-019

SUCCESS: Loss of the 14J-4 motor control center (MCC) initiated this transient.
All loads of this MCC which were sample audited (35 out of 83) failed upon loss of the MCC. RC pump bearing temperatures increased as expected after loss of component cooling water flow. After RC pump trip, natural circulation indications developed appropriately. After start of the A RC pump, reverse flow indications developed in B and C loops.
DEFICIENCY: Recirculation valves for the condensate pumps, the heater drain pumps, and the main feedwater pumps were modeled for automatic operation in the simulation facility, whereas in the plant there are also isolation valves which are manually closed or throttled. The facility licensee was unaware of this problem.
STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE

SUCCESS:
During the leak escalation phase, primary system flow balances (charging plus seal injection equals letdown plus leak) were satisfactory for various leak rates.
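The flow balance checked during leak escalation can be sketched as follows; the flow values here are hypothetical, not recorded test data:

```python
def inventory_imbalance(charging, seal_injection, letdown, leak):
    """Steady-state RCS inventory balance: inflow (charging plus
    seal injection) should match outflow (letdown plus tube leak).
    Returns the net imbalance in the same flow units."""
    return (charging + seal_injection) - (letdown + leak)

# Hypothetical flows in gpm: a leak within charging capacity.
err = inventory_imbalance(charging=75.0, seal_injection=8.0,
                          letdown=60.0, leak=23.0)
print(abs(err) <= 1.0)  # balance closes -> True
```

For each leak-rate step, a near-zero imbalance at steady state indicates that the simulator's charging, letdown, and break-flow models are mutually consistent.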
When the scenario was rerun with a large leak followed by opening of the PORV, rapidly increasing pressurizer level indicated establishment of a steam bubble in the reactor vessel head. During decrease of pressure in the RCS, safety injection flow appropriately increased as pressure decreased. After restart of an RC pump, variations of pump motor current indicated possible two-phase flow.
Subsequently, decreasing pressurizer level indicated the expected condensation of the steam bubble.
DEFICIENCY: None found.
3 PHYSICAL FIDELITY / HUMAN FACTORS

3.1 Off-Site Review and Data Collection

As described in the SFEP, the first step for the evaluation of the human factors/physical fidelity was to review the Human Engineering Discrepancies (HEDs) that were generated as a result of the Control Room Design Review (CRDR).
The HEDs were reviewed by the Human Factors Specialist to identify control room data and characteristics which could be of use in evaluating the comparability of the simulation facility to the reference plant.
The HEDs reviewed were those which were submitted to the NRC as part of the Summary Report for the CRDR.
In general, these were abbreviated versions of the original HEDs and did not contain sufficient detail.
As a result, while the HEDs contained some useful information, in many cases it was not complete enough for use in the review.
In these cases the HEDs were flagged for further investigation.
The Summary Report for the CRDR also described the data collection methodology used for collecting control room environmental data.
While in most cases the actual data were not reported, these descriptions did allow the Human Factors Specialist to identify environmental data which could be requested from the facility licensee.
The Summary Report also included the checklist which was used for evaluating the control room during the CRDR.
From this, the Human Factors Specialist was able to identify characteristics of the control room for which an HED may not have been written but which would be of interest for this review.
For example:
1. The types of communications systems used by the control room operators were identified.

2. The fact that color coding schemes were in use for background shading, annunciators, and controls was ascertained.

3. The fact that the auditory signals used in the control room were appropriately coded was indicated.
As part of the CRDR, a task analysis of the operator actions required for mitigating emergency events was conducted.
In the conduct of this analysis, the instruments and controls (I&Cs) required to perform these actions were identified.
The I&Cs were identified from the perspective of "what's needed" in the control room as opposed to "what exists" in the control room.
From this analysis, HEDs were written which addressed the following types of problems:
1. I&Cs which are needed in the control room but do not currently exist.

2. I&Cs which exist in the control room but are not in the proper location.

3. I&Cs which exist in the control room but do not reflect parameters as required (displays) or perform control functions as required (controls).
Since the I&Cs identified in these HEDs (particularly those for the latter two types) were considered to be significant to emergency operations, they make a good sample for the I&C Inventory.
This would be true whether or not any modification is actually made to the reference plant as a result of the HED. For this review, a sampling of I&Cs for each of those types of HEDs was made.
This easily resulted in selecting more than 50 separate I&Cs.
The Summary Report for the CRDR and the HEDs included in it seem to be a good starting point for identifying physical characteristics of the simulation facility to be reviewed.
As a result of this initial review, the following characteristics were identified for further investigation:
1. Drawings of the control room layout were obtained.

2. Data from the control room lighting survey were obtained.

3. The fact that color coding was in use for annunciators, controls, background shading, and indicating lights was determined.

4. Panels which are difficult to see or reach in the control room were identified.

5. The communications systems used in the control room were identified.

6. The alarms and auditory signals audible in the control room were identified.

7. A sample of more than 50 separate I&Cs which could be used for the I&C Inventory was identified.
At this point in the human factors/physical fidelity review, the SFEP indicates that any additional information needed should be requested from the facility licensee.
Due to the abbreviated pilot test schedule, this was done by having the Human Factors Specialist sort through the available data himself with the permission of the facility licensee.
Two observations were made as a result of this exercise.
First, the data themselves were not necessarily complete or well organized.
(In the case of the simulation facility that was the subject of this pilot test, the data were kept by the contractor who had performed the CRDR, not by the facility licensee.)
In some cases the data were collected and not retained.
This was particularly true for those items for which the collected data were found to be acceptable.
For example, if the meaning of colors used on indicating lights was found to be acceptable in accordance with the CRDR criteria, the data collected for color meaning were not retained.
In other cases, poor organization of available data required considerable searching to find what was needed. If these kinds of problems are typical of the industry, they could result in the facility licensee sending incorrect data, or no data at all, in response to the SFET's request.
The second observation was that having the Human Factors Specialist go through the data may result in obtaining the best information available to meet the needs of the SFET.
This process can result in the identification of additional useful information that was not apparent from the HEDs provided in the Summary Report.
In this investigation, for example, while searching through the original HEDs the human factors specialist found some drawings and listings of I&Cs that were not logically laid out.
While the HEDs for these had been included in the Summary Report, the actual drawings and listings had not and there was no indication in the Summary Report that they existed.
The SFEP indicates that at this point the data received from the facility licensee should be reviewed to determine if they are in compliance with the requirements of the Standard.
This step was not conducted for this review because simulation facility data were not yet required to be collected.
The next step in the SFEP was to prepare for the on-site review.
Most of this preparation was completed during the activities discussed above.
The only thing yet to be prepared was the I&C Inventory.
There were two possible methods for selecting I&Cs for the inventory evaluation.
The first was to select the I&Cs associated with the critical parameters included in the performance tests to be conducted, if any.
The second was to select the I&Cs based on the HEDs and from other sources such as those identified as critical by Regulatory Guide 1.97 (Revision 3, May 1983).
Since performance tests were conducted for this review, the I&Cs were selected using the first method.
It should be noted that the SFEP recognizes the possibility of conducting a human factors / physical fidelity review even if performance tests are not required.
In such cases, the second method stated above would be used.
95 APPENDIX C
Once the performance tests were developed and the critical parameters identified, 100 I&Cs were selected for evaluation. This was done by the Human Factors Specialist and the reference plant Operations Specialist. The Operations Specialist identified all of the controls, displays, and annunciators associated with each of the critical parameters.
The Human Factors Specialist then selected a sample of 100 I&Cs from those identified.
An effort was made to select those which were used most in monitoring and controlling the parameters. An effort was also made to maintain the same relative proportions of each type of I&C in the sample as there were in the original set identified.
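The proportional selection described above can be sketched as follows. This is an illustrative sketch only: the type counts, item identifiers, and function name are hypothetical and do not reflect the actual North Anna inventory.

```python
import random

def proportional_sample(inventory, sample_size=100, seed=0):
    """Select a sample that keeps each I&C type's share of the full set.

    Simplification: quotas are rounded independently, so totals other than
    sample_size are possible for some inputs (not for the data below)."""
    rng = random.Random(seed)
    total = sum(len(items) for items in inventory.values())
    sample = []
    for ic_type, items in inventory.items():
        quota = round(sample_size * len(items) / total)  # nearest whole I&C
        sample.extend(rng.sample(items, min(quota, len(items))))
    return sample

# Hypothetical counts per I&C type -- not the actual North Anna set.
inventory = {
    "controls":     ["C-%d" % i for i in range(120)],
    "displays":     ["D-%d" % i for i in range(200)],
    "annunciators": ["A-%d" % i for i in range(80)],
}
sample = proportional_sample(inventory)
```

With these counts the quotas work out to 30 controls, 50 displays, and 20 annunciators, preserving the 120:200:80 proportions of the full set.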
At this point, the preparation for the on-site review was complete.
3.2 On-Site Review and Data Collection
The on-site review for human factors/physical fidelity was conducted in accordance with the methodology described in the SFEP.
Data were first collected at the simulation facility and then verified in the reference plant.
Any discrepancies from the Standard that were identified were then discussed with the facility licensee to determine if they had been addressed. They were then classified as being: 1) in the process of modification, 2) determined, by the facility licensee, to have no impact on training, or 3) not addressed by the facility licensee.
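The three-way classification above can be represented as a simple enumeration. The constant names below are illustrative only; the report numbers the categories but does not name them.

```python
from enum import Enum

class DiscrepancyStatus(Enum):
    # Names are illustrative; the report only numbers the categories 1-3.
    IN_PROCESS_OF_MODIFICATION = 1   # 1) modification under way
    NO_IMPACT_ON_TRAINING = 2        # 2) licensee judged no training impact
    NOT_ADDRESSED = 3                # 3) not addressed by the licensee
```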
The only variation from the plan was that some of the information obtained for the evaluation was volunteered by the control room operators while the ISFET was in the control room conducting the review. Specifically, there were a variety of control room noises that could be heard under certain conditions in the reference plant but were not simulated in the simulation facility. While this kind of information may not always be available to a Simulation Facility Evaluation Team (SFET), it should be incorporated into the review when it is.
3.3 Results of the Review

Once any discrepancies had been identified and discussed with the facility licensee, the Human Factors Specialist and the License Examiner evaluated them to determine their impact on the conduct of a licensing examination. The following items were found to be discrepant from the Standard and had not been addressed by the facility licensee. The facility licensee indicated that these discrepancies would be resolved.
1. Discrepancy - Out of 36 annunciators which are backlit red in the control room to indicate immediate operator action, one was found not to be backlit red in the simulation facility (panel 21B, annunciator H1, "PRZ RELIEF TK HI TEMP").
Assessment - This could have an impact on the conduct of a licensing examination since it is a legitimate cue which is not being presented to the operator.
2. Discrepancy - The recorder which records T-HOT/T-COLD for Loop 3 had a range of 0 - 700 in increments of 10 in the reference plant and a range of 0 - 5 in increments of 0.1 in the simulation facility.
Assessment - The other two loops have the proper range displayed, so this would have a minimal impact on the conduct of a licensing examination.
3. Discrepancy - The selector switches for the red and blue pens for the NIS recorder (NR-45) indicate that the Delta Flux for channels I and III may be recorded with the red pen and channels II and IV with the blue pen. In the simulation facility this is interchanged, so that the selector switch for the red pen indicates channels II and IV and the selector switch for the blue pen indicates channels I and III.
Assessment - There is redundant indication nearby so that there would be minimal impact on the conduct of a licensing examination.
4. Discrepancy - Of the 50 meters sampled, 23 had zone banding, and 18 of those were found to be discrepant.
The discrepancies consisted mainly of missing banding (usually the low end of the scale) and mismatches for the starting points of the banding.
These starting point mismatches seemed to be systematic.
The zone banding in the simulation facility started right at a setpoint, while the zone banding in the reference plant started a few divisions before the setpoint.
The facility licensee indicated that this may be a result of a recent modification to the plant which may have changed the setpoints.
Assessment - This could have an impact on the conduct of an examination, since the zone banding allows the operator to easily scan the boards to determine if a parameter is out of range.
5. Discrepancy - Out of the 100 I&Cs sampled, 5 had missing labels or labels with misspellings.
Assessment - The impact of these missing or misspelled labels on a licensing exam would be minimal.
The following items were found to be discrepant from the Standard, but are being addressed by the facility licensee.
1. The lighting level and distribution for both the normal and emergency lighting are not the same in the reference plant and the simulation facility.
The glare reduction features used in the reference plant are also absent from the simulation facility.
The facility licensee has determined through a training value assessment that these differences have minimal impact.
2. The annunciator alarm for the back panels is not simulated. The facility licensee is modifying the simulation facility to correct this.
3. The following control room noises are not simulated:

Currently being assessed:
Feedwater reg valves for Unit 2
Feedwater lifter relief valves
PORVs
Popped-open safety valve
MSR relief valve
Steam break

Modification in progress:
Air damper change on SI

4. Not all of the common panels/instrumentation are simulated. The facility licensee has determined through a training value assessment that these differences have minimal impact.
5. Not all of the telephone communications systems are simulated. The facility licensee has a modification in progress to correct this.
6. The saturation margin meters are not at all alike. The facility licensee has a modification in progress to correct this.
7. Annunciator 21C A1 "VCT HI-LO LEVEL" has been broken into two annunciators in the plant. The facility licensee has a modification in progress to correct this.
8. The four "PRZ PORV" meters are in the simulation facility but no longer exist in the plant. The facility licensee has a modification in progress to correct this.
9. The pressurizer power relief valve controls have two sets of red/green lights in the plant and only one set in the simulation facility. The facility licensee has a modification in progress to correct this.
4 CONTROL CAPABILITIES

Testing of the control capabilities was included as part of the Performance Testing.
The simulation facility met all of the requirements of the Standard for this area.
5 DESIGN, UPDATING, MODIFICATION, AND TESTING

This area was not included in the test of the SFEP methodology. The reasons for this are primarily logistical. Since the Rule had not yet been implemented, the actual simulation facility performance tests that will be required by the Rule have not been run by the facility licensee, and the simulation facility configuration management program has not formally begun.
6 CONCLUSIONS, OBSERVATIONS, AND RECOMMENDATIONS
The result of the pilot test conducted at the North Anna simulation facility is that the methodology given in the draft SFEP is soundly based and workable.
Although the pilot test did not follow the draft SFEP exactly, all of the fundamental methodologies included in it were tested.
These methods proved to be workable and resulted in the identification of features and behaviors of the simulation facility that were not in conformance with the proposed regulation.
The following are the observations and recommendations for possible changes to the SFEP based on the results of the pilot test of the methodology.
These are made based on the experiences of the ISFET and the results of the test.
6.1 The SFET

The more experience each of the members of the team has with the reference plant, the better.
This is especially true for the license examiner and the operations specialists.
It is very desirable to have at least one member of the team with a strong background in nuclear power plant simulator testing / evaluation.
Having someone with a strong reference plant operations background as a member of the team is a very good idea.
It helps to make for a better test and makes the test development, conduct, and evaluation much more efficient.
If a reference plant operations expert is not a member of the SFET, one should be asked to review the test procedures to verify their appropriateness, verify plant-specific information included, resolve uncertainties, and supply plant-specific information which the SFET could not supply.
The peer evaluator seems to be a good idea.
The option of having such an individual as a "non-voting" member of the team is encouraged.
6.2 Performance Testing

The guidance in the SFEP for selection of 10 operations for performance testing during 2 days of simulator use seems appropriate.
Test operations should be chosen in the ratio of 1-3-1 of normal, abnormal, and emergency events.
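As a minimal illustration of this guidance, and assuming the 1-3-1 ratio divides the total evenly, the split of 10 test operations can be computed as follows (the function and variable names are hypothetical):

```python
def allocate(total, ratio):
    """Split `total` test operations according to an integer ratio."""
    parts = sum(ratio)
    assert total % parts == 0, "this sketch assumes an exact split"
    return [total * r // parts for r in ratio]

# 10 operations split 1-3-1 across normal, abnormal, and emergency events:
normal, abnormal, emergency = allocate(10, (1, 3, 1))
```

For 10 operations this yields 2 normal, 6 abnormal, and 2 emergency events.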
Wherever possible, performance tests should be developed to utilize existing plant data for evaluation. For tests developed without supporting plant data, the level of development of test procedures should agree with the ability to predict responses based on knowledge of plant design and system interactions.
The level of detail provided in the performance test procedures used here (see Appendix A) seemed adequate.
In general, the more detail incorporated in test procedures the better, as long as the detail is based on known plant responses.
Tests based on the reference plant's EOPs should be included.
Tests based on "similar plant" LERs can be difficult to use unless the plants are very similar.
Even for similar plants it may be necessary, in some cases, to determine if the systems cited in the LER are similar.
The use of sister plants for LER tests should be explored.
Surveillance procedures seem to make good tests.
They are fairly straightforward, and most of the development for them is complete.
They also typically exercise many aspects of an entire system as well as some of its relationships to other systems.
The performance tests should be developed to a level at which they may be reproduced by another SFET if necessary.
They should, however, be in a form which permits easy understanding of what was done (i.e., neat, organized, references included, etc.).
Securing a copy of the actual performance test conducted by the facility licensee gives a good indication of how thorough the testing is and how good the simulation facility is.
Simply repeating one of the performance tests which the facility licensee claims to have conducted can be a good test in some cases (particularly if the test does not look very sound on paper or if license examiners have reported problems with the operations involved).
Including this type of test will help to ensure that the facility licensee's testing program is meeting the requirements of the Rule and the Standard.
Considerable evaluative information can be obtained during running of the performance tests.
When expected responses are clearly known beforehand, and test procedures include verification steps for expected observations, the general quality of simulator performance is apparent by the conclusion of the test.
The need for off-site evaluation of performance test data will depend primarily on the extent to which performance tests can be developed before they are run, and on the results which are obtained in the tests.
If all observations are as expected, off-site evaluation may not be necessary.
In other cases, further data evaluation may be required, the results declared unsatisfactory, or the facility may be asked to provide further analyses and explanation of discrepancies.
Once an on-site audit is to be conducted, the SFET should work closely with the facility licensee in the development of the performance tests.
This will help to ensure that the tests are fair and reasonable.
This will also help avoid misunderstandings about the methodology used when the results of the test are being discussed.
A facility operations expert should be present as an observer during testing and on-site data evaluation.
Questions arise during test performance and evaluation, many concerning unforeseen details of plant response, which, if answered at the time, may remove confusion, refocus observations, and enhance the acquisition of needed information and data.
It is not necessary that the operating crew for the performance tests be made up of licensed personnel.
A decision-tree type of analysis should be included in the SFEP for the evaluation of the performance tests. For example, if a parameter for a transient operation is being evaluated, the decision-tree process may be as follows:

1. Violates physical laws?
   Yes - fail
   No - continue

2. Change in the proper direction?
   No - fail
   Yes - continue

3. Proper relationships with other parameters?
   No - fail
   Yes - pass

etc.
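A minimal sketch of such a decision tree, walking the three questions above in order (the function and argument names are hypothetical):

```python
def evaluate_parameter(violates_physics, proper_direction, proper_relationships):
    """Walk the three questions in order; any failing branch ends the check."""
    if violates_physics:           # 1. Violates physical laws?
        return "fail"
    if not proper_direction:       # 2. Change in the proper direction?
        return "fail"
    if not proper_relationships:   # 3. Proper relationships with other parameters?
        return "fail"
    return "pass"
```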
The 2%-10% criteria given for normal operations in the SFEP should be taken out.
According to the Standard, they apply only to steady-state operations.
The phrase "key parameters" should be eliminated, and only "critical" and "non-critical" should be used.
6.3 Physical Fidelity/Human Factors
Examining the original HEDs and raw data is a good idea.
When identifying I&Cs from the performance tests, it may be sufficient just to identify all of the I&Cs associated with the critical parameters.
Developing the performance tests to a fine level of detail solely to identify I&Cs may not be practical.
An instant camera would be an excellent way of collecting data, and for some aspects of the data collection it is almost necessary.
Notes should be made on the backs of the photographs so that their content and meaning are apparent.
It may be helpful to reverify any problems identified as part of this portion of the review, since it involves a great deal of detailed data collection which can be subject to error.
6.4 Design, Updating, Modification, and Testing

The focus for this review should be shifted from how things are done to what is done and when.
The off-site review should include a request for data on reference-plant modifications made during a given period, together with data for any resulting changes to the simulation facility or, alternatively, the facility licensee's reasons for not changing the simulation facility.
The time schedules for these changes/decisions should be included as well.
6.5 General
The opportunity for certain members of the SFET to visit the facility licensee in advance of the on-site review for the purposes of establishing a rapport, examining available data, and working out some of the logistics of the review seemed to be favored by the members of the SFET and the facility licensee.
While this pilot test was completed, by and large, in accordance with the schedule described in the draft SFEP, this required considerable effort and long hours on the part of the ISFET and the facility licensee's staff.
If a more ambitious set of performance tests is to be included in future simulation facility evaluations, consideration should be given to making arrangements for a longer stay on site.
Greater emphasis should be placed on the use of reference-plant procedures in the simulation facility.
It may be better to refer to an "on-site" and "off-site" review instead of the current phases.
7 SUPPORTING DOCUMENTATION

This section contains copies of the Performance Test Procedures for the North Anna simulation facility that were used in testing the SFEP methodology.
Except for the surveillance procedure (where the plant procedure was used) and the Turkey Point LER (which was not run), the procedures for each of the performance tests are given in the following order.
NORMAL OPERATIONS
1. SURVEILLANCE OF STEAM DRIVEN AUXILIARY FEEDWATER PUMP

ABNORMAL OPERATIONS
2. 50% LOAD REJECTION - 200% PER MINUTE
3. MSIV CLOSURE - 100% POWER - LER 86-006
4. DROPPED CONTROL RODS - 16% POWER - LER 85-017
5. RCP TRIP - 28% POWER
6. TURBINE CONTROL MALFUNCTION - LER 86-002
7. 125-V AC LOSS, STUCK MFW VALVES - LER 84-019
8. INVERTOR LOSS, HIGH PRESSURE TRIP - TURKEY POINT LER 85-017

EMERGENCY OPERATIONS
9. NATURAL CIRCULATION - LER 85-019
10. STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE
2. 50% LOAD REJECTION - 200% PER MINUTE

I. Initial Condition: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 1.

Digital: Ensure data are recorded as follows:
Every 5 seconds for first minute.
Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
- PZR heater status
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p
- Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
- RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
- Turbine Bypass Valve Status

- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
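The two-rate recording scheme above (every 5 seconds for the first minute, then every 15 seconds until completion) can be sketched as a timestamp generator. The function name, parameters, and the 180-second run length are illustrative assumptions, not values from the procedure.

```python
def sample_times(duration_s, fast_window_s=60, fast_dt=5, slow_dt=15):
    """Recording timestamps: every fast_dt seconds through the first
    fast_window_s seconds, then every slow_dt seconds until completion."""
    times = list(range(0, min(fast_window_s, duration_s) + 1, fast_dt))
    t = fast_window_s + slow_dt
    while t <= duration_s:
        times.append(t)
        t += slow_dt
    return times

schedule = sample_times(180)  # a hypothetical three-minute run
```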
III. Procedure

1. Position data taker at turbine bypass valves.

   A. Note the elapsed time and Tave - Tref mismatch for each turbine bypass valve operation.

2. Position data taker at pressurizer heater and spray control.

   A. Note pressurizer level deviation or pressure at onset and termination of heater operation.

   B. Note pressurizer pressure at onset and termination of spray.

3. Panel operator inserts 200%/min load reduction to 50% power. Annunciators are not acknowledged, but can be silenced. No further operator action is taken.

Observe:

Turbine control rapidly throttles turbine valves.
Rx does not trip
Turbine first stage impulse pressure drops
SG pressure increases (until steam dumps open)
SG level drops due to pressure increase
Steam Flow/Feed Flow mismatch causes FW flow decrease
Tave-Tref error causes steam dump valves to trip open (number appropriate to power change)
Control rod runback initiated.
Rods move in at maximum rate (Tave-Tref and Qn-Qturb program causes max speed) (rod motion is sequenced by auto control program)
Increase of Tave due to steam dump response delayed.
RCS pressure initially increases (PZR heater decrease, possible spray)
PZR level increase due to swell (PZR heaters on if 5% increase)
Charging flow decrease
Reactor power decreases with rod insertion
Decreasing nuclear power, Tave
Rod insertion speed decreases as Tref approached
Steam dump valves modulate closed as Tref approached
SG pressure decreases towards normal
Feedwater flow decreases with steam flow
PZR level changes with Tave decrease
PZR level decrease and potential overshoot (RCS shrink with cooling) (Charging rate establishes level via program)
VCT level changes due to charging
IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.

3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.

2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
3. MSIV CLOSURE - 100% POWER - LER 86-006

I. Initial Condition: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 3.

Digital: Ensure data are recorded as follows:
Every 5 seconds for first minute.
Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p
Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
- RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C

- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
III. Procedure

1. Simulator instructor closes B MSIV. Panel operators carry out expected response (except for acknowledging annunciators).

Observe:

SG B pressure spikes
SI actuation due to high steam flow coincident with low steam line pressure
Reactor trip
Turbine trip
RCS pressure, temperature decrease per data
SI termination, plant stabilizes

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.

3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.

2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
4. DROPPED CONTROL RODS - 16% POWER - LER 85-017

I. Initial Condition per LER as follows:

Reactor power 16%
Rod control (manual)
Main FW control (closed)
Bypass FW control (man)
Main feed pumps: A
Condensate pumps: A,B
Steam Dump (Press mode)

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 4.

Digital: Ensure data are recorded as follows:
Every 5 seconds until 2 minutes after trip.
Every 15 seconds until completion.

PZR pressure
N-45 (Nuclear Power)
- N-41,42,43,44
D rod bank position
- IRPI for dropped rods
- Rod bottom lights for dropped rods
Turbine first stage p
Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
- RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
- SG PORV status - A,B,C
- SG safety status - A,B,C

- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
III. Procedure

1. Position data taker at power range meters.

   A. Record upper and lower detector current on each power range channel.

   B. Record upper and lower detector current on each power range channel after rod drop, just prior to manual reactor trip.

2. Position data taker at rod position indication.

   A. Verify IRPI for dropped group reads "0".

   B. Verify rod bottom lights lit for dropped group.

3. Simulator instructor drops Group 1 of Bank D. No operator action is taken.

Observe:

Rod position indications
NI power reduction
Absence of neg. flux rate trip (rods are peripheral, between excore NIs)
Flux distortion per NIs (imbalance and QPT changes) (no data exist in LERs)

4. Once plant has stabilized and data in Sections III.1 and III.2 collected, panel operator manually trips the reactor and carries out expected actions.

Observe:

Reactor trip response
IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.

3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.

2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
5. RCP TRIP - 28% POWER

I. Initial Condition: 28% power
Rod control in manual
Turbine on line

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 5.

Digital: Ensure data are recorded as follows:
Every 5 seconds for first minute.
Every 15 seconds until completion.

PZR pressure
- Letdown flow
PZR spray line T
N-45 (Nuclear Power)
- N-41,42,43,44
- D rod bank position
- Turbine first stage p
Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
Main Steam header p
- Turbine throttle valve position
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
- SG PORV status - A,B,C
- SG safety status - A,B,C

- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
III. Procedure

1. Position data taker at turbine control panel.

   A. Verify turbine throttle valves open slightly during transient.

2. Position data taker at power range meters.

   A. Record upper and lower detector current on each power range channel.

   B. Record upper and lower detector current on each power range channel at conclusion of test.

3. Panel operator trips C loop RC pump.

Observe:

C loop flow decreases to 10% in 30 sec, further decrease more slowly, eventually reverses.
RC flows increase in A and B loops
C loop Th drops below Tc (reverse flow)
Reduced steaming from C SG
C feedwater flow reduced by mismatch
Increased steaming from A & B SGs
Feedwater flows increased by mismatch
C SG pressure drops to Psat for Tc
SG header pressure drops to C SG pressure (no flow)
A & B SG ps higher than C (flow resistance)
Turbine valves open further (lower header pressure)
Tave increases initially, then recovers
Initial increase due to reduced heat transfer
NI power reduction due to MTC
Tave recovers since turbine power constant
A and B loop delta T increases (same heat transfer necessary)
Core power tilt due to temp differences
IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.

3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.

2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
6. TURBINE CONTROL MALFUNCTION - LER 86-002

I. Initial Condition: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 6.

Digital: Ensure data are recorded as follows:
Every 5 seconds for first minute.
Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
- PZR heater status
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p
- Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
- RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
- Turbine bypass valve status
- AFW pump status

- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
III. Procedure

1. Position data taker at turbine bypass valves.

   A. Note time between initiation of malfunction and turbine bypass valve operation.

   B. Note turbine bypass valve closure in controlled fashion.

   C. Note time between turbine bypass valve opening and full closure.

   D. Note Tave - Tref mismatch at time of turbine bypass valve full closure.

2. Position data taker at pressurizer heaters.

   A. Note pressurizer level or pressure deviation at onset of heater actuation.

   B. Note pressurizer level or pressure deviation at end of heater actuation.

3. Simulator instructor initiates rapid turbine governor valve closure. Operator actions limited to those necessary to fulfill immediate actions.

Observe:

SG pressure increases
SG level drops due to bubble collapse
Reactor trip due to SG lo-lo level
Turbine trip due to reactor trip
Steam dumps open
SG PORVs open
AFW pumps start (due to low SG level)
Tave increases
PZR level increase
PZR pressure increase
PZR spray valve actuates
Tave decreases toward no-load value
Steam dump valves modulate closed
SG PORVs close
PZR level decreases toward no-load value
PZR pressure decreases, heaters energize, subsequently recovers
SG level recovers to no-load value
SG pressure recovers to no-load value
IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.

3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.

2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
7. 125-V AC BUS LOSS, STUCK MFW VALVES - LER 84-019

I. Initial Conditions: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 8.

Digital: Ensure data are recorded as follows:
Every 5 seconds for first minute.
Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p
- Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
- SG safety status - A,B,C
- Loss of power to various components

- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
s' k' III.
Procedure x
',.,s,,
1. Simulator instructor deenergizes vital bus 1-III (inverter failure).
Observe:
Rx trip due to sensed C RCP loss
(C RCP does not trip)
Feedwater valves for B SG fail closed
main (FCV-1488)
bypass (FCV-1489)
Level indication for B SG fails low
wide range LI-1487
Aux feedwater pump for B SG fails to auto start
(1-FW-P-3B)
B SG level drops below narrow range indication (no wide range ind. available)
All circulating water pumps trip (water boxes' vacuum brkrs deenergized)
Various containment isolation valves trip
(incl. component cooling to RCPs)
Power range detector N43 deenergized
26 incore thermocouples lose power
SSPS channel III input relays deenergize
SSPS train B output relays deenergize
Audit remaining items per attached load list.
APPENDIX C 122
IV. Data Collection
1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.
V. Preliminary Evaluation
1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
APPENDIX C 123
9. NATURAL CIRCULATION - LER 85-019
I. Initial Condition: 100% power, middle of life
II. Data Collection Method
Identified Critical Parameters:
Analog: Mark the recorders T0 for Test 9.
Digital: Ensure data are recorded as follows:
Every 15 seconds until manual trip.
Every 5 seconds for one subsequent minute.
Every 60 seconds until RCP start.
Every 15 seconds until completion.
PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
- RCS pump radial bearing temperature
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
- Spotcheck of components
- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
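The variable-rate digital recording cadence for this test (every 15 s until the manual trip, every 5 s for one subsequent minute, every 60 s until RCP start, then every 15 s until completion) can be sketched as a simple timestamp generator. This is purely illustrative; the event times in the usage line are arbitrary assumptions, not values taken from this procedure:

```python
def sample_times(trip_t, rcp_start_t, end_t):
    """Return recording timestamps (seconds) for the Test 9 cadence:
    every 15 s until manual trip, every 5 s for one subsequent minute,
    every 60 s until RCP start, then every 15 s until completion."""
    segments = [
        (0, trip_t, 15),                 # steady state until manual trip
        (trip_t, trip_t + 60, 5),        # one minute of fast sampling
        (trip_t + 60, rcp_start_t, 60),  # slow sampling until RCP start
        (rcp_start_t, end_t, 15),        # back to 15 s until completion
    ]
    times = []
    for start, stop, step in segments:
        t = start
        while t < stop:
            times.append(t)
            t += step
    times.append(end_t)  # capture the final state
    return times

# Hypothetical event times: trip at 300 s, RCP restart 30 min later, end at 2400 s
ts = sample_times(300, 1800, 2400)
```

Segment boundaries are half-open, so each event time is recorded once, at the faster of the two adjacent rates.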
APPENDIX C 124
III. Procedure
1. Simulator instructor fails 14J-4 (480-V AC) open, then puts simulator in "freeze". Verify the following:
A. CCW valves (106A,B,C) to RCP closed
B. Spotcheck of other equipment deenergization from attached load list.
2. Resume simulation. Panel operators commence power reduction at 4%/min. Freeze simulator at 87% power. Verify the following:
A. Elevated RCP bearing temperature (should be between 195F and 216F).
B. Increasing Tave, PZR press & level.
3. Resume simulation. Panel operators perform manual reactor trip immediately, perform expected actions (secure RCPs about two minutes later upon entry into ES-0.1).
Observe:
Natural circulation indications develop and stabilize
RC flows decrease to 10% in 30 sec, further decrease more slowly
Tc decreases slightly, approaching SG Tsat
Th increases to 30-50F above Tc, stabilizes
Incore thermocouples track Th (560-570F)
SG pressure stable, with steaming indicated
4. Restart RCP A 30 min after Rx trip, after component cooling reestablished.
Observe:
Th, Tc converge (all 3 loops)
SG A pressure increase, no increase on SG B,C
Possible PZR pressure decrease
125 APPENDIX C
IV. Data Collection
1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.
V. Preliminary Evaluation
1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
APPENDIX C 126
10. STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE
I. Initial Condition: 100% power
Intermediate range channel undercompensated
II. Data Collection Method
Identified Critical Parameters:
Analog: Mark the recorders T0 for Test 10.
Digital: Ensure data are recorded as follows:
Every 5 seconds for first minute after each modification.
Every 15 seconds for remaining time until completion.
PZR pressure
PZR PORV status
PZR level
VCT level
- RVLIS indication
Charging flow
- Charging flow through SI flowpath
- Accumulator discharge
Letdown flow
- PZR heater status
PZR spray line T
N-45 (Nuclear Power)
- Intermediate range response
- Source range response
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
- RCP current
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
- Secondary radiation alarms
- SI actuation (pump starts, valve repositions)
- Denotes data not logged. Ensure data recording mechanism is in place.
- Denotes data not expected to be critical for this test.
127 APPENDIX C
III. Procedure
1. Simulator instructor initiates 20-gpm leak in B SG. No operator action.
Observe:
PZR level/pressure control
CVCS auto control increases charging
VCT tank level decreases
Auto VCT makeup
Secondary radiation alarms
2. Simulator instructor increases leak rate to 50 gpm. Panel operators start additional charging pumps.
3. Simulator instructor increases leak rate to 100 gpm. No operator action other than to reestablish PZR heaters and to swap charging suction to RWST (if necessary).
Observe:
PZR level/pressure decrease
PZR heaters on, then deenergize on low level
Letdown isolates
Charging flow to maximum, then increases as RCS pressure decreases
PZR level/pressure "stabilize"
PZR level increases after letdown isolates
PZR pressure increases once heaters reenergize
PZR level decreases as pressure increases
PZR heaters deenergize on low PZR level
Cycle should continue
4. Simulator instructor increases leak rate to 800 gpm. Panel operators carry out expected response through EP-3, Step 10. No further operator action. Audit SI/CIA actuation during plant stabilization.
Observe:
RPS low pressure trip (1875 psig)
Alarms and annunciators
Automatic controller response to trip
APPENDIX C 128
SI initiation (1760 psig)
Various pumps start
Valves reposition
Phase A containment isolation
RCP trip criterion met (SI flow indicated and RCS pressure < 1230 psig)
RCS flow coast down
Interlocks start RCP lift and lube oil pumps
Natural circulation indications develop
B SG level increases (isolated)
B SG goes solid
B SG pressure rises to SG PORV setpoint
B SG PORV, possible safety valve actuation
RCS pressure/level increase
RCS pressure matches B SG pressure
RCS level increases as PZR bubble condenses (inhibit heater actuation?)
5. Panel operator opens PZR PORV.
Observe:
Rapid PZR level increase to 100%
RCS pressure decrease
Possible accumulator discharge; RCS voiding indications
RVLIS indication (if operable)
Loss of subcooling
Increased charging flow due to lower pressure
Possible secondary-to-primary flow
6. Panel operators close PZR PORV, energize PZR heaters, start an RCP (start criteria do not have to be met), and stop charging pumps.
Observe:
RCP start indications
Current, RCS flow
Th-Tc equalize
129 APPENDIX C
RCS pressure recovers, stabilizes
Indications of bubble collapse
Eventual return of indicated PZR level
7. Verify SR detectors not energized due to IR range undercompensation. Verify that panel operator can manually energize.
IV. Data Collection
1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II are collected, marked, and attached to this procedure.
V. Preliminary Evaluation
1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
APPENDIX C 130
GLOSSARY
Acceptance test procedures: Those procedures used to ensure that the simulation facility produced by the simulator vendor meets the specifications of the facility licensee. Acceptance testing is done before the simulator is used for training.
Baseline data: Data used to evaluate the simulation facility against the reference plant.
Critical parameters: 1) Those parameters that require direct and continuous observation to operate the power plant under manual control. 2) Input parameters to plant safety systems.
Discrepancies: Any aspect of the simulation facility's physical configuration, operational performance, or design control which deviates from the requirements established by ANSI/ANS 3.5, 1985, as endorsed by Regulatory Guide 1.149.
HED: Human Engineering Discrepancy.
Inspection: An audit or review of the simulation facility's documentation, hardware, or performance for the purposes of determining its conformance with the requirements of 10 CFR 55.
I&Cs: Instruments and Controls.
LER: Licensee Event Report.
Misoperation: The intentional performance of an incorrect response by an operator in a simulation facility for the purpose of an NRC simulation facility inspection when used as part of a test of that simulation facility.
NRC Form 474: The form submitted to the NRC by the facility licensee for the certification, decertification, and for any change to a simulation facility performance testing plan after the initial submittal of such a plan.
NRC Staff: The individuals who conduct simulation facility inspections for the NRC, and report the results to the NRC.
Perfect operator response: The actions required of an operator in conducting the performance tests are assumed to be completed without operator error for the purposes of developing the performance tests. The staff and the operating crew will also try to achieve this during the actual conduct of the performance tests.
The Rule: 10 CFR Part 55.
131
SFEP: Simulation Facility Evaluation Procedure.
The Standard: ANSI/ANS 3.5, 1985.
Startup test procedures: The procedures executed by the reference plant for the purposes of reactor and secondary system startup and testing as part of the initial plant licensing process or after significant plant modifications. These may also be referred to as "startup test programs."
Surveillance testing: Reference plant procedures for the periodic testing of system functions.
132
INDEX
ANS 3.5  See The Standard
Ambient environment
    data collection 33
    evaluation criteria 38
    off-site review 15
    on-site review 26
Assessment of findings 43
Baseline data
    evaluation 41
    for performance testing 23
Circumstances which may lead to an NRC inspection 5
Cognizant individuals 12
Control capabilities
    data collection 33
    evaluation criteria 39
    in performance testing 23
    off-site review 16
    on-site review 26
    pilot test 100
    purpose 4
    scope 4
Design, updating, modification, and testing
    data collection 33
    evaluation criteria 41
    off-site review 17
    on-site review 27
    pilot test 101
    purpose 5
    scope 5
Discrepancies  See Known discrepancies
Emergency operating procedures  See EOPs
EOPs
    reasons for use 3
    use in selecting operations 11
Evaluation criteria 34
Examiner standards
    guidance in selecting operations 10
    reasons for use 2
Facility licensee input 6
Human factors  See Physical fidelity/human factors
Human Factors Specialist
    responsibilities 7
    role in performance testing 31
133
Instrument and control configuration
    data collection 33
    evaluation criteria 38
    off-site review 15
    on-site review 25
Known discrepancies 42
Lead staff member
    responsibilities 7
    role in performance testing 30
LERs
    reasons for use 3
    use in selecting operations 11
License Examiner
    responsibilities 7
    role in performance testing 30
Licensee event reports  See LERs
Minimizing the facility licensee burden 19
NRC Form 474
    guidance in selecting operations 9
    review of 9
Observer
    responsibilities 7
Off-site review
    conduct 8
    data to be requested 18
    evaluation 18
On-site review
    conduct 20
    course of events 27
    determination of conduct 15
    facility licensee notification 19
    preparation 20
    preparation products 27
Operating crew briefing 29
Operating procedures
    use in performance test development 22
    use in selecting operations 11
Operations Specialist
    responsibilities 7
    role in performance testing 31
Panel simulation
    evaluation criteria 37
    data collection 32
    off-site review 15
134
    on-site review 25
Perfect operator response 21
Performance testing
    conduct 31
    data collection 23, 29
    data to be requested for the off-site review 13
    development 21
    evaluation criteria 34
    level of detail 21
    off-site review 8
    on-site review 20
    pilot test 76
    purpose 2
    schedule 24
    scope 2
    selecting operations
        NRC Form 474 guidance 9
        examiner standards guidance 10
        operating procedure guidance 11
        plant operating history guidance 11
        summary 12
        type and number 10
        the Standard guidance 8
Physical fidelity/human factors
    data collection 32
    evaluation criteria 37
    off-site review 14
    on-site review 24
    pilot test 93
    purpose 4
    scope 4
Pilot test 65
    assumptions 69
    control capabilities 100
    design, updating, modification, and testing 101
    deviations from the draft plan 70
    evaluation team 71
    evaluation team member roles 86
    location 69
    performance testing
        conduct 85
        development 81
        evaluation 88
        results 89
        test selection
            common transients 79
            generic events 78
            initial LER events 76
            similar plant LERs 79
        tests developed 85
135
    physical fidelity/human factors
        off-site review 93
        on-site review 96
        review results 96
    recommendations
        design, updating, modification, and testing 104
        evaluation team 101
        general 104
        performance testing 101
        physical fidelity/human factors 104
        supporting documentation 106
Plant operating history
    guidance for selecting operations 11
Review phases 5
Similar-plant LERs
    use in selecting operations 11
Simulation facility operator briefing 29
The Standard
    for defining the scope of performance testing 2
    guidance for selecting operations 8
136
NRC FORM 335  U.S. NUCLEAR REGULATORY COMMISSION
BIBLIOGRAPHIC DATA SHEET
TITLE AND SUBTITLE: Evaluation Procedure for Simulation Facilities Certified Under 10 CFR 55, Final Report
REPORT NUMBER: NUREG-1258
AUTHORS: J. Wachtel, C. Plott, K. R. Laughery, B. F. Gore
DATE REPORT COMPLETED: December 1987
DATE REPORT ISSUED: December 1987
PERFORMING ORGANIZATION: Division of Licensee Performance and Quality Evaluation, Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC 20555
SPONSORING ORGANIZATION: Division of Licensee Performance and Quality Evaluation, Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC 20555
TYPE OF REPORT: Final Technical
PERIOD COVERED: 09/85 - 12/87
ABSTRACT: This document describes the procedure to be followed by NRC for the inspection of simulation facilities certified under 10 CFR 55. Inspections are divided into four areas based on the types of evaluations conducted: 1) performance testing; 2) physical fidelity/human factors; 3) control capabilities; and 4) design, updating, modification, and testing. NRC staff representing several disciplines including license examiners, operations specialists, and human factors experts, under direction of a team leader, will perform these inspections.
A simulation facility inspection may include on-site and off-site phases. The off-site phase will consist of an examination of simulation facility documentation and an identification of those operations and procedures which may be considered for use in performance testing during the on-site phase. In the on-site review, the staff will work closely with the facility licensee to conduct a sound and fair inspection and to evaluate the results of those tests that are conducted.
Inspection findings will be based on staff judgment of the simulation facility's compliance with 10 CFR 55.45. Findings may range from "no adverse impact on the conduct of operating tests" through degrees of "adverse impact requiring correction" to "adverse impacts so serious that the simulation facility not be used in the conduct of operating tests until the discrepancies are corrected and the simulation facility is recertified to the NRC."
KEY WORDS/DESCRIPTORS: simulation facility, plant-referenced simulator, certification, performance tests, inspection, license examination, operating test
AVAILABILITY STATEMENT: Unlimited
SECURITY CLASSIFICATION: Unclassified