ML20212R284

Evaluation Procedure for Simulation Facilities Certified Under 10 CFR 55, Draft Report
ML20212R284
Person / Time
Issue date: 03/31/1987
From: Laughery K, Plott C, Wachtel J
Office of Nuclear Reactor Regulation
To:
References
NUREG-1258, NUREG-1258-DRFT, NUDOCS 8704270073
Download: ML20212R284 (141)


Text

NUREG-1258

Evaluation Procedure for Simulation Facilities Certified under 10 CFR 55, Draft Report

U.S. Nuclear Regulatory Commission Office of Nuclear Reactor Regulation


K. R. Laughery, C. Plott, J. Wachtel


NOTICE

Availability of Reference Materials Cited in NRC Publications

Most documents cited in NRC publications will be available from one of the following sources:

1. The NRC Public Document Room, 1717 H Street, N.W., Washington, DC 20555

2. The Superintendent of Documents, U.S. Government Printing Office, Post Office Box 37082, Washington, DC 20013-7082

3. The National Technical Information Service, Springfield, VA 22161

Although the listing that follows represents the majority of documents cited in NRC publications, it is not intended to be exhaustive.

Referenced documents available for inspection and copying for a fee from the NRC Public Document Room include NRC correspondence and internal NRC memoranda; NRC Office of Inspection and Enforcement bulletins, circulars, information notices, inspection and investigation notices; Licensee Event Reports; vendor reports and correspondence; Commission papers; and applicant and licensee documents and correspondence.

The following documents in the NUREG series are available for purchase from the GPO Sales Program: formal NRC staff and contractor reports, NRC-sponsored conference proceedings, and NRC booklets and brochures. Also available are Regulatory Guides, NRC regulations in the Code of Federal Regulations, and Nuclear Regulatory Commission issuances.

Documents available from the National Technical Information Service include NUREG series reports and technical reports prepared by other federal agencies and reports prepared by the Atomic Energy Commission, forerunner agency to the Nuclear Regulatory Commission.

Documents available from public and special technical libraries include all open literature items, such as books, journal and periodical articles, and transactions. Federal Register notices, federal and state legislation, and congressional reports can usually be obtained from these libraries.

Documents such as theses, dissertations, foreign reports and translations, and non-NRC conference proceedings are available for purchase from the organization sponsoring the publication cited.

Single copies of NRC draft reports are available free, to the extent of supply, upon written request to the Division of Technical Information and Document Control, U.S. Nuclear Regulatory Commission, Washington, DC 20555.

Copies of industry codes and standards used in a substantive manner in the NRC regulatory process are maintained at the NRC Library, 7920 Norfolk Avenue, Bethesda, Maryland, and are available there for reference use by the public. Codes and standards are usually copyrighted and may be purchased from the originating organization or, if they are American National Standards, from the American National Standards Institute, 1430 Broadway, New York, NY 10018.


NUREG-1258

Evaluation Procedure for Simulation Facilities Certified under 10 CFR 55
Draft Report

Manuscript Completed: March 1987
Date Published: March 1987

K. R. Laughery, C. Plott, J. Wachtel
Division of Human Factors Technology
Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission
Washington, DC 20555

ABSTRACT

This document describes the procedure to be followed by the NRC for the inspection of simulation facilities certified by facility licensees under 10 CFR 55. Inspections are divided into four major areas based on the types of evaluations conducted: 1) performance testing; 2) physical fidelity/human factors; 3) control capabilities; and 4) design, updating, modification and testing.

The purpose of performance testing is to verify that the dynamic behavior of the simulation facility adequately represents that of the reference plant.

Physical fidelity/human factors evaluation is performed to verify the comparability of the simulation facility and the reference plant in the areas of panel simulation, instrument and control configuration, and ambient operating environment.

The evaluation of control capabilities is undertaken to verify the adequacy of those features of the simulation facility which allow its operator to direct and monitor its operation during an operator licensing examination.

Review of the simulation facility's design data, data updating, modification and testing is done to ensure that the configuration of the simulation facility is kept current with the reference plant.

NRC staff consisting of an interdisciplinary group of license examiners, operations specialists, and human factors experts, under the direction of a leader, will perform these inspections.

A simulation facility inspection may include off-site and on-site phases. The off-site phase will consist of an examination of the documentation for the simulation facility and an identification of those operations which may be considered for use in performance testing in the on-site phase of the inspection. In the on-site review, the staff will work closely with the facility licensee to conduct a sound and fair inspection and to evaluate the results of tests that are conducted.

Inspection findings will be based upon the staff's judgment of the simulation facility's compliance with 10 CFR 55.45. Such findings may range from: 1) no adverse impact on the conduct of operating tests; to 2) degrees of adverse impact which lead the staff to impose requirements for corrections to the simulation facility; to 3) serious adverse impacts such that the staff informs the facility licensee that the simulation facility may not be used in the conduct of operating tests until discrepancies are corrected and the simulation facility is recertified to the NRC.

TABLE OF CONTENTS

ABSTRACT
FOREWORD
1. INTRODUCTION
   1.1 Purpose
   1.2 Background
   1.3 Scope of Inspection
       1.3.1 Performance Testing
       1.3.2 Physical Fidelity/Human Factors
       1.3.3 Control Capabilities
       1.3.4 Design, Updating, Modification, and Testing
   1.4 Overview of the Procedure
2. THE ROLE OF THE NRC STAFF IN THE CONDUCT OF SIMULATION FACILITY INSPECTIONS
3. THE OFF-SITE PHASE
   3.1 Review of Available Data
       3.1.1 Performance Testing
           3.1.1.1 The Standard
           3.1.1.2 NRC-Form 474
           3.1.1.3 Operator Licensing Examination Guidance
           3.1.1.4 Reference Plant and Similar Plant Operating History
           3.1.1.5 Operating Procedures
           3.1.1.6 Cognizant Individuals
           3.1.1.7 Steps for Identifying Operations for Performance Testing
           3.1.1.8 Additional Considerations
           3.1.1.9 Data to be Requested
           3.1.1.10 Products
       3.1.2 Physical Fidelity/Human Factors
           3.1.2.1 Panel Simulation
           3.1.2.2 Instrument and Control Configuration
           3.1.2.3 Ambient Environment
           3.1.2.4 Products
       3.1.3 Control Capabilities
           3.1.3.1 Products
       3.1.4 Design, Updating, Modification, and Testing
           3.1.4.1 Products
   3.2 Request Information from the Facility Licensee
   3.3 Review of the Data Obtained from the Facility Licensee
   3.4 Determine if the On-Site Review will be Conducted
4. THE ON-SITE REVIEW
   4.1 Preparation for the On-Site Review
       4.1.1 Performance Testing
           4.1.1.1 Steps for Developing the Performance Test
       4.1.2 Physical Fidelity/Human Factors
           4.1.2.1 Panel Simulation
           4.1.2.2 Instrument and Control Configuration
           4.1.2.3 Ambient Environment
       4.1.3 Control Capabilities
       4.1.4 Design, Updating, Modification and Testing
       4.1.5 Products
   4.2 General Course of Events for the On-Site Review
       4.2.1 Activities for Each Day
       4.2.2 Analysis of Results and Staff Response
   4.3 Data Collection
       4.3.1 Performance Testing
       4.3.2 Physical Fidelity/Human Factors
           4.3.2.1 Panel Simulation
           4.3.2.2 Instrument and Control Configuration
           4.3.2.3 Ambient Environment
       4.3.3 Control Capabilities
       4.3.4 Design, Updating, Modification, and Testing
5. EVALUATION CRITERIA
   5.1 Performance Testing
       5.1.1 Evaluation of Parameters Measured
       5.1.2 Evaluation of the General Performance of the Simulation Facility
   5.2 Physical Fidelity/Human Factors
       5.2.1 Panel Simulation Evaluation
       5.2.2 Instrument and Control Configuration Evaluation
       5.2.3 Ambient Environment Evaluation
   5.3 Control Capabilities
   5.4 Design, Updating, Modification and Testing
   5.5 Known Discrepancies
   5.6 Assessing the Results of the Evaluations
REFERENCES
APPENDIX A - SIMULATION FACILITY PERFORMANCE TEST FORM
APPENDIX B - I&C DATA COLLECTION FORMS
APPENDIX C - TEST OF THE METHODOLOGY FOR THE SIMULATION FACILITY EVALUATION PLAN
GLOSSARY

LIST OF FIGURES

Figure 1  Simulation Facility Inspection Typical Parameter Analysis

LIST OF TABLES

Table 1  Type and Number of Operations to be Selected for Performance Testing


FOREWORD

On February 12, 1987, the Commission approved major revisions to 10 CFR 55, "Operator Licenses," and Regulatory Guide 1.149, "Nuclear Power Plant Simulation Facilities for Use in Operator Licensing Examinations." The regulation specifies two types of simulation facilities for the conduct of operating tests: 1) a plant-referenced simulator that meets the requirements of ANS 3.5, 1985 as endorsed by Regulatory Guide 1.149; and 2) a simulation facility, other than a plant-referenced simulator, which is approved by the NRC. Facility licensees who intend to use a plant-referenced simulator have up to 46 months from the effective date of the Rule to certify this plant-referenced simulator to the NRC. Facility licensees who intend to seek approval for other than a plant-referenced simulator are required to submit, within one year after the effective date of the Rule, a plan for the development of this simulation facility.

The Statement of Considerations published with the revisions to 10 CFR 55 states that the guidance to be used by the Commission in its inspection of simulation facilities will be made publicly available six months prior to use. The staff's Simulation Facility Evaluation Procedure (SFEP) contained herein has been pilot-tested at an existing plant-referenced simulator. A discussion of this pilot-test is contained in Appendix C to the SFEP. Comments on this procedure received within sixty days of publication will be considered and incorporated as appropriate.

Comments should be addressed to:

Chief, Operator Licensing Branch
Division of Human Factors Technology
Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission
Washington, D.C. 20555

The staff will implement this procedure by issuance of an Inspection Module, and will begin the conduct of inspections of certified plant-referenced simulators in six months.

Sincerely yours,

William T. Russell, Director
Division of Human Factors Technology


1. INTRODUCTION

1.1 Purpose

This document describes the procedure to be used by the NRC for the inspection of simulation facilities which have been certified by facility licensees under 10 CFR 55.45(b).
NRC's objective is to communicate to the U.S. commercial nuclear power industry the methods which will be employed and the data which may be examined by the NRC to ensure a simulation facility's adequacy for use in the conduct of operating tests.

1.2 Background

Paragraph 55.45, "Operating Tests," of 10 CFR Part 55, "Operators' Licenses," (the Rule) requires that an applicant for an operator or senior operator license demonstrate both an understanding of and the ability to perform certain essential job tasks. It specifies that this demonstration will be done through the administration of operating tests in a plant walkthrough and in a simulation facility. The simulation facility may be one which consists solely of a plant-referenced simulator certified to the Commission by the facility licensee, or it may be one which has been approved by the Commission after application for such approval has been made by the facility licensee.

NRC Regulatory Guide 1.149 supplements 10 CFR 55 by identifying one acceptable method for a facility licensee to use in certifying or applying for approval of a simulation facility. Regulatory Guide 1.149 endorses ANSI/ANS 3.5, 1985 (the Standard) with certain exceptions. By following the guidance presented in Regulatory Guide 1.149 and the Standard, a facility licensee will be able to certify a simulation facility which will satisfy the NRC requirements contained in 10 CFR 55.

Once a certification or an application for approval has been made, the Commission may inspect the simulation facility and/or the documents and records associated with it or with its certification or application for approval at any time. The Simulation Facility Evaluation Procedure contained herein applies to those simulation facilities which have been certified to the NRC by the facility licensee under 10 CFR 55.45(b)(5) as plant-referenced simulators. Procedures for the inspection of simulation facilities for which application for approval has been made to the NRC by the facility licensee under 10 CFR 55.45(b)(4) will be handled on a case-by-case basis and, to the extent applicable, will follow this procedure.

1.3 Scope of Inspection

As discussed above, the NRC will conduct simulation facility inspections against the requirements of 10 CFR Part 55, using the criteria provided in ANSI/ANS 3.5, 1985 as endorsed by Regulatory Guide 1.149. This procedure is divided into four major areas based on the type of inspection to be conducted. These are:

- Performance Testing
- Physical Fidelity/Human Factors
- Control Capabilities
- Design, Updating, Modification and Testing

The scope of each of these areas is described below. This procedure imposes no requirements, acceptance criteria or staff positions. Rather, it cites the appropriate source documents for all regulatory requirements.

1.3.1 Performance Testing

The purpose of performance testing is to verify that the dynamic behavior of the simulation facility adequately represents that of the reference plant. To do this, a variety of normal, abnormal and emergency operations may be tested. Since only a limited time will be available for the conduct of performance tests, only a small number of such tests will actually be run. To the extent possible, operations will be selected which: 1) are appropriate to the simulation facility and its reference plant, 2) are relevant to operator licensing examinations, and 3) offer actual reference plant data as a baseline for simulation facility comparison. The following steps will be taken to narrow the scope of the operations which may be selected for use in performance testing.

First, those operations not required for certification will be eliminated. The normal operations and malfunctions given in Sections 3.1.1 and 3.1.2 of the Standard delineate the scope of performance tests required for certification of the simulation facility. In addition, NRC-Form 474 requires the facility licensee to identify the subset of operations and malfunctions for which the simulation facility has been certified. This narrows the scope of the performance testing to those operations and malfunctions which are appropriate for the given simulation facility.

Second, the focus will be on the simulation facility's acceptability as a tool for conducting operator licensing examinations. This will limit performance testing to those operations which may be included in an operator licensing examination. Examiner Standards ES-301 and ES-302 give the procedures for developing and administering operator licensing examinations. These documents provide guidance for determining the number and variety of operations to be included in a licensing examination. This guidance will be used as a means of selecting operations for performance testing which are representative of those included in licensing examinations.

Third, to the extent possible operations and events will be selected which have actually occurred at the reference plant or in similar plants. This will help to ensure that the operations selected would be legitimate candidates for operator licensing examinations, and that actual reference plant data for these operations is available.

The operations and malfunctions to be selected based on their occurrence at the reference plant or a similar plant may be identified through the use of Licensee Event Reports (LERs) for those plants. These LERs give a brief description of any unusual or unexpected events which have occurred in the plants. LERs also describe the state of the plant prior to the event's occurrence. While many LERs will not be relevant, there will be some which are appropriate for use in a performance test of the simulation facility.

The operations and malfunctions identified from the review of the LERs, while useful, will probably not cover all areas which must be considered in conducting an operator licensing examination. For example, while it is likely that LERs will address operations which require the use of normal and abnormal operating procedures (including component and instrument failures), it is less likely that operations which require the use of emergency procedures will be addressed.

It is essential, for the conduct of a license examination, that the simulation facility permit a candidate to mitigate the consequences of an event using the reference plant's Emergency Operating Procedures (EOPs). It must also be possible for the candidate to employ the reference plant's normal and abnormal operating procedures as required. Indeed, use of reference plant procedures is part of the definition of "plant-referenced simulator" in 10 CFR 55.

Given this and the limitations of the LERs mentioned above, the reference plant EOPs and other operating procedures may be used as a basis for the selection of operations and malfunctions for performance testing. In addition, the guidance given by the Standard may be used for the selection of these higher-risk, low-probability operations and malfunctions.

In summary, the scope of the performance tests to be conducted will be narrowed based on:

- the requirements of the Standard

- the limitations indicated by NRC-Form 474

- the needs of an operator licensing examination

- the operations identified from the reference plant and similar plant operating histories, and

- the reference plant operating procedures; in particular, the EOPs

1.3.2 Physical Fidelity/Human Factors

Physical Fidelity/Human Factors addresses the comparability of the simulation facility and the reference plant in the areas of panel simulation, instrument and control configuration, and the ambient operating environment. The criteria given in the Standard for these areas indicate that, in general, there must be no differences between the reference plant's control room and that of the simulation facility. Deviations may exist if they do not detract from an examiner's ability to evaluate operator candidate performance during a licensing examination.

The Standard gives little guidance or criteria beyond the broad statements given above for the evaluation of these concerns. In order to ensure that these evaluations are done in a consistent and comprehensive manner, questions have been developed to be used for this review; they are presented in Section 5.2 of this procedure. These are objective ("yes" or "no") questions, thus enabling the initial evaluation to be straightforward. A "no" response to a question does not necessarily indicate a discrepancy. Such responses will be evaluated, singly and as a group, for their impact on the conduct of an operator licensing examination. This systematic process ensures that the scope of the Physical Fidelity/Human Factors evaluation is within the limits established by the Standard.
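The two-stage logic of this screen (objective yes/no answers first, then a judgmental impact review of the "no" responses) can be illustrated with a brief sketch. The question wording below is hypothetical; the actual review questions appear in Section 5.2.

```python
# Illustrative sketch of the two-stage Physical Fidelity/Human Factors
# screen. Stage 1 collects objective yes/no answers; stage 2 flags the
# "no" responses for an impact evaluation, singly and as a group.
# The questions here are invented examples, not the Section 5.2 items.

def flag_for_impact_review(responses):
    """responses: list of (question, answer) pairs, answer 'yes' or 'no'.
    Returns the questions answered 'no', which require impact review."""
    return [question for question, answer in responses if answer == "no"]

responses = [
    ("Are panel layouts identical to the reference plant?", "yes"),
    ("Do meter ranges and units match the reference plant?", "no"),
    ("Is the auditory alarm coding the same as the plant's?", "yes"),
]

for question in flag_for_impact_review(responses):
    # A "no" is not automatically a discrepancy; the staff judges whether
    # it detracts from an examiner's ability to evaluate candidates.
    print("Evaluate impact of:", question)
```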

1.3.3 Control Capabilities

Control Capabilities are those features which allow the simulation facility operator to direct and monitor the operation of the simulation facility. This includes the ability to:

- produce a variety of initial conditions and malfunctions

- freeze the simulation

- represent the actions of auxiliary or remote operators

- detect or determine when the simulation has gone beyond plant or simulation facility design limits, and

- monitor and record critical parameters

The Standard specifies requirements for these features, and these specifications have been directly incorporated into this procedure.

1.3.4 Design, Updating, Modification and Testing

Review of the simulation facility's design data, data updating, modifications and testing is done to ensure that the configuration of the simulation facility is kept current with that of the reference plant. This is done by reviewing selected reference plant and simulation facility design, modification and testing records to confirm that the simulation facility is being kept up to date. The nature of these reviews and the schedules for conducting them are given in the Standard and in the Rule, and are simply restated for this procedure.

1.4 Overview of the Procedure

The wide variations in concept, design and operation of simulation facilities make it impossible to delineate a precise approach applicable in all cases. The procedure described herein is intended to be applied as appropriate for the collection of the information necessary to judge a simulation facility's acceptability in accordance with the Rule.

There are three circumstances which may lead to an NRC inspection of a simulation facility.

1. The NRC may conduct periodic inspections on a random basis.

2. During the preparation for or the conduct of operator licensing examinations an NRC examiner may learn about or encounter unexpected behavior from a simulation facility. Once a simulation facility has been certified, examiners will report any apparent deficiencies and the context in which they were encountered. Such examiner reports may lead to a simulation facility inspection.
3. Incomplete or questionable data submitted by the facility licensee in support of its Form 474 certification could lead to an inspection.

A simulation facility inspection may consist of off-site and on-site phases. The off-site phase involves a review of performance test documentation for the simulation facility.

An additional function to be performed off-site is the identification of those operations which may be considered for use in on-site performance testing, as well as features of the other areas of evaluation to be reviewed during the on-site phase.

Upon completion of the off-site phase, the inspection may be concluded. If the on-site phase is to be performed, data appropriate for the construction of the performance tests may be requested from the facility licensee. Identification of the appropriate data is best accomplished on-site prior to the actual conduct of the inspection. All or part of the four areas of review (Performance Testing; Physical Fidelity/Human Factors; Control Capabilities; Design, Updating, Modification and Testing) may be included in the on-site phase at the discretion of the NRC Staff (the staff).

Once on-site, the staff will work closely with the facility licensee to make any necessary modifications to the performance tests. This will help to ensure that the performance tests to be conducted are both sound and fair.

The other areas to be inspected may also be modified as needed. By incorporating facility licensee input into the performance tests prior to the conduct of the actual data collection and evaluation, any potential weaknesses in the methodology and baseline data used for the review will be minimized. This will allow the results of the review to stand on their own merit.

Operation of the simulation facility or demonstrations of reference plant functions (e.g. annunciator test, auditory signals) will be performed by facility licensee personnel.

The staff will be present to direct the tests, to observe, and to record and analyze data.

When the data collection is complete, a preliminary evaluation of the results will be made. The staff will then discuss their preliminary findings with the facility licensee who will be given the opportunity to provide additional information related to these findings.

The final results will be documented in an NRC inspection report.


2. THE ROLE OF THE NRC STAFF IN THE CONDUCT OF SIMULATION FACILITY INSPECTIONS

The staff will conduct all simulation facility inspections and will make findings with respect to a simulation facility's compliance with the regulations. Staff members who perform such inspections will represent an interdisciplinary group of technical areas, including license examination, operations, and human factors.

A lead staff member will be responsible for overall coordination of simulation facility inspections including:

serving as liaison between the NRC and facility licensees; informing the parties involved in an inspection of their individual responsibilities, and coordinating their efforts; scheduling all aspects of the inspection; identifying and resolving any problems encountered during the course of an inspection; reporting the progress and findings of the inspection to the facility licensee and the NRC; and providing information about the staff's findings for any needed follow-on activities.

The license examiner will support all phases of this procedure by providing expertise in the areas of the licensing examination process, the job of the license candidate, and the use of control room procedures.

The operations specialist will provide plant operations expertise to the procedure. Responsibilities will include:

review of the performance tests performed by the facility licensee; development and evaluation of the performance tests performed by the staff; data collection and evaluation for the simulation facility control capabilities, design, updating, modifications and testing; and participation in other areas of the procedure as required.

The Human Factors Specialist will support the procedure by performing data collection and evaluation of the simulation facility environment, degree of panel simulation, and instrument and control characteristics.

An observer from another facility and/or INPO may participate in the staff inspections of simulation facilities, provided that the facility licensee whose simulation facility is being evaluated does not object. The staff believes that an observer would facilitate communication about the nature and process of inspections.

The observer may assist the staff in the collection and analysis of simulation facility data, but will not perform evaluations or participate in staff decision-making regarding the findings of simulation facility inspections.


3. THE OFF-SITE PHASE

Once the NRC decides to conduct an inspection of a simulation facility, the staff members who will perform the inspection will be selected and the off-site review will begin.

3.1 Review of Available Data

The first step in the off-site review is to examine the information about the simulation facility which the staff has at its disposal. The type of information to be examined and its usefulness to the review is described here for each of the four major areas of evaluation.

3.1.1 Performance Testing

The nature of performance testing is such that it is more suitably evaluated during the on-site review. Some aspects of simulation facility performance can be evaluated during the off-site review, however. The staff will use the off-site review phase to begin selecting the operations and malfunctions to be included in the performance tests during the on-site review.

As discussed in Section 1.3, "Scope of the Review," Performance Testing will emphasize those operations which:

- are required for certification as set forth in the Rule and the Standard, and confirmed on NRC-Form 474
- are relevant to licensing examinations
- have actually occurred in the reference plant or similar plants (where feasible), and
- make use of the reference plant's operating procedures.

The guidance for performing the off-site evaluation and for selecting operations within these limitations is given below.

3.1.1.1 The Standard Using the listing of normal operations given in Section 3.1.1 of the Standard, the listing of plant malfunctions given in Section 3.1.2 of the Standard and the guidance given in Appendix B of the Standard, the staff will identify those operations and malfunctions which are appropriate to the reference plant under consideration.

I O

8

3.1.1.2 NRC-Form 474

The facility licensee's submittal on NRC-Form 474 will be examined to ensure that it meets the requirements for certification of a simulation facility consisting solely of a plant-referenced simulator in accordance with 10 CFR Part 55.45. The following will be confirmed:

1. That the list of normal operations, steady state operations, transient operations and malfunctions for which the simulation facility has been certified encompasses all of the functions for a plant-referenced simulator, for the applicable plant-type, as required by 10 CFR 55 and Sections 3.1.1, 3.1.2 and Appendix B of the Standard.
2. That the listed operations and malfunctions have been incorporated into the performance and operability test cycle as required by 10 CFR Part 55.45(b)(5) in accordance with the guidance in Regulatory Guide 1.149 (unless the facility licensee has proposed an alternate method to that given in Regulatory Guide 1.149).

The staff will identify any missing or unscheduled operations and malfunctions, and may request additional information about them from the facility licensee.

NRC-Form 474 should contain a listing of the performance tests conducted in support of certification of the simulation facility by the facility licensee. While these must address all of the requirements for certification, it is possible that they will not reflect all possible combinations of operations, malfunctions and conditions which may actually occur in the reference plant and of which the simulation facility may be capable. The staff will use the information given in NRC-Form 474 as an indication of the types of operations, malfunctions and testing conditions which would be reasonable to include in its inspection.

By cross-referencing the operations and malfunctions identified as appropriate by the staff based on the Rule and the Standard against those identified in NRC-Form 474, the scope of operations and malfunctions available for testing will be identified.
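As a rough illustration of this cross-referencing step, the sketch below intersects the two sets of operations and malfunctions; the item names are invented placeholders, not entries from an actual NRC-Form 474.

```python
# Hypothetical sketch of the cross-referencing step: operations and
# malfunctions the staff judges applicable under the Rule and the
# Standard are intersected with those certified on NRC-Form 474.

applicable = {"manual reactor trip", "loss of condenser vacuum",
              "turbine trip", "steam generator tube leak"}
certified_on_474 = {"manual reactor trip", "turbine trip",
                    "steam generator tube leak"}

# Scope available for performance testing: applicable AND certified.
testable = applicable & certified_on_474

# Applicable to the plant type but missing from the certification; the
# staff may request additional information about these items.
missing = applicable - certified_on_474

print("Available for testing:", sorted(testable))
print("Missing or unscheduled:", sorted(missing))
```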

Although the staff is not limited to evaluating only those tests conducted for certification by the facility licensee as given on NRC-Form 474, some of these tests might be reviewed to give an indication of the scope and quality of the facility licensee's testing program. For example, by reviewing the facility licensee's performance test documentation, the staff will be able to assess the basic test procedure, the parameters monitored, the data used as a baseline, and the criteria used for determining performance acceptability. The selection of a performance test (or tests) to be reviewed will be based on staff judgment, and the facility licensee may be requested to provide the details for a "representative test."

3.1.1.3 Operator Licensing Examination Guidance

The guidance contained in Examiner Standard ES-302 for developing operator licensing examinations will be used to determine the approximate number and relative proportions of operations and malfunctions to be included in the performance tests. Accordingly, the number and types of operations and malfunctions to be included in the performance tests are given in Table 1.

Table 1. Type and Number of Operations to be Selected for Performance Testing

Type of                   Maximum Number of       Maximum Number of
Performance Test          Performance Tests       Performance Tests
                          Selected in the         Conducted in the
                          Off-Site Review         On-Site Review

Normal operations                 3                       2
Abnormal operations               9                       6
(including instrument
and component failures)
Emergency operations              3                       2

Total                            15                      10

Table 1 presents the number of each type of operation and malfunction for which information will be requested during the off-site review, and for which tests will be conducted during the on-site review. In addition, consideration will be given to selecting a range of operations or malfunctions within each operation type, as well as a variety of operating conditions for the performance testing. For example, operations or malfunctions which affect reactor power control, the condensate system, the turbine, and electrical distribution may be selected to obtain breadth. Tests of the same system at two or three different power levels may be selected to satisfy testing under a variety of plant conditions.
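The selection budget implied by Table 1 can be sketched as follows; the counts come from Table 1, while the candidate operation names are invented for illustration.

```python
# Sketch of the Table 1 selection budget: for each operation type, up to
# the "selected" maximum is identified during the off-site review, and up
# to the "conducted" maximum is actually run on-site. Counts are taken
# from Table 1; the candidate operation names are invented placeholders.

TABLE_1 = {
    # type: (max selected off-site, max conducted on-site)
    "normal": (3, 2),
    "abnormal": (9, 6),   # includes instrument and component failures
    "emergency": (3, 2),
}

def budget(candidates):
    """candidates: dict mapping operation type to a priority-ordered list
    of candidate operations; returns the off-site and on-site subsets."""
    plan = {}
    for op_type, ops in candidates.items():
        max_selected, max_conducted = TABLE_1[op_type]
        selected = ops[:max_selected]         # requested off-site (max 15)
        conducted = selected[:max_conducted]  # run on-site (max 10)
        plan[op_type] = {"selected": selected, "conducted": conducted}
    return plan

plan = budget({
    "normal": ["plant startup", "power reduction", "turbine startup"],
    "abnormal": ["loss of instrument air", "condensate pump trip"],
    "emergency": ["steam generator tube rupture", "loss of coolant"],
})
print(plan["normal"]["conducted"])  # -> ['plant startup', 'power reduction']
```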


3.1.1.4 Reference Plant and Similar Plant Operating History

Because the staff will be able to observe performance tests for only a very limited number of operations and malfunctions, it is most practical to select those malfunctions and operations which may be directly evaluated against actual plant data when possible, and which fall within the scope of operator licensing examinations. Reviewing the operating histories for the reference plant and similar plants will allow the staff to identify operations and malfunctions which have actually occurred and for which plant data should be available.

A primary source of this data will be the Licensee Event Reports (LERs) for the reference plant and similar plants. LERs describe abnormal and emergency operating events which occur at the plant. They provide a summary of the nature of the event, the components and instruments involved, and the general plant conditions at the time the event occurred. There is often more extensive data collected after an LER-reportable event (e.g., data collected for post-trip reviews in accordance with Generic Letter 83-28, "Required Actions Based on Generic Implications of Salem ATWS Events," July 8, 1983). In addition, summaries of LERs are available to the staff.

Although many LERs will be a good source of operating event data, most of them will not be relevant to simulation facility evaluations. Therefore, the guidance given elsewhere in this section will be used when reviewing them.

As many of the operations and malfunctions as possible will be selected from the reference plant LERs. Similar-plant LERs will be used when there is insufficient operating history for the reference plant, or when a similar-plant LER is judged to be especially relevant. When LERs from similar plants are being considered, the primary bases for determining an LER's applicability will be the significance of the test and the similarity between the systems involved at the similar plant and those in the reference plant.

3.1.1.5 Operating Procedures An examiner's ability to evaluate a license candidate's use

' of the reference plant's operating procedures is essential for the conduct of a licensing examination. This is particularly true for emergency operations since the Emergency Operating Procedures (EOPs) may be the operator's only guidance. Further, since it is unlikely that the i candidate will have had actual experience with most plant emergency conditions, the candidate's ability to use plant procedures will be an important basis for evaluation during l

l the examination. 'Thus, it is important that the simulation i 11 i

facility allow these procedures to be used as they would in the reference plant.

As discussed above, the review of the reference plant's operating history probably will not provide all of the operations required for the performance tests. The operating procedures provide the logical complement for completing the set of operations to be evaluated.

It should be noted that other plant procedures besides the EOPs will be considered as well. For example, surveillance procedures make good performance tests. They are usually straightforward, with actual reference plant data available for comparison. Startup test procedures will also be considered. These procedures have reference plant data and acceptance criteria associated with them, and they are cited in the Standard for use in evaluating transient data (Section 4.2.1). Other procedures may be used as appropriate.

3.1.1.6 Cognizant Individuals Individuals available to the staff who are knowledgeable about the operating characteristics of the reference plant and/or the simulation facility can identify operations and malfunctions to be considered for use in performance tests.

These people may include license examiners who have given examinations at the simulation facility. Their guidance may be useful in: identifying problem areas at the simulation facility; ensuring that the operations selected are relevant to operating licensing examinations; and identifying operations for which reference plant data is available.

3.1.1.7 Steps for Identifying Operations for Performance Testing The result of the activities described above will be a candidate set of operations and malfunctions for use in performance testing. The basic steps for making this selection are summarized below. This is not a strict sequence of events, and many of the steps may be conducted in parallel.

1. Identify the operations and malfunctions given in the Rule and the Standard which are applicable to the reference plant.
2. Review the simulation facility's NRC-Form 474 for completeness.
3. Consider the potential simulation ality limitations indicated by the operations and malfunctions given on NRC-Form 474.

12

_ . - _m _ __._ _ _ __ _ _ _ . . _ _ _ . __ _ _ _ _ __ _ _ _

4. Review the LERs for the reference plant and for
similar plants for appropriate operations and i malfunctions.

! 5. Identify operations and events for the performance

! tests which are based on the operating procedures.

6. Contact cognizant individuals regarding the

~

selection of operations and malfunctions and ,

potential sources of reference plant data. ,

i .

7. Apply the guidance'related to operator licensing examinations for selecting the number and variety of operations to include.
8. Select the subset of operations and malfunctions for which information will be requested from the facility licensee.
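The sketch below condenses the eight steps into a hypothetical filtering pipeline; each membership test stands in for a staff judgment, and none of the inputs are drawn from an actual inspection.

```python
# Condensed sketch of the selection sequence. Steps 1-3 act as filters
# (Rule/Standard applicability, NRC-Form 474 coverage); steps 4-6 raise
# the priority of operations with reference plant data or procedure
# coverage; steps 7-8 cap and select the subset to request.

def select_candidates(all_operations, applicable, on_form_474,
                      in_operating_history, uses_procedures, max_requested):
    ranked = []
    for op in all_operations:
        if op not in applicable:        # Step 1: Rule/Standard scope
            continue
        if op not in on_form_474:       # Steps 2-3: certification scope
            continue
        # Steps 4-6: prefer operations with LER history or procedure use.
        priority = (op in in_operating_history) + (op in uses_procedures)
        ranked.append((priority, op))
    ranked.sort(reverse=True)           # Step 7: examination guidance caps
    return [op for _, op in ranked][:max_requested]  # Step 8: final subset

subset = select_candidates(
    all_operations=["turbine trip", "rod drop", "loss of feedwater"],
    applicable={"turbine trip", "loss of feedwater"},
    on_form_474={"turbine trip", "loss of feedwater"},
    in_operating_history={"turbine trip"},
    uses_procedures={"turbine trip", "loss of feedwater"},
    max_requested=15,
)
print(subset)  # -> ['turbine trip', 'loss of feedwater']
```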

3.1.1.8 Additional Considerations

In the course of determining which operations and malfunctions are candidates for performance testing, the following additional issues will be considered.

1. Due to the limited on-site time available for conduct of the performance tests, exceptionally time-consuming operations will be avoided. Parts of such operations may be useful, however.

2. Those operations and events which involve actions and/or decisions on the part of the operator(s) are more likely to be candidates for licensing examinations, and therefore will receive greater emphasis.

3. Operations or malfunctions associated with any reported inappropriate or unexpected behavior from the simulation facility will be considered for inclusion in the performance test.

3.1.1.9 Data to be Requested

Once the operations and malfunctions to be considered for performance testing have been selected, the staff will identify the data associated with them, and request this data from the facility licensee where such data is not already available to the NRC. This data will be used by the staff to develop performance tests for the on-site review.

The following data sources are candidates. An on-site visit by one or more members of the staff may be required for the collection of this data.


1. Control room normal, abnormal and emergency operating procedures.
2. Listings of differences, if any, between the procedures used in the reference plant and those used in the simulation facility.
3. Descriptions of selected performance tests conducted by the facility licensee in support of certification of the simulation facility.
4. Summaries of reference plant data obtained as a result of events associated with LERs.
5. Procedures and data from selected reference plant startup tests.
6. Relevant simulator exercise guides.
7. Relevant simulator acceptance test procedures.

3.1.1.10 Products The products of this portion of the off-site review will be:

1. A listing of any relevant operations and malfunctions which were not addressed by NRC-Form 474.
2. A listing of candidate operations and malfunctions being considered for performance testing.
3. A listing of the data to be requested from the facility licensee.

3.1.2 Physical Fidelity/Human Factors

As discussed in Section 1.3, "Scope of the Review," Physical Fidelity/Human Factors addresses the comparability of the simulation facility and reference plant in the areas of panel simulation, instrument and control configuration, and ambient environment.

Each of these areas was addressed in detail for the reference plant during the Control Room Design Review (CRDR). This review addressed a variety of control room features including: control room and panel layout, displays, controls, all aspects of the control room environment, and differences between the physical configuration of the simulation facility and reference plant. Human Engineering Discrepancies (HEDs) were prepared as required and submitted to the NRC in a CRDR Final Report. These HEDs may be useful for identifying problem areas or changes that were made to the reference plant's control room. In conducting the CRDR, it was necessary for the utility to collect data on lighting, the auditory environment, and signal coding. This data may also have been included in the CRDR Final Report and can be of use to the staff.

It should be noted that the CRDR will become less useful over time since the simulation facility will undergo modifications to maintain its fidelity with the reference plant.

The staff may meet with cognizant individuals who can direct it to areas of interest for the Physical Fidelity/Human Factors review. This may include license examiners who have conducted operator licensing examinations at the simulation facility, and those staff members who evaluated the CRDR data. Other data sources will be considered as necessary.

3.1.2.1 Panel Simulation

The area of panel simulation concerns the layout of panels within the control room, the layout of instruments and controls on the panels, and the use and application of information and localization aids. Information and localization schemes used in the reference plant control room may include background shading, demarcation, mimics, hierarchical labeling, color coding, and shape coding.

From the review of the available data the staff will identify portions of the control room and its panels for which they may request drawings or photographs from the facility licensee.

3.1.2.2 Instrument and Control Configuration

For the purpose of a simulation facility inspection, instrument and control (I&C) configuration is concerned with the physical appearance and operation of the displays and controls on the boards. Examples include the range and units displayed on meters, and the switch positions on controls. Information that the staff may request might include photographs of specific I&Cs, the application of zone banding schemes for displays, or various CRT displays.

3.1.2.3 Ambient Environment

The ambient environment includes the lighting, auditory environment, alarms and auditory signals, and communications systems. The staff will identify the data for this area to be requested from the facility licensee for the review.

3.1.2.4 Products

The products of the review of Physical Fidelity/Human Factors data for the off-site review may include the following:

1. Photographs or drawings of the known reference plant/simulation facility physical differences.
2. Photographs or drawings of the control room and/or selected panels for both the reference plant and the simulation facility.
3. Descriptions and/or photographs of selected coding schemes used in the control room of the reference plant and in the simulation facility.
4. Photographs of selected I&Cs for both the reference plant and the simulation facility.
5. Data for relevant characteristics of the ambient environment for the reference plant and the simulation facility control rooms.

3.1.3 Control Capabilities

Control Capabilities are those features of the simulation facility which allow the simulator operator to control and monitor the simulation facility while it is being used for operator licensing examinations. The staff may not have data about these capabilities at its disposal.

Therefore, requests will be made based on the requirements of the Standard. This information may be requested not only for review but also as information which might be helpful in developing the performance test.

Documentation for the following simulation facility control and monitoring capabilities will be considered for request; a short sketch of this checklist follows the list.

1. The number and variety of initial conditions available on the simulation facility.
2. The capability for inserting and terminating malfunctions.
3. The capability of simulating simultaneous and/or sequential malfunctions.
4. The capability of incorporating new malfunctions into the system.

5. The capabilities of freezing, fast time, slow time, backtrack and snapshots.*

6. The capability for allowing the simulation facility operator to perform the functions of auxiliary or remote operators.

7. Administrative controls or other means for alerting the simulation facility operator when the simulation has gone beyond plant or simulator design limits.

8. The capabilities for monitoring and recording critical parameters.

* These capabilities are concerned more with performance test development than with evaluation of the simulation facility.
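A hypothetical sketch of tracking which of these eight documentation items have been received follows; the capability names paraphrase the list above, and the recorded answers are invented.

```python
# Illustrative checklist for the Standard-derived control capabilities.
# Items without documentation on hand form the request list (see
# Section 3.1.3.1); all recorded states here are invented.

CAPABILITIES = [
    "initial conditions (number and variety)",
    "insert and terminate malfunctions",
    "simultaneous and/or sequential malfunctions",
    "incorporate new malfunctions",
    "freeze, fast time, slow time, backtrack, snapshot",
    "simulate auxiliary or remote operator actions",
    "alert the operator at plant/simulator design limits",
    "monitor and record critical parameters",
]

documentation_received = {
    "freeze, fast time, slow time, backtrack, snapshot": True,
    "monitor and record critical parameters": True,
}

to_request = [c for c in CAPABILITIES
              if not documentation_received.get(c, False)]
for item in to_request:
    print("Request documentation for:", item)
```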

3.1.3.1 Products

A listing of the Control Capabilities for which data will be requested.

3.1.4 Design, Updating, Modifications and Testing

The staff may have little information at its disposal (with the exception of the testing schedule provided on NRC-Form 474) regarding Design, Updating, Modification and Testing of the simulation facility. As a result, requests for data in these areas will be made based on the requirements of the Standard. The testing schedule will be reviewed and additional information requested if needed.

3.1.4.1 Products

Documentation for the following areas will be considered for request.

1. Records of reference plant modifications for specific time periods. A time period may be selected at random, or based on information available to the staff such as NRC-imposed plant or simulation facility modifications.
2. The status of the modifications identified above with respect to their incorporation into the simulation facility (e.g., pending assessment, modification completed, not incorporated).

Requests for justifications for not incorporating reference plant modifications into the simulation facility will also be considered.
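A minimal sketch of this audit, assuming invented modification records and the status labels given in item 2 above:

```python
# Hypothetical audit of reference plant modifications against their
# simulation facility status. Record IDs and statuses are invented;
# the status labels follow the examples in item 2 above.

PLANT_MODS = {
    "MOD-86-014": "modification completed",
    "MOD-86-031": "pending assessment",
    "MOD-87-002": "not incorporated",
}

for mod_id, status in sorted(PLANT_MODS.items()):
    if status != "modification completed":
        # A justification for not incorporating the modification may be
        # requested from the facility licensee.
        print(f"{mod_id}: {status} -- justification may be requested")
```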

3.2 Recuest the Information from the Facility Licensee After the staff review of the available information about the reference plant and the simulation facility, a listing of the data to be requested from the facility licensee in 17 l

each of the four major areas will be developed. The staff will prepare a letter to the facility licensee explaining that an inspection of its simulation facility is to be conducted. This letter will explain:

- the nature of the inspection

- the personnel who will conduct it

- the tentative schedule for the review

- the information required from the facility licensee

- the cooperation requested from the facility licensee As an alternative, a member or members of the staff may visit the facility licensee to obtain and review this information. Such a visit would permit the staff to directly and efficiently determine the availability and utility of the data needed for the review. It would also minimize the amount of data transmitted between the facility licensee and the staff.

3.3 Review of the Data Obtained from the Facility Licensee once the data requested from the facility licensee has been obtained, it will be reviewed to determine if everything requested was received. A listing of any missing or incomplete data will be made so that, if necessary, it may be requested.

The data will be evaluated using the applicable evaluation criteria given in Section 5.0 of this procedure. A list will be made of any items not in compliance with the evaluation criteria. The staff will review this listing to determine the impact of such discrepancies on the simulation facility's acceptability for the conduct of operator licensing examinations. l 3.4 Determine if the On-Site Review will be Conducted l Once the assessment of the data obtained from the facility licensee has been made, the staff will determine if the on- l site review will be conducted. This decision will be based the results of the evaluation of the data received. This may also include information obtained during the review of available data (e.g. reports from cognizant individuals such as license examiners; findings in HEDs or LERs; recent events at the reference plant or simulation facility which indicate potential problems with the simulation facility).

O 18

If the staff decides to conduct the on-site review, the facility licensee will be notified of:

- the schedule for the review l

- personnel, facilities and data that the staff will need to have available to them, and

- any additional data that must be provided to the staff in advance of the on-site review, and a date by which such data is needed.

i 1

't

l 1

I 19

4. THE ON-SITE REVIEW I During the on-site review, the staff will evaluate the simulation facility in detail. This may include conducting performance tests, visiting the reference plant control room to verify the physical fidelity of the simulation facility, or further investigation into any other characteristics of the simulation facility identified during the off-site review.

This section has been broken into three major parts. The first part addresses the preparation for the review prior to the arrival of the staff on-site. The second addresses the course of events for the on-site review. The third part adresses the conduct of the review itself.

4.1 Preparation for the On-Site Review The purpose of this preparation phase is to identify, in detail, the data to be collected on-site and the methods for its collection. Because the staff will spend less than one week on-site at the simulation facility, it will be necessary to devote as much of that time as possible to actual data collection. To do this effectively, all of the data to be collected should be clearly identified, the data collection procedures should be in place and understood, and all data collection forms should be prepared prior to arrival at the site. The subsections for each of the major areas of data collection describe this preparation process.

During this preparation phase, the staff will communicate with the facility licensee as required. This will be done to clarify any questions the staff may have about the data received from the facility licensee or about the actual operation of the reference plant and simulation facility.

It may be desirable for one or more staff members to visit the site prior to the on-site review to: clarify data availability, format and access; plan the logistics of the on-site review; and establish contacts at the working level with the facility licensee.

To the extent possible, the several areas of the review will be tied to the performance tests being conducted. For example, the instruments and controls to be used in the I&C configuration evaluation, and the control capabilities to be evaluated will be selected from those associated with the performance tests being performed.

4.1.1 Performance Testing The purpose of this part of the preparation is to make the final determination as to which operations and malfunctions are to be included in the performance tests, and then to develop these tests. The selection of such operations and 20

! malfunctions.will be based on the results of the off-site review.

The process for developing the performance tests is a lI combination of those processes used for developing: (a) simulator acceptance test procedures (ATPs), (b) operator licensing examinations, and (c) simulator training and i

exercise guides developed by the facility licensee. The l

process described in this procedure adopts the level of

' detail used by the ATPs and combines with it the approach to j testing given in the examiner standards and in the exercise and training guides. The result is a. performance test that focuses on the behavior of the simulation facility in the l

context of operations which are relevant to operator licensing examinations.

Other than the required level of detail, the only significant difference between this process and an operator j licensing. examination is the focus on the behavior of the

! simulation facility as opposed to that of the operator license candidate. Thus, " perfect operator response" will be needed for the conduct of performance tests, except when the i requirements of a particular performance test (such as to i replicate an LER) dictate that specific operator responses 3

be recreated.

i l

The steps described in this procedure for developing the performance tests are e:camples of the content and format to consider when conducting a review. They will be varied to l

meet the needs of specific circumstances. If a facility licensee's performance test is to be repeated, if startup

]

test procedures are to be run, or if a surveillance test is to be conducted, it may be possible to utilize these directly without a formal development process. As a general l rule however, performance tests will be developed to a level I of detail and in such a format that they could be repeated

! with the same results expected.

i l 4.1.1.1 Steps for Developing the Performance Tests l

The preparation for the performance tests to be conducted at (

l the simulation facility will be accomplished by completing the following steps.

I

) A. The normal, abnormal and emergency operations to be l J

included in the performance test will be selected from i i those identified in the off-site review. The following l J

guidance will be used when making this selection:

1. A broad spectrum of events will be chosen which i J

exercises as many plant systems as possible. This l

may include limiting cases of the evolutions i selected.

i 21


2. Known simulation facility and reference plant operational limitations will be considered.

3. Events or operations which are suspected of not meeting the requirements of the Rule or the Standard will be considered.

4. If a particular potential problem is to be investigated, performance tests may be selected which approach it from various directions. This will help to indicate the magnitude of the discrepancy, if it exists.

5. Misoperations on the control boards which initiate malfunctions, such as those reported in an LER, will be considered.

6. Information from the facility licensee that it may not be possible to run certain performance tests on the simulation facility will be considered. This may occur, for example, if particular malfunctions or events are not simulated.

In addition to the events and operations, scenario characteristics such as initial conditions and the timing of events will be specified.

A brief outline describing the performance test will be developed at this point. It will be used as the basis for the more detailed development of the test in later steps.

B. Once the operations, events and scenarios for the performance test have been determined, the staff will identify the appropriate plant/simulation facility procedures to be used.

C. Working through the procedures, a step by step outline (similar to the "Simulator Scenario Form" in Examiner Standard ES-302) will be developed. This first draft will be primarily concerned with determining the general sequence of events.

D. After the procedural steps have been delineated, the staff will:

1. Identify the critical parameters for each performance test.
2. Identify all critical alarms and automatic actions which would be expected to occur in the reference plant.


3. Identify control actions which would result in an expected change in any of the critical parameters. This may include misoperations used to initiate events.

4. For each of the critical parameters identified, determine the start time, duration, and required frequency of recording. This will be based on the need to capture sufficient information at the proper resolution for evaluation of the simulation facility.

5. For the alarms, automatic actions and control functions defined above, determine the approximate times, setpoints and sequences at which they would be expected to occur.

6. Identify the simulation facility control capabilities required for conducting the performance test. The points in the performance test at which they are to occur will also be determined. The simulation facility control capabilities to be considered include:

a. Freezing the simulation.

b. Simulation of auxiliary operator functions performed outside the control room.

c. The means for alerting the instructor when the simulation approaches simulation facility or plant design limits.

d. Initialization conditions.

e. Insertion of malfunctions.

f. Adjustable rates for malfunctions.

g. Simulation of simultaneous or sequential malfunctions.

E. All baseline data which will be used for comparison in the performance test evaluation will be identified. A list will be made of any needed baseline data that was not obtained during the off-site review.

F. For each of the identified critical parameters, as well as the annunciators, automatic actions, and response to control functions, the means of data collection will be determined. These means may include:

Direct recording by the simulation facility's computers.


Strip chart recordings or other data logging devices built into the system.

Manual observation and recording by the staff.

The specific means used will be based on availability in the order of preference listed above. A list of the available means, including the points at which they are to be used in the performance test, will be made.

G. Any differences between the simulation facility procedures used for the performance test and the equivalent procedures used in the reference plant should be noted.

H. When the draft of the performance test is completed, it will be more formally developed. The Simulation Facility Performance Test Form given in Appendix A should be used for this purpose.

Any additional information that may be required for the development of the performance tests will be obtained from the facility licensee. This process will result in a completely developed performance test.

Once the performance tests are developed they will be prioritized for their importance in the evaluation of the simulation facility. This will help to ensure that those tests which are expected to provide the most information will be run. A tentative schedule for the conduct of the tests will be developed as well.

A typical performance test schedule might be to start with a normal operation, followed by a few higher priority abnormal operations, then the highest priority emergency operations, and conclude with the lower priority normal and abnormal operations.

4.1.2 Physical Fidelity/Human Factors

Most of the data needed for the on-site review of Physical Fidelity/Human Factors will already be available to the staff, or will be readily obtainable from the facility licensee. In some cases it may be necessary to review original documentation for HEDs, the CRDR or other data sources. This may require that the cognizant staff member visit the location where these data are stored prior to the on-site review.

4.1.2.1 Panel Simulation

The selection of the control room layout features and the panels to be evaluated should be based on the following:


1. Features and panels associated with the performance tests (if conducted).

2. Features and panels specifically cited in the LERs reviewed.

3. Features and panels identified in the HEDs or other data reviewed.

Reference plant and simulation facility documentation, listings, drawings and photographs will be selected and used for:

1. Verification of the location of panels and systems.
2. Verification of the layout of I&Cs on panels.
3. Verification of the consistent use of any of the following information and localization schemes:
a. Background shading.
b. Demarcation.

c. Mimics.

d. Hierarchical labeling.

e. Color coding.

f. Shape coding.

Listings will be made for each of the above features selected for evaluation.

4.1.2.2 Instrument and Control Configuration

If a performance test is to be conducted, the I&Cs to be evaluated will be selected from the critical parameters, annunciators and controls identified during the preparation for the performance test. If a performance test is not conducted as part of the review, the following sources will be used to identify I&Cs for evaluation (these sources may also be used in addition to performance test I&C identification):


1. I&Cs specifically cited in the LERs reviewed.
2. I&Cs identified in the HEDs or other data reviewed.


3. IECs associated with systems or panels for which the staff has requested data for the panel simulation portion of the review.

25 i

i i

4. I&Cs identified as Type A, Category 1 in accordance with Regulatory Guide 1.97 (Revision 3, May 1983).

In accordance with this regulatory guide, these I&Cs are considered to be the most critical for monitoring plant conditions during emergency operations.

Once all of the candidate components are identified, a maximum of 100 will be selected for evaluation. The available information for each I&C will be gathered photographically, or through use of the appropriate form given in Appendix B.

4.1.2.3 Ambient Environment The staff may require the facility licensee to identify and provide justifications for any differences between the reference plant control room and the simulation facility environment, in accordance with Section 3.2.3 of the Standard for the following:

1. Ambient auditory environment.
2. Alarms and auditory signals including coding of the alarms and signals.
3. Lighting.
4. Emergency lighting.
5. Availability and operability of communications systems.

If requested, these data should be ready for review by the staff on-site.

4.1.3 Control Capabilities

If performance tests are to be conducted, simulation facility control capabilities may be incorporated into them and tested during the course of the performance tests.

These tests of the control capabilities will be cited on the Simulation Facility Performance Test Form.

If performance tests are not conducted, the staff may make arrangements with the facility licensee for demonstration of selected capabilities.

4.1.4 Design, Updating, Modification and Testing

If investigations of the data and the means used for simulation facility design, updating, modification and testing are to be conducted during the on-site review, they will be identified. Specific questions and additional information to be addressed should be delineated as well.


4.1.5 Products

When the preparation for the on-site review is complete, the staff will have the following products (if all areas of the review are to be conducted):

1. The fully developed performance test on the Simulation Facility Performance Test Form.

2. A listing of the panel simulation features and environmental data to be evaluated.

3. A listing of I&Cs to be evaluated.

4. A listing of the control capabilities and the areas of design, updating, modification and testing to be evaluated.

4.2 General Course of Events for the On-Site Review

Data collection for the on-site review will normally be conducted within a period of one week at the simulation facility. Access to the reference plant control room will be required for some members of the staff for one day. The time spent on-site will be used to collect and analyze data, perform a preliminary evaluation of the results and discuss these preliminary findings with the facility licensee.

After leaving the site, the final results will be developed and findings documented in an NRC inspection report.

The following is an overview of the activities to be performed on each of the days of the on-site review as well as the activities to be performed by the staff after the completion of the on-site visit.

4.2.1 Activities for Each Day

Day 1

Upon arrival at the simulation facility the staff will meet with the facility licensee to outline the activities to be performed during the inspection. Once the meeting has been completed, the staff will begin the inspection with the assistance of the appropriate simulation facility personnel as required.

Before beginning data collection, the staff will review the performance tests to be run with the facility licensee. Based on this review, modifications will be made to the tests if needed, to ensure that they provide a sound and fair test of the simulation facility.

After the discussion of the tests to be conducted is finished, any needed preparation for the data collection will begin (as time permits).

Days 2 and 3

The actual data collection will begin on day two of the on-site inspection and will continue into day three. It will be primarily devoted to conducting the simulation facility performance testing, which may require the participation of all members of the staff at times. On a non-competing schedule, the staff will conduct the other aspects of the inspection.

Day 4

The activities for day four will primarily involve the evaluation of the data collected using the guidance given in this document. Any unfinished data collection will also continue as needed.

Day 5

Any of the evaluations not completed on day four will be finished on day five. Then, the staff will present the preliminary findings to the facility licensee, who will be given the opportunity to provide additional information related to them. Upon conclusion of this meeting, the staff will leave the site.

4.2.2 Analysis of Results and Staff Response

After returning from the on-site review, the staff will evaluate the data collected in greater detail if necessary, and prepare a report of its findings. These final results will be documented in an NRC inspection report.

4.3 Data Collection

The following subsections give the procedures for collecting the data for each of the data groupings. They should be applied as appropriate for the specific reviews which the staff has chosen to conduct.

4.3.1 Performance Testing

The following steps will be completed in performing this activity:

A. The simulation facility operator will be briefed on the performance test to be conducted. The following areas will be emphasized in order to ensure their inclusion and proper function in the performance test:


1. The operations and malfunctions to be simulated.

2. The instrument and component failures to be simulated.

3. The scenario conditions including the initial conditions and the sequence of events.

4. The simulation facility control functions to be tested.

5. The parameters on which data will be collected, and the methods for this data collection.

B. Potential problem areas will be identified and resolved. Any problem areas that would result in changes to the performance test, or in the inability to collect the desired data, will be noted, and changes will be made accordingly.

C. The performance test will be set up on the simulation facility by the facility licensee.

D. Individual operations and malfunctions will be pretested as required. The specific activities to be pretested will be left to the judgement of the staff and the simulation facility operator.


E. All automatic data collection mechanisms will be tested as necessary.

F. All manual data collection activities will be rehearsed.

G. The operating crew which will be performing the manual portion of the performance test will be briefed. This briefing will include the following:

1. The entire performance test will be reviewed in detail with emphasis on the activities that the crew is to perform.

2. It will be emphasized that simulation facility performance, not operator performance, is being evaluated. Even though "perfect operator response" may be needed for certain tests, it will be made clear that the operators will not be penalized in any way if an operator error is made.

3. Any intentional misoperations that are required in conducting the performance test (e.g., in order to replicate an LER) will be described.


4. Any questions about the operating crew's role in the conduct of the performance test will be answered.

It should be noted that this operating crew need not be made up of licensed operators, although this may be desirable. The makeup of the operating crew will be left to the discretion of the facility licensee.

H. The role that each individual member of the staff will play in conducting the performance test will be determined and explained. The specific role for each staff member may vary from one performance test to another. In general however, the following assignments may be expected:

1. The lead staff member will generally be the test director or will assign another member of the staff to that role. He or she will assure that the test is completed in a timely manner and that all test conditions are met, and all required data is collected. The test director will also approve any modifications to the test that are required before or during the course of the test.
2. The license examiner will be responsible for monitoring that the test proceeds as planned, and tracking adherence to plant procedures. The license examiner will note any deviations in the performance of the test as well as any suspect behavior of the simulation facility.
3. The operations specialist's duties are likely to vary. He or she will, however, assist the license examiner, monitor specific systems, operations or instruments of interest, and perform other functions as required.
4. The human factors specialist will monitor the performance test for any human interface problems which may be encountered, and will perform other functions as required.

At this point everything should be in place for conducting the performance test of the simulation facility.

The simulation facility performance test will be conducted using the following procedure.

A. The test director will oversee the performance tests to ensure they are conducted as planned. Facility licensee personnel will actually operate the simulation facility, while the staff will be present as observers.


B. The scenarios, operations and events for each phase of a test will be briefly reviewed again by all participants prior to commencement of each exercise.

C. The test will begin, and will follow the activities listed on the Simulation Facility Performance Test Form as closely as possible.

D. During the course of the test the operating crew and the simulation facility operator will call out the actions they are performing as they do them. This will help to keep all parties involved in the performance test aware of what is happening at all times during the test.

E. Members of the staff will follow the course of the test using the Simulation Facility Performance Test Form. They will annotate the form and make other notes as required to indicate both appropriate and inappropriate performance of the simulation facility as the test progresses.

F. A member of the staff will periodically check all active data recording mechanisms to ensure that they are functioning properly. Simulation facility personnel should be available for any needed modifications or repairs to the equipment.

G. If unusual simulation facility behavior is encountered during the performance test, it may be desirable to freeze the simulation so that the details of what has occurred can be understood and documented if necessary. This decision will be left to the discretion of the staff and the simulation facility personnel.

H. Upon completion of each test, all participants, including the operating crew and the simulation facility operator, should participate in a short debriefing. This will help to assure that all relevant information and observations about the test have been noted.

I. During each performance test, the staff will spot check the procedures being used to ensure that they are the same as those used in the reference plant control room. Only those differences previously identified by the facility licensee should be present. Any differences found, whether previously identified or not, should be evaluated for their impact on the conduct of a licensing examination.

4.3.2 Physical Fidelity/Human Factors

The human factors specialist will perform the physical fidelity/human factors evaluation with assistance from other members of the staff and facility licensee personnel as needed.

Since visual observation and comparison is an important factor in these reviews, it may be advantageous to use "instant" photographs for the data collection. Any information which would help to clarify the content of the photographs should be written on the back of them. These photographs may be used in lieu of the other means of data collection described in this document.

Due to the great amount of detail in the data collection for this portion of the review, there is a possibility of error.

Thus, the human factors specialist should confirm any discrepancies found.

4.3.2.1 Panel Simulation

The data collection for the panel simulation will be conducted using the following steps.

A. The selected drawings or photographs for the control room layouts for both the reference plant and simulation facility will be verified. They will then be compared and the differences will be noted.

B. The drawings or photographs for the panels selected for review for both the reference plant and simulation facility will be verified. They will then be compared for the location of both systems and components on the panels and the differences will be noted.

C. Drawings and/or photographs for the informational and localization schemes selected for review for both the reference plant and simulation facility will be verified. They will then be compared and the differences will be noted.

4.3.2.2 Instrument and Control Configuration

The I&C data will be collected from the reference plant and the simulation facility control rooms. The data will be collected using photographs or the appropriate forms given in Appendix B. The instructions for filling out these forms are contained in the Appendix.

4.3.2.3 Ambient Environment

The human factors specialist will evaluate the environmental data that was requested from the facility licensee. He or she will visit the reference plant and simulation facility control rooms to verify the accuracy of this data, if required. If necessary, the human factors specialist will request that certain functions (e.g., alarms, annunciator test) be demonstrated in the simulation facility and/or the reference plant. Cognizant facility licensee personnel should perform these demonstrations.

4.3.3 Control Capabilities

Any simulation facility control capability data that is to be collected independently of a performance test will be collected using the following steps.

A. The staff will discuss the data to be collected with the simulation facility operator.

B. Any data collection mechanisms required will be set up.

C. The functions will then be performed by the simulation facility operator. The staff will monitor the testing of the simulation facility control functions.

4.3.4 Design, Updating, Modification and Testing

Any simulation facility design, updating, modification and testing data to be reviewed will be collected by the staff with the assistance of facility licensee personnel as required.


5. EVALUATION CRITERIA

This section specifies the evaluation criteria to be used for the review. The criteria are organized into the four major areas of review used throughout this procedure. The last part of this section addresses the assessment of the results of these evaluations.

5.1 Performance Testing

The performance test data will be reviewed and the following evaluations will be made. The parentheses below the criteria are the references to the relevant sections of the Standard.

5.1.1 Evaluation of Parameters Measured

Each of the parameters tested will be evaluated using the following criteria. The decision tree analysis shown in Figure 1 will be used. All of the paths in the tree should be followed to their conclusion. A determination will be made of the impact of any noncompliances on the acceptability of the simulation facility for conduct of a licensing examination.

General

1. Are expected relationships between this parameter and other parameters, according to the baseline data, reflected over the course of the performance test?

(3.1.1, 3.1.2 and A3.1)

Alarms and Automatic Actions

1. Did all of the alarms and automatic actions occur that would have occurred in the reference plant?

(4.2.1(c))

2. Did any alarms or automatic actions occur that would not have occurred in the reference plant?

(4.2.1(c))

Transient Operations

1. If applicable reference plant start-up test procedure acceptance criteria exist, does the value represented by the parameter fall within these criteria?

(4.2.1(a))

2. Does the observable change in the parameter violate the physical laws of nature?

(4.2.1(b) and 4.2.2)

[Figure 1: Simulation facility inspection typical parameter analysis - a decision tree applying the transient, steady-state, alarm, and parameter-relationship criteria of this section to each parameter tested, ending in a pass, qualified pass, or failure of the performance test.]
3. Is the observable change in the parameter in the same direction as that expected from the baseline data?

(4.2.l(b) and 4.2.2)

Steady State operations

1. If it is a critical parameter, does it fall within i2% of its reference value?

(4.l(3))

2. If it is a noncritical parameter, does it fall within t10% of its reference value?

(4.1(3))

3. Has the accuracy of the computed values been determined for a minimum of three points over the power range? (This may have been tested or simply confirmed.)

(4.1)

4. For a 60 minute test, does the value of the parameter not vary more than i2% over the 60 minute period?

(4.1(2) and A3.2(1))

5.1.2 Evaluation of the General Performance of the Simulation Facility The general performance of the simulation facility will be evaluated using the following criteria. A determination will be made of the impact of any noncompliance on the acceptability of the simulation facility for the conduct of a licensing examination.

1. Where applicable to the malfunctions tested, does the simulation facility provide the operator the capability of taking action to recover the plant, mitigate the consequences, or both?

(3.1.2)

2. For the performance tests conducted, was the simulation capable of continuing until such a time that a stable, controllable and safe condition is I attained which can be continued to cold shutdown l conditions, or until the simulation facility 1

, operating limits are reached?

l (3.1.2) l

3. Did the simulation facility provide the appropriate response to operator errors, if any were tested?

(4.l(3), 4.1 (4 ) )

36 i

i I

j 4.

Did the simulation facility respond inappropriately l

to any correct operator actions?

(4.1(3), 4.1(4))

5. Were there any differences identified between the procedures used in the simulation facility and those in the reference plant?

' (3, A1.4 and A3.2)

6. When tested by the staff, was simulation facility instrument error no greater than that of the comparable meter, transducer or related instrument system of the reference plant?

(4.1(1) )

i 5.2 Physical Fidelitv/ Human Factors 5.2.1 Panel Simulation Evaluation f

l The following criteria will be used in evaluating the panel simulation.

Control Roon Layout

1. Does the simulation facility contain sufficient f

operational panels to provide controls, instrumentation, alarms and other man-machine interfaces to conduct the normal evolutions and O* respond to the malfunctions required by the Standard?

(3.2.1 and A1.2. (1) , A1.2 (2) )

2. Do differences from the reference plant, in ths relative locations of panels to each other result in a detraction in the ability to conduct a licensing examination?

(3.2.1 and A1. 2. (1) , A1. 2 (2) )

3. If panels not in the main operating area, such as back panels and remote shutdown panels, are not included, is there adequate simulation of the
i information obtained from them or the control functions performed on them to conduct a licensing

] examination?

(3.3.2)


Panel Layout

1. Are systems on the same panels as in the reference plant?

(3.2.1 and A1.2(3))


2. Are systems in the same relative locations to each other within and across panels as they are in the reference plant?

(3.2.1 and A1.2(3))

3. Is the general layout of components within a system or on a panel the same as in the reference plant?

(3.2.2 and A1.2(3))

5.2.2 Instrument and Control Configuration Evaluation All of the differences found between the I&Cs reviewed for the simulation facility and the reference plant will be evaluated in order to determine their impact on the performance of a licensing examination. The requirements of Sections 3.2.1, 3.2.2 and A1.2(2) of the Standard will be used for guidance in making these decisions.

5.2.3 Ambient Environment Evaluation The following criteria will be used in the evaluation of the ambient environment. All of the criteria given are based on the requirements and guidance given in Section 3.2.3 and A1. 2 (4 ) of the Standard. Since the Standard is not as specific as the criteria given here, judgements of discrepancies will be based on available human factors data or good human factors practice.

Normal and emergency control room lighting.

1. Both should be simulated.
2. The significance of any effects on the ability to conduct a licensing examination due to differences in the illumination levels or location of the lighting will be determined. The following factors will be considered in making these judgements:

Do differences in illumination levels affect the readability of any displays?

Do differences in lighting fixture locations result in variations in glare or illumination which could affect readability of displays?

Alarms, signals and incidental noise.

1. All audible alarms should be simulated to the extent that they may be used in conducting a licensing examination.
2. Other signals and incidental noise should be simulated to the extent that they may be used in conducting a licensing examination.

3. If auditory coding is used, it should be identical for the reference plant and the simulation facility.

Communications systems.

1. All communications systems that are expected to be used for communicating with auxiliary operators (or examiners acting as auxiliary operators) during an examination should be available and operational.

Operator Cuing and Information Aids Documentation for operator cuing and information aids, including panel drawings and photographs will be reviewed for each of the following when applicable:

1

! a. Background shading,

b. Mimics.

I c. Demarcation.

d. Coding schemes.

l e. Labeling schemes.

i They should be applied in the same instances and in the same manner in the simulation facility as they are in the reference plant. Judgements about any deviations in the use of such aids will be made using the guidance given in Sections 3.2.1 and A1.2(2) of the Standard.

5.3 Control Capabilities The following criteria will be used for the evaluation of the simulation facility control capabilities.

Control

1. Does the simulation facility possess a minimum capability for storage of 20 initialization conditions? For simulation facilities which have commenced operations within the last 18 months, or are referenced to plants which have commenced operations within the last 18 months, are at least ten of the conditions operational?

(3.4.1, A1.3(1) and 5.2)

2. Do the initialization conditions include a variety of plant operating conditions, fission product poison concentrations, and times in core life?

(3.4.1 and A1.3(1))


3. Does the simulation facility have the capability of freezing the simulation?

(3.4.3 and A1.3(2))

4. Is it possible to conveniently insert and terminate each of the malfunctions being evaluated?

(3.4.2 and A1.3(2))

5. Is the simulation facility capable of simulating simultaneous or sequential malfunctions, or both, if these malfunctions can be expected by design or operational experience?

(3.4.2 and A1.3(2))

6. Where operator actions are a function of the degree of severity of a malfunction, does the simulation facility have adjustable rates of such a range to represent the plant malfunction conditions?

(3.1.2 and A1.3(2))

7. Are there any cues to the operator, other than those that would occur in the reference plant, that a malfunction has been introduced into the simulation?

(3.4.2 and A1.3(2))

8. Are provisions (administrative or other) in place for incorporating additional malfunctions identified from operational experience?

(3.4.2 and A1.3(2))

Instructor Interface

1. For simulated actions performed outside the control room, does the capability exist for the simulation facility operator to perform the actions of an auxiliary operator?

(3.4.4, A1.3(3) and A1.3(4))

2. Are provisions made for alerting the simulation facility operator when any aspect of the simulation approaches the simulation facility or plant design limits?

(4.3)

Monitoring

1. Are the critical parameters identified for performance testing obtainable in hardcopy form as either plots or printouts?

(4.4)

2. Is the parametric and time resolution of the hardcopy data for the parameters sufficient to determine compliance with the performance test criteria?

(4.4)

5.4 Design, Updating, Modification and Testing

The criteria given in this section are based on the requirements of Sections 5 and A2(4) of the Standard.

Design Data

1. Does baseline data exist for all parameters tested?

(3.1.2, 5.1, A2 and A3.3)

2. If multiple sources of baseline data are available, are they used in the following order unless otherwise justified?
a. Reference plant operational data - data collected directly from the reference plant.
b. Analytical or design data - data generated through engineering analyses with a sound theoretical basis.

c. Similar plant data - data collected from a plant which is very similar in design and operation to the reference plant.

d. Other data - data which does not come from any of the above sources, including subject matter expert estimates.

(3.1.2, 5.1, A2 and A3.3)
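The preference ordering above amounts to a simple fall-through selection, sketched below in Python. The source labels are hypothetical shorthand, and an actual review would also record the justification for any departure from the order.

    # Sketch of the baseline-data source preference above
    # (hypothetical labels; not part of the procedure).

    SOURCE_PREFERENCE = [
        "reference_plant_operational",   # a. data from the reference plant
        "analytical_or_design",          # b. engineering analyses
        "similar_plant",                 # c. data from a very similar plant
        "other",                         # d. e.g., subject matter expert estimates
    ]

    def select_baseline_source(available):
        """Return the most preferred source present in `available`."""
        for source in SOURCE_PREFERENCE:
            if source in available:
                return source
        return None

    # Example: only similar-plant data and expert estimates exist.
    print(select_baseline_source({"similar_plant", "other"}))   # similar_plant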

Updating and Modification

1. If the reference plant has been in operation for 18 months, has plant data been included in the data base?

(5.2)

2. Is there an annual review of reference plant modifications?

(5.2)

3. Has the simulation facility update design data been revised as appropriate, based on an engineering, training value, and licensing examination assessment of the reference plant modifications?

(5.2 and 5.3)

4. Is there a means of incorporating student feedback on the simulation facility into the updating and modification process?

(5.2)

5. Have all modifications to the simulation facility design data base and the simulation facility identified in Item 3 above, been made within 12 months of identification?

(5.3)

Testing

1. Are data from simulation facility performance tests which were performed after completion of initial construction and after any configuration or performance modifications available for review?

(5.4.1 and A2(4))

2. Is data from the annual operability testing available for review?

(5.4.2 and A2(4))

5.5 Known Discrepancies

Any discrepancies identified during the course of the review which were previously known to the facility licensee and for which resolutions or justifications were provided will be reviewed. The staff will determine:

1. If any of the discrepancies could have a significant adverse effect on the conduct of a licensing examination.

2. If there are any facility licensee resolutions or responses with which the staff does not agree.

5.6 Assessing the Results of the Inspection Findings

Upon completion of an inspection, the staff will review any of the discrepant items with respect to their impact, if any, on the ability to use the simulation facility to conduct a licensing examination. This review will be conducted for discrepant items individually and in combination.

If the discrepancies found are judged to have little or no adverse effect on the conduct of a licensing examination, the staff will recommend only that the facility licensee correct them or document a basis for accepting them as is.

If the discrepancies are found to have a minor but definite impact on the ability to conduct a licensing examination, the staff will require that the facility licensee correct the discrepancies as part of its ongoing simulation facility update program as required by the Standard. Discrepancies which constitute a "minor but definite" impact include those whose impact can be easily accounted for by a license examiner and those which an examiner can work around.

If the discrepancies are found to adversely affect the ability to conduct a reliable examination on a given procedure, system or event, the staff will require that the facility licensee correct these discrepancies on an accelerated schedule (i.e., less than the time permitted by the Standard). Examinations will not be conducted using the procedure, system or event until the correction is made.

If the discrepancies are found to greatly hinder or limit the ability to conduct a reliable examination on the simulation facility, such that the requirements of 10 CFR 55.45(b) cannot be met, then operating examinations shall not be conducted until the facility licensee has corrected the discrepancies and recertified the simulation facility.
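Schematically, the graded responses of this section can be summarized as a mapping from the severity of the findings to the required staff action. In the Python sketch below the category labels are hypothetical shorthand; the severity determination itself remains a staff judgement.

    # Schematic summary of the graded staff responses in Section 5.6
    # (hypothetical category labels; not part of the procedure).

    STAFF_RESPONSE = {
        "little_or_no_effect":
            "Recommend correction, or a documented basis for acceptance.",
        "minor_but_definite":
            "Require correction under the ongoing update program.",
        "adverse_for_given_item":
            "Require correction on an accelerated schedule; no examination "
            "on the affected procedure, system or event until corrected.",
        "55.45(b)_not_met":
            "No operating examinations until corrected and recertified.",
    }

    def staff_response(severity):
        return STAFF_RESPONSE[severity]

    print(staff_response("minor_but_definite"))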


REFERENCES

American National Standard for Nuclear Power Plant Simulators for Use in Operator Training, ANSI/ANS-3.5-1985. American Nuclear Society, La Grange Park, IL.

Oak Ridge National Laboratory, "Licensee Event Report (LER) Compilation." NUREG/CR-2000, ORNL/NSIC-200. Oak Ridge, Tennessee 37831.

Title 10, Code of Federal Regulations, Part 55, "Operator Licenses." February 12, 1987. Government Printing Office, Washington, D.C.

U.S. Nuclear Regulatory Commission, "Operator Licensing Examiner Standards." NUREG-1021, Revision 3, September 1986. Available for purchase from National Technical Information Service, Springfield, Virginia 22161.

U.S. Nuclear Regulatory Commission, "Required Actions Based on Generic Implications of Salem ATWS Events." Generic Letter 83-28, July 8, 1983. Copies are available from the Office of Management and Budget, Reports Management Room 3208, New Executive Office Building, Washington, D.C. 20503.

U.S. Nuclear Regulatory Commission, Regulatory Guide 1.149, "Nuclear Power Plant Simulation Facilities for Use in Operator License Examinations." Copies are available from U.S. Government Printing Office, Washington, D.C. 20402. ATTN: Regulatory Guide Account.

U.S. Nuclear Regulatory Commission, Regulatory Guide 1.97, "Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant and Environs Conditions During and Following an Accident." Copies are available from U.S. Government Printing Office, Washington, D.C. 20402. ATTN: Regulatory Guide Account.


APPENDIX A

SIMULATION FACILITY PERFORMANCE TEST FORM

This appendix contains a sample "Simulation Facility Performance Test Form" which may be used for developing and conducting the performance test for the simulation facility.

SIMULATION FACILITY PERFORMANCE TEST AUDIT FORM

Simulation Facility: XYZ Simulation Facility    Date: 00/00/00
Reference Plant: Plant XYZ
Performance Test: NATURAL CIRCULATION - LER 85-019

I. Initial Condition: 100% power, middle of life

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders TO for Test 9.

Digital: Ensure data is recorded as follows:

Every 15 seconds until manual trip.
Every 5 seconds for one subsequent minute.
Every 60 seconds until RCP start.
Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
• RCS pump radial bearing temperature
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
• Spotcheck of components

• Denotes data not logged. Ensure data recording mechanism is in place.
1. Denotes data not expected to be critical for this test.


III. Procedure

1. Simulation facility instructor fails 14J-4 (480 VAC) open, then puts simulation facility in "freeze". Verify the following:

A. CCW valves (106A,B,C) to RCP closed.
B. Spotcheck of other equipment deenergization from attached load list.

2. Resume simulation. Panel operators commence power reduction at 4%/min. Freeze simulation facility at 87% power. Verify the following:

A. Elevated RCP bearing temperature (should be between 195F and 216F).
B. Increasing Tave, PZR press & level.

3. Resume simulation. Panel operators perform manual reactor trip immediately and perform expected actions (securing RCPs about two minutes later upon entry into ES-0.1). Observe:

- Natural circulation indications develop and stabilize
- RC flows decrease to 10% in 30 sec, further decrease more slowly
- Tc decreases slightly, approaching SG Tsat
- Th increases to 30-50F above Tc, stabilizes
- Incore thermocouples track Th (560-570F)
- SG pressure stable, with steaming indicated

4. Restart RCP A 30 min after Rx trip, after component cooling is reestablished. Observe:

- Th, Tc converge (all 3 loops)
- SG A pressure increase, no increase on SG B,C
- Possible PZR pressure decrease

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulation facility data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended followup action to this procedure.
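The piecewise digital recording schedule specified in Section II of this sample form can be made concrete with a short sketch (Python). The event times below are hypothetical placeholders for the actual trip, RCP restart, and completion times observed during the test; the sampling intervals are those given in the form.

    # Sketch of the digital recording schedule in Section II: 15 s until
    # the manual trip, 5 s for the next minute, 60 s until the RCP
    # restart, then 15 s until completion. Event times are hypothetical.

    MANUAL_TRIP = 600      # seconds into the test (assumed)
    RCP_RESTART = 2400     # assumed: RCP A restarted 30 min after trip
    TEST_END = 3600        # assumed end of the test

    def recording_times():
        times, t = [], 0
        while t < MANUAL_TRIP:
            times.append(t); t += 15
        while t < MANUAL_TRIP + 60:
            times.append(t); t += 5
        while t < RCP_RESTART:
            times.append(t); t += 60
        while t <= TEST_END:
            times.append(t); t += 15
        return times

    print(len(recording_times()), "samples over the test")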

O 50

APPENDIX B I&C DATA COLLECTION FORMS This appendix contains the directions and forms for collecting the IEC data for the simulation facility inspection.

The data may be collected in either the reference plant control room or the simulation facility control room first.

Regardless of the location of this first data collection, all of the relevant information about the instrument or control should be gathered.

It is expected that not all of the fields on the forms will always be applicable. In these cases they should be marked

as not applicable (N/A). It is also possible that the forms
are not completely comprehensive. In this case, additional fields should be added as needed and additional relevant information should be collected. Relevant notations and comments on the part of the data collector are also encouraged. These may help in clarification and understanding of the data during later analysis.

Once the data is collected at the first location (reference plant or simulation facility) the data sheets should be taken to the second location and the data verified. If the data for a field is identical, a check should be placed in the corresponding field for the second location. If the data is different, the field for the second location should be filled in with the differing data.

Separate data collection forms for annunciators, meters and digital displays, recorders, CRTs and controls are included here. Explanations for the data fields particular to each individual form are included with the forms.

i l

1 i

I l

(

51

)

I

The following data fields are common to all of the I&C Data Collection Forms.

Simulation - The name of the simulation facility being Facility evaluated.

Reference Plant - The name, unit and docket number of the reference plant.

Date - The date the data is being collected.

Performance Test - The number and/or name of the Reference performancetest in which the component is referenced.

Procedure - The procedure name, number and step in Reference which the component is referenced.

These two references permit the component to be put back into operational context for the evaluation of the significance of any discrepancies found.

Label Name - The exact wording on the label of the component.

Identifier - Any numeric or alphanumeric codes which will aid in the unique identification of the component. Examples include equipment piece numbers and dimensional location measures.

Panel - The panel number and/or name on which the component is located.

Location Within - May include specific measurements or Panel references to appropriate board prints or photographs.

f

O 52

ANNUNCIATOR DATA COLLECTION FORM

Simulation Facility:                     Date:
Reference Plant:
Performance Test Reference:
Procedure Reference:

Feature                   Reference Plant      Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Color

Annunciator Data Field Definitions

Type - The type of annunciator - standard, fire panel, emergency safeguard system.

Color - The color of the annunciator when illuminated.

METER AND DIGITAL DISPLAY DATA COLLECTION FORM

Simulation Facility:                     Date:
Reference Plant:
Performance Test Reference:
Procedure Reference:

Feature                   Reference Plant      Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Parameter
Measured Units
Range
Divisions
Zone Banding and Setpoint Indication

Meter and Digital Display Data Field Definitions

Type - The type of meter (edgewise, rotary) or digital display (electronic counters, LEDs, LCDs, drum counters, printers).

Parameter - The parameter that the display reflects.

Measured Units - The units in which the parameter is reflected on the display.

Range - The range that is displayed (meters and recorders) or that can be displayed (digital displays and CRTs).

Divisions - The divisions in which the parameter is reflected. (In some cases there may be several different divisions used for scaling purposes. In these cases the range over which each division is used should be included with the division.)

Zone Banding and Setpoint Indication - The ranges/points and colors used for zone banding and setpoints displayed.

RECORDER DATA COLLECTION FORM

Simulation Facility:                     Date:
Reference Plant:
Performance Test Reference:
Procedure Reference:

Feature                   Reference Plant      Simulation Facility

Label Name
Identifier
Pen or Point Identifier
Panel
Location Within Panel
Type
Parameter
Measured Units
Range
Divisions
Zone Banding and Setpoint Indication
Pen Color

Recorder Data Field Definitions

Type - The type of recorder - single pen, dual pen, multipen, multipoint.

Pen or Point Identifier - The number or code for the pen or point under investigation.

Parameter - The parameter that the display reflects.

Measured Units - The units in which the parameter is reflected on the display.

Range - The range that is displayed (meters and recorders) or that can be displayed (digital displays and CRTs).

Divisions - The divisions in which the parameter is reflected. (In some cases there may be several different divisions used for scaling purposes. In these cases the range over which each division is used should be included with the division.)

Zone Banding and Setpoint Indication - The ranges/points and colors used for zone banding and setpoints displayed.

Pen Color - The color of the pen under investigation.

INDICATING LIGHT DATA COLLECTION FORM

Simulation Facility:                     Date:
Reference Plant:
Performance Test Reference:
Procedure Reference:

Feature                   Reference Plant      Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Color
Light Legends or Symbols

Indicating Light Data Field Definitions

Type - The type of indicating light - legend, non-legend.

Color - The color of the light when illuminated.

Light Legends or Symbols - The legends or symbols used on indicating lights.

CONTROL DATA COLLECTION FORM

Simulation Facility:                     Date:
Reference Plant:
Performance Test Reference:
Procedure Reference:

Feature                   Reference Plant      Simulation Facility

Label Name
Identifier
Panel
Location Within Panel
Type
Function
Switch Positions or Control Scale
Shape or Color Coding

Control Data Field Definitions

Type - The type of control - discrete rotary (including J-handle, T-handle and star-handle), continuous rotary, toggle switches, thumbwheels, pushbuttons (lighting and non-lighting, legend and non-legend).

Function - The function (valve control - throttle, seal in; breaker, controller) and/or type of activation (as is, spring return, pull to lock) for controls.

Switch Positions or Control Scales - The names of switch positions (discrete controls), ranges for scales (continuous controls) or legends on push buttons.

Shape or Color Coding - The shape or color code used for that control if coding is used.

COMPUTER EVALUATION

Due to the complexity of computers, a form is not appropriate. The following criteria for similarity between the simulation facility and the reference plant should be met, along with the appropriate criteria from the forms for the displays and controls cited.

Input

1. Is the same input device used (e.g., keyboard, control panel)?
2. Are the command sequences the same for calling up given displays, requesting information, running programs and performing calculations?
3. Are the same names used for given displays, points, programs and calculations?
4. Based on the type of input device, are the general requirements for controls met (see the controls data collection form)?

Output

1. Are the same displays available (e.g., CRTs, printers, meters, recorders)?
2. Is the display format the same?
3. Based on the nature of the information displayed and the type of display, are the requirements for meters, recorders and indicating lights met (see the appropriate data collection form)?

O l

APPENDIX C i

l 1

TEST OF THE METHODOLOGY FOR THE SIMULATION FACILITY EVALUATION PLAN '

O 64 O

.=. - - - . - . -. . -. - -- _ - _ _ _ _ _ _ _ _

TABLE OF CONTENTS s-

1. INTRODUCTION............................. 57 1.1 Assumptions Made in the conduct of the Pilot Test................. 67 -

4 1.2 Deviations from the Draft SFEP...... 68 1.3 The ISFET........................... 69 1.4 Kick-Off and Exit Meetings.......... 70

2. PERFORMANCE TESTING...................... 74 2.1 Performance Test Selection.......... 74 2.1.1 LER Events at North Anna....... 74 2.1.2 Emergency Events Not ~

Represented In LERs.......... 76 2.1.3 LER Events From Similar Plants. 77 2.1.4 Common Transients Not Representedby LERS........... 77 2.2 Performance Test Developemnt........ 79 2.2.1 Initial Development............ 79 2.2.2 Revisions To Initial Test Procedures................... 80 2.3 Conduct of the Performance Tests.... 84 2.3.1 Test Panel Hakeup.............. 84 2.3.2 Conduct Of Individual Tests.... 85 2.3.3 Test Sequencing And Duration... 86 2.4 Evaluation of the Performance Tests. 86 2.5 Results of the Performance Tests.... 87 2.5.1 Normal Operations.............. 87 2.5.2 Abnormal Operations............ 88 2.5.3 Emergency Operations........... 90

, 3. PHYSICAL FIDELITY / HUMAN FACTORS.......... 91 1

l 3.1 Off-Site Review and Data Collection. 91 I

3.2 On-Site Review and Data Collection.. 96 i 3.3 Results of the Review............... 96

4. CONTROL CAPABILITIES..................... 98
5. DESIGN,-UPDATING, MODIFICATION AND TESTING............................ 98
6. CONCLUSIONS, OBSERVATIONS
AND RECOMMENDATIONS.................... 99 4

6.1 The SFET............................ 99 65 I . - _ . = - - . , . - - . . - . . - . . - - - -- - - . - - - - - - - -

. - . . -. ... ~ . , . . . , . - . _ _ _

6.2 Performance Testing................. 99 6.3 Physical Fidelity / Human Factors..... 102 6.4 Design, Updating, Modification and Testing....................... 102 6.5 General............................. 102 I

7. SUPPORTING DOCUMENTATION................. 103 O

s 66 O

l

1. INTRODUCTION

, This document discusses the pilot test of the methodology for the draft Simulation Facility Evaluation Plan (SFEP) as of November 1, 1986. The purpose of this test was to determine the usefulness of the methodology as a tool for evaluating the acceptability of simulation facilities consisting solely of plant-referenced simulators in accordance with proposed 10 CFR Part 55 and ANSI /ANS 3.5, 1985, as endorsed by proposed Regulatory Guide 1.149.

The test was conducted at the simulation facility located at the North Anna training center, using North Anna Unit 1 as the reference plant. Both of these facilities are owned and operated by the Virginia Power Company. The test was conducted during the week of November 17, 1986. On-site preparation and data collection for the test was performed the previous week by members of the Interim Simulation Facility Evaluation Team (ISFET).

Due to time and manpower constraints, the actual pilot test deviated, in part, from the methodology identified in the draft SFEP, and certain assumptions were made in order to conduct the pilot test. The primary differences were in the conduct of the Off-Site Review. These deviations and assumptions are described later in this document.

( 1.1 Assumotions Made in the Conduct of the Pilot Test certain assumptions were made in order in order to perform a reasonably representative pilot test. They were
1. That proposed 10 CFR 55 (the Rule) and Regulatory Guide 1.149 had been published.
2. That the Rule had been in effect for at least four years.
3. That Virginia Power (the facility licensee) had completed performance testing for the North Anna simulation facility.
4. That the facility licensee had certified the North Anna simulation facility to the NRC on NRC Form 474.
5. That this was a random audit; that is, that there had been no reports from license examiners or anonymous plant personnel of any problems with the simulation facility.

67

6. That all of the steps that were to be completed prior to the on-site review had been done in accordance with the (draft)

SFEP.

1.2 Deviations from the Draft SFEP Time constraints and logistical problems made it necessary to deviate from certain parts of the methodology given in the draft SFEP. These deviations were primarily made to expedite the off-site data collection (SFEP section 3.0),

and the preparation (SFEP section 4.1) and evaluation (SFEP sections 5.1.1 and 5.1.2) portions of the plan. The major effect of these deviations was that the methodology for performing the off-site review was tested less thoroughly than that for the on-site review. None of these deviations, however, led to the ISFET's inability to conduct the test, or to achieve the intended results of the test.

The following listing describes the deviations from the draft SFEP that were made during the pilot test.

1. The time scale for the activities to be conducted prior to the on-site review was compressed.

Members of the ISFET arrived on-site the week prior to the pilot test and conducted the data collection and preparation for the following weeks review.

They worked closely with facility licensee personnel during this period, and communication was personal and informal rather than the formal correspondence identified in the draft SFEP.

2. As a matter of expediency, and with the full support of the facility licensee's staff, members of the ISFET performed some of the data collection which the draft SFEP indicates should be collected by the facility licensee and provided to the ISFET.
3. An off-site review was not conducted. Since the Rule had not yet been published, much of the data that would have been requested for the off-site review had not been collected or compiled by the <

facility licensee. Given the experience gained as a result of this exercise however, the methodology for the off-site review given in the plan seems reasonable.

4. An intended deviation included in the pilot test, was a performance test based on a generic LER which exercised the Emergency Operating Procedures. This was done to determine if this type of performance test should be incorporated as part of the SFEP, which in draft form, recognizes the use of only reference plant and similar plant LERs for use in 68

performance testing. ,

s

5. Personnel schedules made it necessary to have two

'- different individuals serve as the License Examiner on the ISFET. The first person filled the role during the early planning stages of the review.

The second person filled the role during the final planning stages and the actual conduct of the on-

< site review.

6. Four days were spent at the simulation facility for the on-site review instead of the three prescribed by the SFEP. The extra day was used to perform the initial analysis of the data.

7. The two Operations Specialists did not completely meet the position descriptions given in the draft SFEP. The first was actually an employee of the facility licensee. The second, a peer evaluator from another facility licensee, had extensive operations, training, and simulator testing experience, albeit with BWRs.

8. The performance tests were not developed and documented to the fine level of detail described in the draft SFEP. Due to time constraints, the ISFET adopted a rule of thumb which said: "Develop them to the point at which another ISFET could come in and run them and get the same results." Despite these differences, the basic content of the performance tests was the same as that given in the draft SFEP, and the ISFET was able to successfully conduct and evaluate the tests.

9. The performance tests were not developed to a level of detail which identified the specific I&Cs used. As a result, the Human Factors Specialist asked the reference plant Operations Specialist to identify the I&Cs associated with the critical parameters. These were then used for the I&C Inventory. The Human Factors Specialist verified that these I&Cs were in fact used during the course of the performance tests.

1.3 The ISFET

The ISFET was made up of the following individuals. Except for the cases mentioned in the Deviations section, they met the requirements of the draft SFEP.

Name            Affiliation                                Role
Ron Laughery    Micro Analysis and Design                  Team Leader
Bryan Gore      Battelle - Pacific Northwest Laboratories  License Examiner (Initial Planning)
Bob Gruel       Battelle - Pacific Northwest Laboratories  License Examiner (Pilot Test)
Dave Roessner   Iowa Electric                              Operations Specialist (Peer Evaluator)
Alan Kozak      Virginia Power                             Operations Specialist (Reference Plant Operations)
Chris Plott     Micro Analysis and Design                  Human Factors Specialist

In addition to the members of the ISFET, there were also two NRC observers present during the test of the methodology. They were John Hannon and Jerry Wachtel of the Division of Human Factors Technology, Operator Licensing Branch.

1.4 Kick-Off and Exit Meetings

Briefings were held with members of the ISFET, NRC representatives and facility licensee representatives, both before and after the conduct of the pilot test. In the pre-briefing the content and intent of the pilot test were discussed. In the post-briefing the results of the pilot test and their impact on the SFEP were discussed. Listings of the attendees for both of these meetings are given below.


KICK-OFF MEETING ATTENDEES

Name              Affiliation         Position
Ben DeLamorton    Virginia Power      Supervisor - Training (Simulator)
Larry Edmonds     Virginia Power      Superintendent - Nuclear Training (NAPS)
Anil K. Jain      Virginia Power      Senior Simulator Specialist
Dave Cruden       Virginia Power      Manager - NAPS
Terry Williams    Virginia Power      Manager - Power Training Services
Allan Kozak       Virginia Power      Senior Instructor - Nuclear (Simulator)
David Roessner    Iowa Electric       Senior Simulator Engineer
Jerry Wachtel     NRC/DHFT/OLB        Training and Assessment Specialist
John Hannon       NRC/DHFT/OLB        Exam Development
Ron Laughery      MA&D                President
Chris Plott       MA&D                Human Factors Engineer
Bob Gruel         Battelle-Northwest  Westinghouse Licensing Examiner


EXIT MEETING ATTENDEES

Name                 Affiliation             Position
Ben DeLamorton       Virginia Power          Supervisor - Training (Simulator)
Larry Edmonds        Virginia Power          Superintendent - Nuclear Training (NAPS)
Anil K. Jain         Virginia Power          Senior Simulator Specialist
Dave Cruden          Virginia Power          Manager - NAPS
Terry Williams       Virginia Power          Manager - Power Training Services
Allan Kozak          Virginia Power          Senior Instructor - Nuclear (Simulator)
L. Richard Buck      Virginia Power          Supervisor of Training - Operations
Robert Soderholm     Virginia Power          Instructor - Operations (Surry Simulator)
E. R. Smith, Jr.     Virginia Power          Assistant Station Manager - North Anna
Larry Gardner        Virginia Power          SPS - Training
David Roessner       Iowa Electric           Senior Simulator Engineer
Bill Russel          NRC/DHFT/OLB            Director
Jerry Wachtel        NRC/DHFT/OLB            Training and Assessment Specialist
John Hannon          NRC/DHFT/OLB            Exam Development
Ron Laughery         MA&D                    President
Chris Plott          MA&D                    Human Factors Engineer
Bob Gruel            Battelle-Northwest      Westinghouse Licensing Examiner
Bryan Gore           Battelle-Northwest      Project Manager
Mike Wyatt           INPO                    Senior Program Manager - Simulators
Jean-Pierre Sursock  EPRI                    Program Manager
Bill Gardner         Combustion Engineering  Program Manager - Simulators (SFEP Steering Committee representative)


2. PERFORMANCE TESTING

2.1 Performance Test Selection

At the time of this pilot test the facility had not done its own performance testing, and had not certified the performance of the simulator. Consequently, the performance tests were selected and planned without benefit of a Phase 1 review. Such a review would have facilitated test planning by providing information on tests completed and their results, data available for evaluation of tests developed by the ISFET, and possible deficiencies in simulation facility performance observed during operator licensing exams.

In the absence of information from a Phase 1 review, several factors were used to guide the selection of performance tests. First, it was desired to evaluate as broad a range of simulation capability as possible. Second, it was desired to base as many tests as possible on events for which the facility licensee had plant data to compare against the simulated values of important parameters. This would minimize potential uncertainties in expected simulation facility performance, allowing unambiguous evaluation of test results. Third, it was desired to evaluate the ability of the simulator to support application of the reference plant's emergency operating procedures.

Evaluation of a license candidate's use of EOPs is an important part of the license examination process.

Performance tests were selected from the following sources:

events reported in recent North Anna LERs, emergency events not expected to be represented in LERs for most plants, LER events from "similar" plants, and commonly expected transients not represented by LERs.

2.1.1 LER Events at North Anna

Events reported in LERs were the prime candidates for selection as performance tests, since it was expected that plant data would be available from the facility licensee against which to compare simulation facility results. All North Anna LERs listed in NUREG/CR-2000 between January 1984 and August 1986 (N = 155) were reviewed for applicability. Several events, all from Unit 1, were selected for use in performance testing. These were:

LER 84-014

Reactor trip from 20% power caused by low steam generator level during manual feedwater control by the operator. This system is sensitive at low power levels because small changes in valve position can cause large changes in feedwater flow. Valve leakage can also complicate control.


This event tests the modeling of the control sensitivity of the main feedwater system.

LER 85-017

Dropped control rods at 16% power, not causing reactor trip.

This event tests reactivity modeling of the reactor core, including changes in flux level and profile.

LER 86-002

Reactor trip from 100% power caused by a malfunction of the turbine control. Rapid governor valve closure caused "shrink" in all steam generators, resulting in a SG lo-lo level trip.

This event tests steam generator modeling. It also tests modeling and operation of all normal post-trip control actions, alarms and interlocks.

LER 84-019

Reactor trip from 100% power caused by loss of a 125 VAC vital bus. Reactor trip was due to false indication of loss of an RC pump. Various components tripped, and various indications and controls were lost.

This event tests modeling of the 125 VAC vital electrical system and its interactions with instruments and control systems.

LER 85-019

Natural circulation following a manual trip of all RC pumps. Loss of a 4160 VAC emergency bus at 100% power caused loss of component cooling water to the RC pumps, forcing the operators to trip the reactor and then trip the RC pumps.

This event tests thermohydraulic modeling of RCS flow in the absence of forced flow. It also tests modeling of the 4160 VAC emergency electrical system and its interactions with plant equipment.

The draft SFEP specifies that performance tests should address two normal evolutions, six abnormal evolutions, and two emergencies. Of the five events listed above, the first was considered to represent a normal evolution because feedwater control prior to the trip was the point of interest. The middle three were considered to represent abnormal events since they represented specific plant malfunctions, but in no case was RCS inventory control, pressure control, or the transport of heat from the reactor core jeopardized. The last event was considered to represent an emergency event, due to interruption of forced reactor coolant flow.

2.1.2 Emergency Events Not Represented In LERs

As was expected, review of the North Anna LERs yielded no major emergency events, e.g., small break LOCA exceeding makeup capability, unisolable steam line break, total loss of main and auxiliary feedwater, or steam generator tube rupture. Due to the importance of these events to operator licensing examinations, the ISFET decided to select one as a performance test, while cognizant of possible difficulties in evaluating test results.

For several reasons, a steam generator tube rupture was selected for the performance test. The most important reason is that it addresses the greatest variety of physical phenomena, and thus exercises the broadest range of simulation facility performance. This transient involves the initiation of safety injection, the inflow of primary coolant into the secondary system (steam generator), RC pump trip and the establishment of natural circulation, the formation of a steam bubble in the reactor vessel head, and subsequent collapse of the steam bubble on restart of an RC pump. It is the most complex of the SBLOCA events in that it involves mass transfer between the primary and secondary cooling systems. With regard to the other emergencies, a total loss of feedwater event (main and auxiliary) is dealt with procedurally by opening the PORV and using HHSI cooling, thus turning the transient into a SBLOCA. A main steam line rupture is important, from the standpoint of simulator modeling, in respect to recriticality caused by reactivity addition from the moderator temperature coefficient of reactivity. However, MTC effects can be verified in other, less dramatic transients. Thus, the escalating SGTR event is a logical choice for evaluating the simulation facility's performance on emergency events.

One additional reason supports the choice of a SGTR event to represent the major emergencies. On January 25, 1982, a major SGTR event occurred at the Westinghouse-designed Ginna reactor, which progressed through the entire spectrum of phenomena described above. This event is thus clearly relevant to the Westinghouse North Anna plant, even though North Anna has three RCS loops compared to two at Ginna.

Even though detailed comparison of parameter values is probably neither possible nor necessary, the detailed chronology of the Ginna event identified important trends, alarms and automatic actions which allowed a qualitative evaluation of the North Anna simulation facility's ability to reproduce a corresponding event.


2.1.3 LER Events From Similar Plants

The two-unit Westinghouse plants at Surry and Turkey Point are all of the three-loop design used at North Anna. LERs published between January and August 1986 for Surry (N = 56) and Turkey Point (N = 62) were reviewed to identify additional events of relevance. As was expected, several events similar to North Anna events were identified, including reactor trips due to problems with feedwater control, turbine control, dropped control rods and electrical power supply. One of these was unique in that it resulted in a trip on high RCS pressure. It was therefore selected for the pilot test.

Turkey Point Unit 4 LER 85-017

Loss of a 125 VAC vital power inverter failed nuclear instrumentation and pressurizer level indication. Pressurizer spray failed, pressurizer heaters interlocked off, and letdown isolated. One PORV had been previously blocked due to leakage, and the other failed to automatically open on high RCS pressure. The reactor tripped on high pressure due to the failure of either PORV to operate.

2.1.4 Common Transients Not Represented By LERs

Three events were selected for performance testing as representatives of significant transients which might be survivable without reactor trip, and hence might not be found in LERs. These events were:

Load rejection at maximum rate (200%/min) from 100% to 50% power.

Trip of one main feedwater pump at 100% power.

Trip of one RC pump at 28% power (just below the reactor trip setpoint).

These events were selected because they introduce significant perturbations into the primary and secondary coolant systems. Thus, they test the modeling of the various instrumentation and control systems which respond, bringing the systems back into balance so that trip setpoints are not exceeded.

As discussed above, these tests were selected to address as broad a range of simulator modeling as possible, since results of simulation facility testing were not available. The tests were chosen to utilize actual plant data for evaluation wherever possible. These tests were believed to be representative of those which would normally be performed by the facility licensee in evaluating and certifying the performance of the simulation facility.

The load rejection event was initially classified as a normal event, since Westinghouse data show that their plants are designed to survive it without tripping. (It was later reclassified as an abnormal event upon the recommendation of the facility licensee.) The other events were classified as abnormal events. With these events, then, the desired complement of two normal, six abnormal, and two emergency events was achieved for performance testing.

The tests initially selected for performance testing and their classifications were:

NORMAL OPERATIONS

1. LOAD REJECTION - 200% PER MINUTE
2. SG LEVEL CONTROL 15-25% POWER (LER 84-014)

ABNORMAL OPERATIONS

3. MFP TRIP - 100% POWER
4. DROPPED CONTROL RODS - 16% POWER (LER 85-017)
5. RCP TRIP - POWER < REACTOR TRIP SETPOINT
6. TURBINE CONTROL MALFUNCTION (LER 86-002)
7. INVERTER LOSS, HIGH P TRIP (TURKEY POINT LER 85-017)
8. 125 VAC LOSS (LER 84-019)

EMERGENCY OPERATIONS

9. NATURAL CIRCULATION (LER 85-019)
10. LARGE STEAM GENERATOR TUBE RUPTURE

During test development, two of these tests were dropped and replaced, as discussed in Section 2.2. These decisions were made primarily on the basis of data availability for test evaluation. In retrospect, the logic of this test selection process was reasonable and appropriate.


2.2 Performance Test Development

Initial conditions for the tests, malfunction input information, and required operator actions were specified based upon LER information, where available, and otherwise upon the conditions desired for the test. Qualitative expectations of parameter changes and the automatic actions of plant control systems were specified based upon plant design and knowledge of system interactions.

Test development was performed in two phases. The first phase was carried out before travel to the facility. The initial selection of performance tests was done two and one-half weeks before testing was scheduled to begin, via a conference telephone conversation between personnel at PNL, MA&D and NRC. During the next week and a half, PNL license examiners developed preliminary test procedure content for eight of the initially selected tests.

The second phase of test development was carried out at the North Anna site during the week before the actual pilot test. There, PNL personnel joined the ISFET members from North Anna, Iowa Electric, and MA&D to finalize the test procedures. Eight test procedures were finalized prior to initiation of pilot testing.

2.2.1 Initial Development

The initial development of the test procedures was based primarily on generic knowledge of Westinghouse systems and control system design, since timing precluded shipment and/or study of plant-specific information. Draft emergency operating procedures from North Anna were available at PNL from work on a different project, however, and they provided information useful in the planning process. Development of the SGTR test scenario was based on information from the extensive analysis of the Ginna event contained in NUREG-0909 (1982).

The initial test procedure development process focused on specifying not only scenario initial conditions, malfunction inputs, and operator actions, but also the best estimates of the expected responses of the simulation facility.

Knowledge of Westinghouse operating and control system designs was used to identify expected trends in critical parameters and the resulting automatic control and interlock actions.

The test procedures initially developed thus included lists of expected observations to be verified during test performance. This information proved to be very helpful during the actual pilot test. It facilitated understanding of the evolution of the transient and helped confirm that the test was proceeding as expected. It also allowed real-time initial evaluation of overall simulation facility performance.

The incorporation of lists of expected observations into the test procedures greatly facilitated the evaluation process.

Essentially all findings reported from these tests were initially identified during test performance by the use of these lists. Thus, careful pre-test development of expected results can minimize the time required for post-test performance evaluation.
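To illustrate the structure just described, the following is a minimal sketch, in Python, of how a test procedure step carrying its list of expected observations might be represented and checked off during test performance. The language, names, and the example step are hypothetical illustrations, not part of the SFEP or the actual test procedures.

    # Sketch: a test procedure step with expected observations,
    # checked off as the test runs. All names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class ProcedureStep:
        action: str                       # action to perform
        expected: list                    # observations to verify
        verified: dict = field(default_factory=dict)

        def check(self, observation, seen):
            # Record whether an expected observation was actually seen.
            if observation not in self.expected:
                raise ValueError("not an expected observation")
            self.verified[observation] = seen

        def discrepancies(self):
            # Expected observations that were checked but not seen.
            return [o for o, seen in self.verified.items() if not seen]

    step = ProcedureStep(
        action="Insert turbine control malfunction at 100% power",
        expected=["Reactor trip on SG lo-lo level", "Steam dumps open"])
    step.check("Reactor trip on SG lo-lo level", True)
    step.check("Steam dumps open", True)
    print(step.discrepancies())   # [] when all expectations are met

A structure of this kind makes the list of discrepancies fall directly out of the completed check-off sheet, which is the effect the ISFET observed with its paper procedures.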

2.2.2 Revisions To Initial Test Procedures

Upon arrival at the North Anna site, the initial test procedures were reviewed with an SRO-licensed member of the training staff who was included in the ISFET. His input was very important in confirming expected trends, correcting misinterpretations, and adding plant-specific information.

At this time, existing plant data for events reported in the LERs, as well as for non-LER events selected for performance tests, was gathered and reviewed. As discussed below, several changes were made in testing plans due to inadequacies in the available data. It is clear that, as called for in the SFEP, data should be acquired from the facility licensee prior to starting test procedure development.

Personal interactions between the ISFET and facility personnel were quite helpful to communications in both directions. They helped inform the facility licensee just what information was needed, and they showed the ISFET what was available. One lesson learned from this effort is that a personal visit by a member of the ISFET to the facility licensee should accompany the request for plant data, both to ensure effective communications, and to minimize false starts and wasted effort by both ISFET and facility licensee personnel.

Several important changes were made during planning work at North Anna.

1. The attempt to develop a test of manual feedwater flow control sensitivity at low power levels, LER 84-014 (Section 2.1.1.1), was abandoned. There were several reasons for this decision, but the primary one was the inability to develop a reproducible test. The LER resulted from the difficulty of manually controlling feedwater flow, and it was felt impractical to define feedwater control manipulation directions leading to repeatable actions. In addition, there was no information in the LER packet at the plant from which to determine the manual actions associated with the LER. Finally, there was very little data in the LER packet against which to correlate plant response.

2. The test involving the trip of one main feedwater pump at 100% power (Section 2.1.4.2) was deleted. It was felt that the maximum rate runback test adequately covered most of the modeling addressed by this test. An additional consideration was that no plant data existed for evaluation of the results of such a test, whereas plant data did exist for the runback test.

3. A performance test in the normal operations category was selected to replace the deleted feedwater flow control test. This test was a straightforward performance of the surveillance procedure 1-PT-71.1, Steam Driven Auxiliary Feedwater Pump (1-FW-P-2) And Valve Test, to be evaluated by comparison of simulation facility surveillance results with recent plant surveillance data. No "procedure" was developed for this test other than the actual plant surveillance procedure.

4. A performance test in the abnormal operations category was developed to replace the deleted main feedwater pump trip test. This test was developed from a recent plant LER for which considerable data was available in the plant LER package.

LER 86-006

Reactor trip from 100% power caused by closure of the B Main Steam Trip Valve (MSIV). Safety injection was automatically initiated due to high steam flow coincident with low steam pressure in the unaffected lines.

This event tests steam system modeling, steam pressure control, and safety injection initiation logic.

5. The test based on the Turkey Point LER (Section 2.1.3.1) was determined to be of marginal utility for evaluation of the North Anna simulator. Little correlation was found between the instruments and controls failed by the inverter loss at Turkey Point and the effects of inverter loss at North Anna. Although a test could be constructed based on loss of the same equipment at North Anna, the Turkey Point LER data packet was not available to the ISFET, so no data were available for evaluating the test results. The ISFET agreed to leave this test for last, because it was felt that not all tests might be completed in the sixteen-hour time period allotted. No formal test procedure was developed, but it was planned to replicate the equipment failures identified in the LER and verify that RCS pressure increased to the reactor trip setpoint. Ultimately, time constraints prevented this test from being performed.

6. The SGTR performance test was modified to introduce stepwise escalation of the leak to allow assessment of flow balances for leaks within the capacity of the charging system with and without letdown.
7. A list of critical parameters was developed from the review of the performance tests which had been developed. This list contained approximately thirty parameters, most of which were either plotted on chart recorders or could be printed out by the simulation facility. It was decided to record all of these data for all performance tests, since it was simpler to acquire more data than necessary than to alter the data-logging programming between tests. In addition to this list, any information to be logged by observers was included for the test when necessary.

With the incorporation of these seven changes, draft test procedures were formalized and written up into proper procedure format. This required a significant amount of work which could not be delegated to secretarial staff, because considerable development work was done in the process and because test development had been carried out using a personal computer word processing program. This allowed preliminary drafts brought from PNL to be finalized at the facility. As revisions and extensions were made, they were recorded directly into the evolving test procedure.

Formalization of the test procedures prior to the actual pilot testing proved to be very desirable. It allowed all members of the ISFET as well as the simulator operators to work from uniform, clean, hard copy. This helped minimize confusion and ambiguity in the test performance phase.

Copies of these test procedures in the format used by the ISFET are presented in Appendix A. Although not editorially perfect, they functioned well. In addition, when used as a check-off sheet during the testing phase, they provided an immediate record of actions taken and observations recorded.

As has been noted, essentially all of the findings reported from these tests were initially identified during test performance. Thus, this pre-test development was quite important to successful, efficient test performance and evaluation.


Performance Tests Developed For The North Anna Simulation Facility

NORMAL OPERATIONS

1. SURVEILLANCE OF STEAM DRIVEN AUX FEEDWATER PUMP

ABNORMAL OPERATIONS

2. 50% LOAD REJECTION - 200% PER MINUTE
3. MSIV CLOSURE - 100% POWER (LER 86-006)
4. DROPPED CONTROL RODS - 16% POWER (LER 85-017)
5. RCP TRIP - 28% POWER
6. TURBINE CONTROL MALFUNCTION (LER 86-002)
7. 125 VAC LOSS, STUCK MFW VALVES (LER 84-019)
8. INVERTER LOSS, HI P TRIP (TURKEY POINT LER 85-017)

EMERGENCY OPERATIONS

9. NATURAL CIRCULATION (LER 85-019)
10. STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE


2.3 Conduct of the Performance Tests

Nine performance tests were run sequentially during two eight-hour shifts. Only the test based on the Turkey Point LER was omitted. Test performance took the entire time allotted, with essentially no breaks except for simulator set-up and pre-test review of procedures and expectations.

2.3.1 Test Panel Makeup

The tests were conducted by members of the ISFET and facility staff. ISFET members and their contributions were:

1. An operator licensing examiner knowledgeable in Westinghouse systems and procedures. He functioned as performance test director (as appointed by the Team Leader), maintained real-time cognizance of the progress of the test, verified and checked off expected simulation facility performance as indicated in the test procedures, and determined when test procedure steps had been satisfied so that continuation to succeeding steps was warranted.
2. An operations specialist with specific knowledge of facility systems and procedures, and of the simulation facility capabilities and operations. He was an SRO-licensed member of the facility licensee's training staff. He worked with the test director to ensure correct performance of the test, and provided verification of test progress and results. He also functioned as a panel operator, performing operator actions as required by test or operating procedure. He also provided interface and direction to the two other facility licensee personnel assisting in the test performance: a simulator operator, and a second panel operator.
3. An operations specialist experienced in simulator operation and testing, and in BWR operations and training. He contributed to data gathering and analysis, and to resolution of questions addressing the interface between the simulation facility and the tests.
4. Two human factors specialists, one of whom was the team leader for the project. They contributed to the gathering of data which could not be automatically recorded by the simulator (and, of course, to non-performance test parts of the simulation facility pilot test).

5. A second operator licensing examiner who had developed the initial drafts of the performance tests, and who had contributed to their finalization at the facility, made additional contributions to test tracking and data gathering.

6. Facility licensee personnel who participated in test performance were the trainer/member of the ISFET, a panel operator from a similar plant owned by the same utility, and a simulator operator who was employed by Virginia Power.

This test panel makeup proved adequate for all phases of the performance tests.

2.3.2 Conduct of Individual Tests

The performance tests were conducted sequentially. Before each test the ISFET test director reviewed the test procedure and its objectives with the other members of the ISFET.

As was discussed above, an inclusive list of critical parameters had been developed during test planning, and data were automatically charted or printed out for all parameters during each test. This minimized set-up time between tests. However, not all data of interest could be printed out (e.g., pressurizer heater status). As was appropriate to each test, ISFET members were assigned to log needed data. In addition, observation plans were made so that any deviations of simulator performance from that expected would be recognized at the time and the need for additional testing evaluated. These observations proved to be of central importance to the ISFET's ability to provide meaningful evaluations of the test results on-site.

While the ISFET was conducting the pre-test briefing, the simulator operators input the relevant initialization conditions and marked recorder charts. During the test, data-logging intervals were changed as required to ensure adequate data during rapidly changing conditions, yet minimize data records during slow-moving evolutions.
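One simple way to implement this kind of variable-interval logging is to key the interval to the observed rate of change of a critical parameter. The sketch below is a hypothetical illustration; the thresholds and intervals are invented, not taken from the pilot test.

    # Sketch: choose the next data-logging interval (seconds) from the
    # rate of change of a parameter, in percent of span per second.
    # Thresholds and intervals are hypothetical.
    def logging_interval(prev_value, curr_value, dt):
        rate = abs(curr_value - prev_value) / dt
        if rate > 1.0:       # rapidly changing conditions: log densely
            return 1.0
        elif rate > 0.1:     # moderate transient
            return 5.0
        else:                # slow-moving evolution: log sparsely
            return 30.0

    print(logging_interval(52.0, 48.0, 2.0))   # fast change -> 1.0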

As each test was performed, the ISFET members verified the general performance of the simulation facility against the expected results, which were listed in the test procedure. At times simulation was frozen to ensure understanding of developments, and also to allow checking of load lists following electrical failures. Many questions were raised and discussed, but not all could be answered before the ISFET left the site. This was particularly the case during tests which were not replications of LER events. Overall, this process proved quite successful in providing the ISFET with confidence that the simulation facility was performing generally as expected. It also demonstrated the effectiveness of the testing procedure by identifying a few discrepancies between expected results and the results of simulation, as will be discussed in Section 2.5.

After each test, recorder charts were marked and data-logging printouts were collected. Data and notes taken by the observers were also collected.

2.3.3 Test Sequencing and Duration

The test sequence was selected to begin with the more straightforward tests, yet also to ensure that the more complex and significant tests were accomplished. For this reason the two emergency operations tests were scheduled at the beginning of the second day of testing. The test sequence scheduled for the first day was (numbers are from the above listing of tests developed) 2, 4, 5, 6, 1, and 3.

These tests were all completed.

The tests scheduled for the second day were 9, 10, 7, and 8.

The first three of these were completed.

On the basis of these results the ISFET concluded that the two-day period of testing is reasonable for the performance of ten meaningful tests. This assumes some improvement in efficiency on the part of the team as additional simulation facilities are tested.

2.4 Evaluation of the Performance Tests

The most significant evaluation of simulation facility performance took place during test performance itself. As has been discussed, the test procedures developed prior to testing identified important responses expected from the simulation facility, including automatic actions and critical parameter trends caused by inter-system interactions. These actions and trends were verified as much as possible during test performance. Verification of electrical modeling was also performed during testing, by freezing simulation after electrical supply loss and comparing load lists with indications of failed equipment.

All reported deficiencies identified in the performance of the North Anna simulation facility were discovered during performance testing, as opposed to during post-test data review. The incorporation of all available information on expected simulation facility response ensured that clearly successful and unsuccessful performance would be identified during testing. The one-day review of the test data provided confirmation of the observations made during testing. It also allowed verification that the traces of critical parameters obtained from the simulation facility were consistent with those from actual plant data, where available. However, detailed plant data for critical parameters was available for only three events, even though five of the events were based on North Anna LERs. Potential discrepancies between expected simulation facility performance and observed responses were noted; however, no additional conclusions were substantiated. A more detailed data review may have helped resolve some of them. For others, additional or repeated simulation might have been required. For yet others, perhaps no answers short of comparison with plant data or engineering calculations would have been found.

From these results it is clear that careful pre-test development of test procedures, including listings of expected simulation facility responses, is a significant factor in simulation facility performance evaluation. In addition to expediting the evaluation of simulation facility performance, it also ensures the separation of expectations and results.
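Where plant data existed, the comparison of simulator traces against plant traces amounts to a point-by-point tolerance check. The following is a minimal sketch of such a check; the 2% tolerance and the sample values are hypothetical, and a real evaluation would also weigh timing, trends, and inter-parameter relationships.

    # Sketch: flag points where a simulated trace deviates from plant
    # data sampled at the same times by more than a fractional
    # tolerance. Tolerance and values are hypothetical.
    def trace_discrepancies(sim, plant, tolerance=0.02):
        return [i for i, (s, p) in enumerate(zip(sim, plant))
                if p != 0 and abs(s - p) / abs(p) > tolerance]

    sim_trace   = [560.0, 558.5, 549.0, 530.0]   # e.g. Thot, deg F
    plant_trace = [560.0, 559.0, 552.0, 545.0]
    print(trace_discrepancies(sim_trace, plant_trace))   # -> [3]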

2.5 Results of the Performance Tests

This section presents a brief description of the successes and deficiencies identified for each of the tests which was performed. It should be noted that a result of "none found" in the deficiency category does not necessarily guarantee that no deficiencies exist. It only indicates that no deficiencies were identified based on the data available and the criteria developed.

2.5.1 Normal Operations

SURVEILLANCE OF TURBINE DRIVEN AUX FEEDWATER PUMP

SUCCESS: All steps of this test were conducted in accordance with the reference plant procedure. The AFW system was lined up, started and operated. All valves procedurally designated for operation from the control room were operated. Pump output flow and pressure were as expected.

DEFICIENCY: Two valves failed to stroke within 10% of the stroke time measured in the reference plant (required in accordance with ANSI/ANS-3.5-1985, Section 4.1). One was fast, and one was slow. However, neither was outside of the performance band allowed for the reference plant system by procedure criteria. The facility licensee was unaware of the problem.

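The 10% stroke-time criterion reduces to a simple ratio comparison. A minimal sketch of the check follows; the example times are hypothetical, while the criterion itself is the one cited above from ANSI/ANS-3.5-1985, Section 4.1.

    # Sketch: is a simulator valve stroke time within 10% of the
    # stroke time measured in the reference plant? Example times
    # are hypothetical.
    def stroke_time_ok(t_sim, t_ref, tolerance=0.10):
        return abs(t_sim - t_ref) <= tolerance * t_ref

    print(stroke_time_ok(t_sim=11.5, t_ref=10.0))   # False: 15% slow
    print(stroke_time_ok(t_sim=10.8, t_ref=10.0))   # True: within 10%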

2.5.2 Abnormal Operations

50% LOAD REJECTION - 200% PER MINUTE

SUCCESS: Steam pressure was appropriately controlled by steam dump valves. Steam generator level remained above the trip setpoint. Pressurizer level and pressure were controlled by heaters and spray within appropriate values. The reactor did not trip.

DEFICIENCY: Runback of control rods did not follow the Tave-Tref program. Runback decreased from maximum speed sooner than expected. The facility licensee was aware of the problem and had plans to remedy it.

MSIV CLOSURE - 100% POWER (LER 86-006)

SUCCESS: Reactor and turbine tripped as expected. Post-trip response of all parameters was appropriate.

DEFICIENCY: Safety injection was not initiated, as had occurred in the LER. Steam pressure failed to drop sufficiently. The facility licensee was aware of the problem and had plans to remedy it.

DROPPED CONTROL RODS - 16% POWER (LER 85-017)

SUCCESS: Rod bottom lights lit, and rod position indicators indicated that the selected rods had dropped. As in the LER, the reactor did not trip on negative flux rate, and reactor power decreased 2%. As was expected (due to control rod positions), imbalance became less negative (by 0.5%), although no LER data were available for comparison.

DEFICIENCY: None found.


RCP TRIP - 28% POWER

SUCCESS: All changes were as expected. The flow decrease in the C loop and flow increases in the A and B loops were appropriate in speed and magnitude. Thot and Tave in the A and B loops increased appropriately. Tcold in the C loop agreed with Tcold in the other loops, and C loop Thot dropped below Tcold while remaining above Tsat in the C steam generator, indicating appropriate reverse flow in the C loop. Steam flows from the A and B SGs increased, and flow from the C SG decreased. Steam header pressure decreased, with steam pressure higher in the A and B SGs than in the C SG.

DEFICIENCY: None found.

TURBINE CONTROL MALFUNCTION (LER 86-002)

SUCCESS: As in the LER, the reactor tripped on steam generator low-low level. Post-trip behavior of all parameters agreed extremely well with LER data.

DEFICIENCY: None found.

125 VAC BUS LOSS (LER 84-019)

SUCCESS: As in the LER, loss of power to the components and instruments served by Bus 1-III resulted in a reactor trip on an indicated (false) loss of the C RC pump. An audit of the loads listed in the LER showed correct indications of power loss to all but one load.

DEFICIENCY: Boric acid transfer pump 2A could be transferred from slow to fast speed, although the load list for Bus 1-III indicated that this transfer should be disabled by bus loss. The facility licensee was unaware of this problem.

INVERTER LOSS, HI P TRIP (TURKEY POINT LER 85-017)

This scenario was not run due to lack of time.

2.5.3 Emergency Operations

NATURAL CIRCULATION (LER 85-019)

SUCCESS: Loss of the 14J-4 motor control center initiated this transient. All loads of this MCC which were sample audited (35 out of 83) failed upon loss of the MCC. RC pump bearing temperatures increased as expected after loss of component cooling water flow. After RC pump trip, natural circulation indications developed appropriately. After start of the A RC pump, reverse flow indications developed in the B and C loops.

DEFICIENCY: Recirculation valves for the condensate pumps, the heater drain pumps, and the main feedwater pumps were modeled for automatic operation in the simulation facility, whereas in the plant there are also isolation valves which are manually closed or throttled. The facility licensee was unaware of this problem.

STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE

SUCCESS: During the leak escalation phase, primary system flow balances (charging plus seal injection equals letdown plus leak) were satisfactory for various leak rates. When the scenario was rerun with a large leak followed by opening of the PORV, rapidly increasing pressurizer level indicated establishment of a steam bubble in the reactor vessel head. During the decrease of pressure in the RCS, safety injection flow appropriately increased as pressure decreased. After restart of an RC pump, variations of pump motor current indicated possible two-phase flow. Subsequently, decreasing pressurizer level indicated the expected condensation of the steam bubble.

DEFICIENCY: None found.
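The flow balance verified during the leak escalation phase is a straightforward conservation-of-mass check on the RCS. A minimal sketch follows, with hypothetical flow rates in gpm; the 5 gpm acceptance band is likewise an invented illustration, not a test criterion.

    # Sketch: RCS inventory balance during an escalating SGTR leak.
    # With pressurizer level stable, charging plus seal injection
    # should equal letdown plus leak. Flow values are hypothetical.
    def net_rcs_inflow(charging, seal_injection, letdown, leak):
        return (charging + seal_injection) - (letdown + leak)

    err = net_rcs_inflow(charging=90.0, seal_injection=8.0,
                         letdown=75.0, leak=23.0)
    print(abs(err) < 5.0)   # True within a hypothetical 5 gpm band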


3. PHYSICAL FIDELITY/HUMAN FACTORS

3.1 Off-Site Review and Data Collection

As described in the SFEP, the first step for the evaluation of the human factors/physical fidelity was to review the Human Engineering Discrepancies (HEDs) that were generated as a result of the Control Room Design Review (CRDR). The HEDs were reviewed by the Human Factors Specialist to identify control room data and characteristics which could be of use in evaluating the comparability of the simulation facility to the reference plant.

The HEDs reviewed were those which were submitted to the NRC as part of the Summary Report for the CRDR. In general, these were abbreviated versions of the original HEDs and did not contain sufficient detail. As a result, while the HEDs contained some useful information, in many cases it was not complete enough for use in the review. In these cases the HEDs were flagged for further investigation.

The Summary Report for the CRDR also described the data collection methodology used for collecting control room environmental data. While in most cases the actual data was not reported, these descriptions did allow the Human Factors Specialist to identify environmental data which could be requested from the facility licensee.

The Summary Report also included the checklist which was used for evaluating the control room during the CRDR. From this, the Human Factors Specialist was able to identify characteristics of the control room for which an HED may not have been written but which would be of interest for this review. For example:

1. The types of communications systems used by the control room operators were identified.

2. The fact that color coding schemes were in use for background shading, annunciators and controls was ascertained.

3. The fact that the auditory signals used in the control room were appropriately coded was indicated.

As part of the CRDR, a task analysis of the operator actions required for mitigating emergency events was conducted. In conducting this analysis, the instruments and controls (I&Cs) required to perform these actions were identified. The I&Cs were identified from the perspective of "what's needed" in the control room as opposed to "what exists" in the control room. From this analysis HEDs were written which addressed the following types of problems:


1. I&Cs which are needed in the control room but do not currently exist.
2. I&Cs which exist in the control room but are not in the proper location.
3. I&Cs which exist in the control room but do not reflect parameters as required (displays) or perform control functions as required (controls).

Since the I&Cs identified in these HEDs (particularly those for the latter two types) are considered to be significant to emergency operations, they make a good sample for the I&C Inventory. This would be true whether or not any modification is actually made to the reference plant as a result of the HED. For this review, a sampling of I&Cs for each of these types of HEDs was made. This easily resulted in selecting over 50 separate I&Cs.

The Summary Report for the CRDR and the HEDs included in it seem to be a good starting point for identifying physical characteristics of the simulation facility to be reviewed.

As a result of this initial review, the following characteristics were identified for further investigation:

1. Drawings of the control room layout were obtained.
2. Data from the control room lighting survey were obtained.
3. The fact that color coding was in use for annunciators, controls, background shading and indicating lights was determined.
4. Panels which are difficult to see or reach in the control room were identified.
5. The communications systems used in the control room were identified.
6. The alarms and auditory signals audible in the control room were identified.
7. A sample of over 50 separate I&Cs which could be used for the I&C Inventory were identified.

At this point in the human factors/physical fidelity review, the SFEP indicates that any additional information needed be requested from the facility licensee. Due to the abbreviated pilot test schedule, this was done by having the Human Factors Specialist sort through the available data himself, with the permission of the facility licensee.

Two observations were made as a result of this exercise.

First, the data itself was not necessarily complete or well organized. (In the case of the simulation facility that was the subject of this pilot test, the data was kept by the contractor who had performed the CRDR, not by the facility licensee.) In some cases the data was collected and not retained. This was particularly true for those items for which the collected data was found to be acceptable. For example, if the meaning of colors used on indicating lights was found to be acceptable in accordance with the CRDR criteria, the data collected for color meaning was not retained. In other cases, poor organization of available data required considerable searching to find what was needed. If these kinds of problems are typical of the industry, they could result in the facility licensee sending incorrect data or no data at all in response to the SFET's request.

The second observation is that having the human factors specialist go through the data may result in obtaining the best information available to meet the needs of the SFET. This process can result in the identification of additional useful information that was not apparent from the HEDs provided in the Summary Report. In this investigation, for example, while searching through the original HEDs the human factors specialist found some drawings and listings of I&Cs that were not logically laid out. While the HEDs for these had been included in the Summary Report, the actual drawings and listings had not, and there was no indication that they existed.

The SFEP indicates that at this point the data received from the facility licensee should be reviewed to determine if it is in compliance with the requirements of the Standard. This step was not conducted for this review because simulation facility data was not yet required to be collected.

The next step in the SFEP was to prepare for the on-site review. Most of this preparation was completed during the activities discussed above. The only thing yet to be prepared was the I&C Inventory. There were two possible methods for selecting I&Cs for the inventory evaluation. The first was to select the I&Cs associated with the critical parameters included in the performance tests to be conducted, if any. The second was to select the I&Cs based on the HEDs and on other sources, such as those identified as critical by Regulatory Guide 1.97 (Revision 3, May 1983). Since performance tests were conducted for this review, the I&Cs were selected using the first method. It should be noted that the SFEP recognizes the possibility of conducting a human factors/physical fidelity review even if performance tests are not required. In such cases, the second method stated above would be used.

Once the performance tests were developed and the critical parameters identified, 100 I&Cs were selected for evaluation. This was done by the Human Factors Specialist and the reference plant Operations Specialist. The Operations Specialist identified all of the controls, displays and annunciators associated with each of the critical parameters. The Human Factors Specialist then selected a sample of 100 I&Cs from those identified. An effort was made to select those which were used most in monitoring and controlling the parameters. An effort was also made to maintain the same relative proportions of each type of I&C in the sample as there were in the original set identified.
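The proportional selection described here amounts to stratified sampling of the I&C population by type. The following is a minimal sketch of that selection; the population counts are hypothetical illustrations and do not reflect the actual North Anna inventory.

    # Sketch: draw a fixed-size sample of I&Cs while preserving the
    # relative proportions of each type. Population counts are
    # hypothetical.
    import random

    def stratified_sample(population_by_type, sample_size):
        total = sum(len(v) for v in population_by_type.values())
        sample = {}
        for ic_type, items in population_by_type.items():
            n = round(sample_size * len(items) / total)
            sample[ic_type] = random.sample(items, min(n, len(items)))
        return sample

    population = {
        "controls":     ["C-%d" % i for i in range(120)],
        "displays":     ["D-%d" % i for i in range(160)],
        "annunciators": ["A-%d" % i for i in range(40)],
    }
    sample = stratified_sample(population, 100)
    print({t: len(v) for t, v in sample.items()})  # about 38/50/12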

At this point the preparation for the on-site review was complete.

3.2 On-Site Review and Data Collection

The on-site review for human factors/physical fidelity was conducted in accordance with the methodology described in the SFEP. Data was first collected at the simulation facility and then verified in the reference plant. Any discrepancies from the Standard that were identified were then discussed with the facility licensee to determine if they had been addressed. They were then classified as being 1) in the process of modification; 2) determined, by the facility licensee, to not have an impact on training; or 3) not addressed by the facility licensee.

The only variation from the plan was that some of the information obtained for the evaluation was volunteered by the control room operators while the ISFET was in the control room conducting the review. Specifically, there were a variety of control room noises that could be heard under certain conditions in the reference plant but were not simulated in the simulation facility. While this kind of information may not always be available to a Simulation Facility Evaluation Team (SFET), it should be incorporated into the review when it is.

3.3 Results of the Review

Once any discrepancies had been identified and discussed with the facility licensee, the Human Factors Specialist and the License Examiner evaluated them to determine their impact on the conduct of a licensing examination.

The following items were found to be discrepant from the Standard and had not been addressed by the facility licensee. The facility licensee indicated that these discrepancies would be resolved.

1. Discrepancy - Out of 36 annunciators which are backlit red in the control room to indicate immediate operator action, one was found not to be backlit red in the simulation facility (panel 21B, annunciator H1, "PRZ RELIEF TK HI TEMP").

Assessment - This could have an impact on the conduct of a licensing examination since it is a legitimate cue which is not being presented to the operator.

2. Discrepancy - The recorder which records T-HOT/T-COLD for Loop 3 has a range of 0-700 in increments of 10 in the reference plant and a range of 0-5 in increments of 0.1 in the simulation facility.

Assessment - The other two loops have the proper range displayed so that this would have a minimal impact on the conduct of a licensing examination.

3. Discrepancy - The selector switches for the red and blue pens for the NIS recorder (NR-45) indicate that the Delta Flux for channels I and III may be recorded with the red pen and channels II and IV with the blue pen.

In the simulation facility this is interchanged, so that the selector switch for the red pen indicates channels II and IV and the selector switch for the blue pen indicates channels I and III.

Assessment - There is redundant indication nearby, so there would be minimal impact on the conduct of a licensing examination.

4. Discrepancy - Of the 50 meters sampled, 23 had zone banding and 18 of those were found to be discrepant.

The discrepancies consisted mainly of missing banding (usually the low end of the scale) and mismatches for the starting points of the banding. These starting point mismatches seemed to be systematic in that the zone banding in the simulation facility started right at a setpoint while the zone banding in the reference plant started a few divisions before the setpoint. The facility licensee indicated that this may be a result of a recent modification to the plant which may have changed the setpoints.

Assessment - This could have an impact on the conduct of an examination since the zone banding allows the operator to easily scan the boards to determine if a parameter is out of range.


5. Discrepancy - Out of the 100 I&Cs sampled, 5 had missing labels or labels with misspellings.

Assessment - The impact of these missing or misspelled labels on a licensing exam would be minimal.

The following items were found to be discrepant from the Standard, but are being addressed by the facility licensee.

1. The lighting level and distribution for both the normal and emergency lighting are not the same in the reference plant and the simulation facility, nor are the glare reduction features used in the reference plant used in the simulation facility. The facility licensee has determined through a training value assessment that these differences have minimal impact.
2. The annunciator alarm for the back panels is not simulated. The facility licensee has a modification to the simulation facility in progress to correct this.
3. The following control room noises are not simulated:

Currently being assessed:
  Feedwater reg valves for unit 2
  Feedwater lifter relief valves
  PORVs
  Popped open safety valve
  MSR relief valve
  Steam break

Modification in progress:
  Air damper change on SI

4. Not all of the common panels / instrumentation are simulated. The facility licensee has determined through a training value assessment that these differences have minimal impact.
5. Not all of the telephone communications systems are simulated. The facility licensee has a modification in progress to correct this.
6. The saturation margin meters are not at all alike. The facility licensee has a modification in progress to correct this.
7. Annunciator 21C A1 "VCT HI-LO LEVEL" has been broken into 2 annunciators in the plant. The facility licensee has a modification in progress to correct this.


8. The four "PRZ PORV" meters are in the simulation facility but no longer exist in the plant. The facility licensee has a modification in progress to correct this.
9. The pressurizer power relief valve controls have two sets of red/green lights in the plant and only one set in the simulation facility. The facility licensee has a modification in progress to correct this.


4. CONTROL CAPABILITIES

Testing of the control capabilities was included as part of the Performance Testing. The simulation facility met all of the requirements of the Standard for this area.

5. DESIGN, UPDATING, MODIFICATION AND TESTING

This area was not included in the test of the SFEP methodology. The reasons for this are primarily logistical.

Since the Rule had not yet been implemented, the actual simulation facility performance tests that will be required by the Rule have not been run by the facility licensee, and the simulation facility configuration management program has not formally begun.


6. CONCLUSIONS, OBSERVATIONS AND RECOMMENDATIONS

The result of the pilot test conducted at the North Anna simulation facility is that the methodology given in the draft SFEP is soundly based and workable. Although the pilot test did not follow the draft SFEP exactly, all of the fundamental methodologies included in it were tested. These methods proved to be workable and resulted in the identification of features and behaviors of the simulation facility that were not in conformance with the proposed regulation.

The following observations and recommendations for possible changes to the SFEP are based on the experiences of the ISFET and the results of the pilot test of the methodology.

6.1 The SFET

The more experience each of the members of the team has with the reference plant, the better. This is especially true for the license examiner and the operations specialists.

It is very desirable to have at least one member of the team with a strong background in nuclear power plant simulator testing/evaluation.

Having someone with a strong reference plant operations background as a member of the team is a very good idea. It makes for a better test and makes test development, conduct and evaluation much more efficient.

If a reference plant operations expert is not a member of the SFET, one should be asked to review the test procedures to verify their appropriateness, verify the plant-specific information included, resolve uncertainties, and supply plant-specific information which the SFET could not supply.

The peer evaluator seems to be a good idea. The option of having such an individual as a "non-voting" member of the team is encouraged.

6.2 Performance Testing

The guidance in the SFEP for selection of ten operations for performance testing during two days of simulator use seems appropriate. Test operations should be chosen in the ratio of 1-3-1 of normal, abnormal and emergency events.

Wherever possible, performance tests should be developed to utilize existing plant data for evaluation.

For tests developed without supporting plant data, the level of development of test procedures should agree with the ability to predict responses based on knowledge of plant design and system interactions.

The level of detail provided in the performance test procedures used here (see Appendix A) seemed adequate. In general, the more detail incorporated in test procedures the better, as long as it is based on known plant responses.

Tests based on the reference plant's EOPs should be included.

Tests based on "similar plant" LERs can be difficult to use unless the similar plant is very similar. Even for similar plants it may be necessary, in some cases, to determine if the systems cited in the LER are similar. Exploration of the use of sister plants for LER tests should be pursued.

Surveillance procedures seem to make good tests. They are fairly straightforward and most of the development for them is complete. They also typically exercise many aspects of an entire system as well as some of its relationships to other systems.

The performance tests should be developed to a level at which they may be reproduced by another SFET if necessary.

They should, however, be in a form which permits easy understanding of what was done (i.e., neat, organized, references included, etc.).

Requesting a copy of the actual performance test conducted by the facility licensee gives a good indication of how thorough the testing is and how good the simulation facility is.

Simply repeating one of the performance tests which the facility licensee claims to have conducted can be a good test in some cases (particularly if the test does not look very sound on paper or if license examiners have reported problems with the operations involved). Including this type of test will help to ensure that the facility licensee's testing program is meeting the requirements of the Rule and the Standard.

Considerable evaluative information can be obtained during running of the performance tests. When expected responses are clearly known beforehand, and test procedures include verification steps for expected observations, the general quality of simulator performance is apparent by the conclusion of the test.

The need for off-site evaluation of performance test data will depend primarily on the extent to which performance tests can be developed before they are run, and on the results which are obtained in the tests. If all observations are as expected, off-site evaluation may not be necessary. In other cases further data evaluation may be required, the results declared unsatisfactory, or the facility may be asked to provide further analyses and explanation of discrepancies.

Once an on-site audit is to be conducted, the SFET should work closely with the facility licensee in the development of the performance tests. This will help to ensure that the tests are fair and reasonable, and will help avoid misunderstandings about the methodology used when the results of the test are discussed.

A facility operations expert should be present as an observer during testing and on-site data evaluation.

Questions arise during test performance and evaluation, many concerning unforeseen details of plant response, which, if answered at the time, may remove confusion, refocus observations, and enhance the acquisition of needed information and data.

It is not necessary that the operating crew for the performance tests be made up of licensed personnel.

A decision-tree type of analysis should be included in the SFEP for the evaluation of the performance tests. For example, if a parameter for a transient operation is being evaluated, the decision tree process may be as follows (a code sketch of this screening logic appears after the list):

1. Violates physical laws?

Yes - fail No - continue

2. Change in the proper direction?

No - fail Yes - continue

3. Proper relationships with other parameters?

No - fail Yes - Pass etc.
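As an illustration only, the screening logic above might be expressed as a short program. The following Python sketch is hypothetical: the function name, arguments, and pass/fail strings are illustrative assumptions, not part of the SFEP or the Standard.

    # Hypothetical sketch of the decision-tree screening described above.
    # Each question is applied in order; the first failure ends the evaluation.
    def evaluate_parameter(violates_physical_laws,
                           changes_in_proper_direction,
                           proper_relationships):
        if violates_physical_laws:            # Step 1: physical-law check
            return "fail"
        if not changes_in_proper_direction:   # Step 2: direction of change
            return "fail"
        if not proper_relationships:          # Step 3: relationships to other parameters
            return "fail"
        return "pass"

    # Example: a transient parameter that obeys physical laws, moves in the
    # proper direction, and tracks related parameters passes all screens.
    print(evaluate_parameter(False, True, True))   # prints "pass"

Additional screening questions can be appended in the same early-exit style.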

The 2%-10% criteria given for normal operations in the SFEP should be taken out. According to the Standard they apply only to steady-state operations.

The phrase "key parameters" should be eliminated and only "critical" and "non-critical" should be used.

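For illustration, a steady-state tolerance check of the kind the Standard describes might look like the sketch below. The 2% and 10% figures are those discussed above; expressing them as a fraction of the reference value, and the function itself, are assumptions made only for this example.

    # Hypothetical steady-state tolerance check: 2% for critical parameters,
    # 10% for non-critical ones. Applies only to steady-state operations.
    def within_steady_state_tolerance(simulated, reference, critical):
        tolerance = 0.02 if critical else 0.10
        return abs(simulated - reference) <= tolerance * abs(reference)

    # Example: a critical pressure reading about 0.7% from the reference passes.
    print(within_steady_state_tolerance(2235.0, 2250.0, critical=True))  # True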

6.3 Human Factors/Physical Fidelity

Examining the original HEDs and raw data is a good idea.

When identifying I&Cs from the performance tests, it may be sufficient just to identify all of the I&Cs associated with the critical parameters. Developing the performance tests to a fine level of detail solely to identify I&Cs may not be practical.

An instant camera would be an excellent way of collecting data, and for some aspects of the data collection is almost necessary. Notes should be made on the back of the photographs so that their content and meaning are apparent.

It may be helpful to reverify any problems identified as part of this portion of the review. It involves a lot of detailed data collection which can be subject to error.

6.4 Design, Updating, Modification, and Testing

The focus for this review should be shifted from how things are done to what is done and when.

The off-site review should include a request for data on reference plant modifications made during a given period, together with data for any resulting changes to the simulation facility or, alternatively, the facility licensee's reasons for not changing the simulation facility.

The time schedules for these changes/decisions should be included as well.

6.5 General

The opportunity for certain members of the SFET to visit the facility licensee in advance of the on-site review, for the purposes of establishing a rapport, examining available data and working out some of the logistics of the review, seemed to be favored by the members of the SFET and the facility licensee.

While this pilot test was completed, by and large, in accordance with the schedule described in the draft SFEP, doing so required considerable effort and long hours on the part of the ISFET and the facility licensee's staff. If a more ambitious set of performance tests is to be included in future simulation facility evaluations, consideration should be given to arranging a longer stay on-site.

Greater emphasis should be placed on the use of reference plant procedures in the simulation facility.

It may be better to refer to an "on-site" and an "off-site" review instead of the current phase designations.



7. SUPPORTING DOCUMENTATION

This section contains copies of the Performance Test Procedures for the North Anna simulation facility that were used in testing the SFEP methodology. Except for the surveillance procedure (where the plant procedure was used) and the Turkey Point LER (which was not run), the procedures for each of the performance tests are given in the following order.

NORMAL OPERATIONS

1. SURVEILLANCE OF STEAM DRIVEN AUX FEEDWATER PUMP

ABNORMAL OPERATIONS

2. 50% LOAD REJECTION - 200% PER MINUTE
3. MSIV CLOSURE - 100% POWER LER 86-006
4. DROPPED CONTROL RODS - 16% POWER LER 85-017
5. RCP TRIP - 28% POWER
6. TURBINE CONTROL MALFUNCTION LER 86-002
7. 125 VAC LOSS, STUCK MFW VALVES LER 84-019
8. INVERTER LOSS, HI P TRIP TURKEY POINT LER 85-017

EMERGENCY OPERATIONS

9. NATURAL CIRCULATION LER 85-019
10. STEAM GENERATOR TUBE RUPTURE OF INCREASING MAGNITUDE


2. 50% LOAD REJECTION - 200% PER MINUTE

I. Initial condition: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 2.

Digital: Ensure data is recorded as follows:

Every 5 seconds for first minute.

Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
• PZR heater status
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
1. RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
  • Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.
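For illustration only, the digital recording cadence specified above (every 5 seconds for the first minute, every 15 seconds until completion) can be expressed as a sampling schedule. The following Python sketch is an assumption about how such a schedule might be generated; it does not describe the facility's actual data logger.

    # Hypothetical generator for the recording cadence: 5-second samples for
    # the first 60 seconds, 15-second samples thereafter.
    def sample_times(duration_s, fast_period=5, fast_window=60, slow_period=15):
        times, t = [], 0
        while t <= duration_s:
            times.append(t)
            t += fast_period if t < fast_window else slow_period
        return times

    # A 2-minute run yields samples at 0, 5, ..., 60, then 75, 90, 105, 120.
    print(sample_times(120))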


III. Procedure

1. Position data taker at turbine bypass valves.
A. Note the elapsed time and Tave - Tref mismatch for each turbine bypass valve operation.

2. Position data taker at pressurizer heater and spray control.
A. Note pressurizer level deviation or pressure at onset and termination of heater operation.
B. Note pressurizer pressure at onset and termination of spray.

3. Panel operator inserts 200%/min load reduction to 50% power. Annunciators are not acknowledged, but can be silenced. No further operator action is taken.

Observe:

. Turbine control rapidly throttles turbine valves.
Rx does not trip
Turbine first stage impulse pressure drops
SG pressure increases (until steam dumps open)
SG level drops due to pressure increase
Steam Flow/Feed Flow mismatch causes FW flow decrease
Tave-Tref error causes steam dump valves to trip open (number appropriate to power change)

. Control rod runback initiated.
Rods move in at maximum rate (Tave-Tref and Qn-Qturb program causes max speed)
(rod motion is sequenced by auto control program)

. Increase of Tave due to steam dump response delayed.

. RCS pressure initially increases (PZR heater decrease, possible spray)
PZR level increase due to swell (PZR heaters on if 5% increase)
Charging flow decrease

. Reactor power decreases with rod insertion
Decreasing nuclear power, Tave
Rod insertion speed decreases as Tref approached
Steam dump valves modulate closed as Tref approached
SG pressure decreases towards normal
Feedwater flow decreases with steam flow

. PZR level changes with Tave decrease
PZR level decrease and potential overshoot (RCS shrink with cooling)
(Charging rate establishes level via program)
VCT level changes due to charging

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.
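As a minimal sketch of this preliminary-evaluation step, the comparison of logged simulator values with baseline plant data might be automated as below. The dictionary layout, the 5% band, and the function name are illustrative assumptions only; the SFEP leaves the comparison method to the SFET.

    # Hypothetical comparison of simulator data with baseline plant data.
    # Keys are (time in seconds, parameter name); values are readings.
    def find_discrepancies(simulator_log, baseline, tolerance=0.05):
        discrepancies = []
        for key, sim_value in simulator_log.items():
            expected = baseline.get(key)
            if expected is None:
                continue  # no plant data at this point; nothing to compare
            if abs(sim_value - expected) > tolerance * abs(expected):
                discrepancies.append((key, sim_value, expected))
        return discrepancies

    # Example: one PZR pressure point drifts well outside the band and would
    # be noted for resolution with the plant content expert.
    log = {(0, "PZR pressure"): 2250.0, (15, "PZR pressure"): 2000.0}
    base = {(0, "PZR pressure"): 2250.0, (15, "PZR pressure"): 2240.0}
    print(find_discrepancies(log, base))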


3. MSIV CLOSURE - 100% POWER LER 86-006

I. Initial Condition: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 3.

Digital: Ensure data is recorded as follows:

Every 5 seconds for first minute.

Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
1. RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
  • Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.


III. Procedure

1. Simulator instructor closes B MSIV. Panel operators carry out expected response (except for acknowledging annunciators).

Observe:

. SG B pressure spikes
SI actuation due to high steam flow coincident with low steam line pressure
Reactor trip
Turbine trip

. RCS pressure, temperature decrease per data

. SI termination, plant stabilizes

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.



4. DROPPED CONTROL RODS - AT 16% POWER LER 85-017

I. Initial condition per LER as follows:

Reactor power 16%
Rod control (manual)
Main FW control (closed)
Bypass FW control (man)
Main feed pumps: A
Condensate pumps: A,B
Steam Dump (Press mode)

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 4.

Digital: Ensure data is recorded as follows:

Every 5 seconds until 2 minutes after trip.
Every 15 seconds until completion.

PZR pressure
1. PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
• N-41,42,43,44
D rod bank position
• IRPI for dropped rods
• Rod bottom lights for dropped rods
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
1. RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
1. SG PORV status - A,B,C
1. SG safety status - A,B,C

• Denotes data not logged. Ensure data recording mechanism is in place.
1. Denotes data not expected to be critical for this test.


III. Procedure

1. Position data taker at power range meters.

A. Record upper and lower detector current on each power range channel.

B. Record upper and lower detector current on each power range channel after rod drop just prior to manual reactor trip.

2. Position data taker at rod position indication.

A. Verify IRPI for dropped group reads "0".

B. Verify rod bottom lights lit for dropped group.

3. Simulator instructor drops Group 1 of Bank D.

No operator action is taken.

Observe:

. Rod position indications

. NI power reduction

. Absence of neg. flux rate trip (rods are peripheral, between excore NIs)

. Flux distortion per NIs (imbalance and QPT changes)

(no data exist in LERs)

4. Once plant has stabilized and data in Sections III.1 and III.2 collected, panel operator manually trips the reactor and carries out expected actions.

Observe:

. Reactor trip response


IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.


5. RCP TRIP - 28% POWER

I. Initial Condition: 28% power
Rod control in manual
Turbine on line

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 5.

Digital: Ensure data is recorded as follows:

Every 5 seconds for first minute.

Every 15 seconds until completion.

PZR pressure
1. PZR PORV status
PZR level
VCT level
Charging flow
1. Letdown flow
PZR spray line T
N-45 (Nuclear Power)
• N-41,42,43,44
1. D rod bank position
1. Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
Main Steam header p
• Turbine throttle valve position
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
  1. SG PORV status - A,B,C
  1. SG safety status - A,B,C
  • Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.


III. Procedure

1. Position data taker at turbine control panel.

A. Verify turbine throttle valves open slightly during transient.

2. Position data taker at power range meters.

A. Record upper and lower detector current on each power range channel.

B. Record upper and lower detector current on each power range channel at conclusion of test.

3. Panel operator trips C loop RC Pump.

Observe:

. C loop flow decreases to 10% in 30 sec, further decreases more slowly, eventually reverses.

. RC flows increase in A and B loops

. C loop Th drops below Tc (reverse flow)

. Reduced steaming from C SG
C feedwater flow reduced by mismatch

. Increased steaming from A & B SGs
feedwater flows increased by mismatch

. C SG pressure drops to Psat for Tc
SG header pressure drops to C SG pressure (no flow)
A & B SG ps higher than C (flow resistance)
Turbine valves open further (lower header pressure)

. Tave increases initially, then recovers
Initial increase due to reduced heat transfer
NI power reduction due to MTC
Tave recovers since turbine power constant

. A and B loop delta T increases
(same heat transfer necessary)

. Core power tilt due to Temp differences



IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.




6. TURBINE CONTROL MALFUNCTION - LER 86-002

I. Initial condition: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 6.

Digital: Ensure data is recorded as follows:

Every 5 seconds for first minute.

Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
• PZR heater status
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
1. RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
• AFW pump status

• Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.





III. Procedure

1. Position data taker at turbine bypass valves.

A. Note time between initiation of malfunction and turbine bypass valve operation.

B. Note turbine bypass valve closure in controlled fashion.

C. Note time between turbine bypass valve opening and full closure.

D. Note Tave - Tref mismatch at time of turbine bypass valve full closure.

2. Position data taker at pressurizer heaters.

A. Note pressurizer level or pressure deviation at onset of heater actuation.

B. Note pressurizer level or pressure deviation at end of heater actuation.

3. Simulator instructor initiates rapid turbine governor valve closure. Operator actions limited to those necessary to fulfill immediate actions.

Observe:

. SG pressure increases
SG level drops due to bubble collapse
Reactor trip due to SG lo-lo level
Turbine trip due to reactor trip
Steam dumps open
SG PORVs open
AFW pumps start (due to low SG level)

. Tave increases
PZR level increase
PZR pressure increase
PZR spray valve actuates

. Tave decreases toward no-load value
Steam dump valves modulate closed

SG PORVs close

. PZR level decreases toward no-load value

. PZR pressure decreases
heaters energize
subsequently recovers

. SG level recovers to no-load value

. SG pressure recovers to no-load value

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.

2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.



7. 125 VAC BUS LOSS, STUCK MFW VALVES - LER 84-019

I. Initial conditions: 100% power

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 7.

Digital: Ensure data is recorded as follows:

Every 5 seconds for first minute.

Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C

  1. SG safety status - A,B,C
  • Loss of power to various components
  • Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.


III. Procedure

1. Simulator instructor deenergizes vital bus 1-III (inverter failure).

Observe:

. Rx trip due to sensed C RCP loss (C RCP does not trip)

. Feedwater valves for B SG fail closed
main (FCV-1488)
bypass (FCV-1489)

. Level indication for B SG fails low
wide range LI-1487

. Aux feedwater pump for B SG fails to auto start (1-FW-P-3B)

. B SG level drops below narrow range indication (no wide range ind. available)

. All circulating water pumps trip (water boxes' vacuum brkrs deenergized)

. Various containment isolation valves trip (incl. component cooling to RCPs)

. NI power range detector N43 deenergized

. 26 incore thermocouples lose power

. SSPS channel III input relays deenergize

. SSPS train B output relays deenergize

. Audit remaining items per attached load list

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.
V. Preliminary Evaluation
1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.



9. NATURAL CIRCULATION - LER 85-019

I. Initial Condition: 100% power, middle of life

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 9.

Digital: Ensure data is recorded as follows:

Every 15 seconds until manual trip.

Every 5 seconds for one subsequent minute.

Every 60 seconds until RCP start.
Every 15 seconds until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
Charging flow
Letdown flow
PZR spray line T
N-45 (Nuclear Power)
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
• RCS pump radial bearing temperature
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
• Spotcheck of components
  • Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.
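The multi-phase cadence above differs from the fixed two-phase schedule used in the earlier tests. For illustration, a phase-table-driven variant of the earlier sampling sketch could cover it; the event times used in the example (manual trip, RCP start, completion) are placeholders, since the real boundaries are known only at run time.

    # Hypothetical phase-driven sampling schedule. Each phase is
    # (end time in seconds, sampling period in seconds).
    def phased_sample_times(phases):
        times, t = [], 0
        for end, period in phases:
            while t <= end:
                times.append(t)
                t += period
        return times

    # Placeholder event times: manual trip at 300 s, RCP start at 2100 s
    # (30 min after trip), completion at 2400 s.
    schedule = [(300, 15), (360, 5), (2100, 60), (2400, 15)]
    print(len(phased_sample_times(schedule)))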


III. Procedure

1. Simulator instructor fails 14J-4 (480 VAC) open, then puts simulator in "freeze".

Verify the following:

A. CCW valves (106A,B,C) to RCP closed
B. Spotcheck of other equipment deenergization from attached load list.

2. Resume simulation. Panel operators commence power reduction at 4%/min. Freeze simulator at 87% power. Verify the following:

A. Elevated RCP bearing temperature (should be between 195F and 216F).

B. Increasing Tave, PZR press & level

3. Resume simulation. Panel operators perform manual reactor trip immediately and carry out expected actions (securing RCPs about two minutes later upon entry into ES-0.1).

Observe:

. Natural circulation indications develop and stabilize
RC flows decrease to 10% in 30 sec, further decrease more slowly
Tc decreases slightly, approaching SG Tsat
Th increases to 30-50F above Tc, stabilizes
Incore thermocouples track Th (560-570F)

SG pressure stable, with steaming indicated

4. Restart RCP A 30 min after Rx trip, after component cooling is reestablished.

Observe:

. Th, Tc converge (all 3 loops)

SG A pressure increase, no increase on SG B,C

. Possible PZR pressure decrease


IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.


10. STEAM GENERATOR TUBE RUPTURE

I. Initial Condition: 100% power
Intermediate range channel undercompensated

II. Data Collection Method

Identified Critical Parameters:

Analog: Mark the recorders T0 for Test 10.

Digital: Ensure data is recorded as follows:

Every 5 seconds for first minute after each modification.

Every 15 seconds for remaining time until completion.

PZR pressure
PZR PORV status
PZR level
VCT level
• RVLIS indication
Charging flow
• Charging flow through SI flowpath
• PZR heater status
PZR spray line T
N-45 (Nuclear Power)
• Intermediate range response
• Source range response
D rod bank position
Turbine first stage p - Tref
Tave - auct hi
Tcold - loop wide range - A,B,C
Thot - loop wide range - A,B,C
Incore Temperature
Subcooling Margin
RCS loop flow - A,B,C
• RCP current
Main Steam header p
SG pressure - A,B,C
SG feed flow - A,B,C
SG steam flow - A,B,C
SG level - narrow range - A,B,C
SG level - wide range - A,B,C
SG PORV status - A,B,C
SG safety status - A,B,C
• Secondary radiation alarms
• SI actuation (pump starts, valve repositions)
  • Denotes data not logged. Ensure data recording mechanism is in place.
  1. Denotes data not expected to be critical for this test.


III. Procedure

1. Simulator instructor initiates 20 gpm leak in B SG. No operator action.

Observe:

. PZR level/pressure control
CVCS auto control increases charging
VCT tank level decreases
Auto VCT makeup
Secondary radiation alarms

2. Simulator instructor increases leak rate to 50 gpm. Panel operators start additional charging pumps.
3. Simulator instructor increases leak rate to 100 gpm. No operator action other than to reestablish PZR heaters and to swap charging suction to RWST (if necessary).

Observe:

. PZR level/pressure decrease
PZR heaters on, then deenergize on low level
Letdown isolates
Charging flow to maximum, then increases as RCS pressure decreases

. PZR level/pressure "stabilize"
PZR level increases after letdown isolates
PZR pressure increases once heaters reenergize
PZR level decreases as pressure increases
PZR heaters deenergize on low PZR level
Cycle should continue

4. Simulator instructor increases leak rate to 800 gpm. Panel operators carry out expected response through EP-3, Step 10. No further operator action. Audit SI/CIA actuation during plant stabilization.

Observe:

. RPS low pressure trip (1875 psig)

Alarms and annunciators
Automatic controller response to trip



. SI initiation (1760 psig)

Various pumps start
Valves reposition
Phase A containment isolation

. RCP trip criterion met (SI flow indicated and RCS pressure < 1230 psig)
RCS flow coast down
interlocks start RCP lift and lube oil pumps
natural circulation indications develop

. B SG level increases (isolated)

B SG goes solid
B SG pressure rises to SG PORV setpoint
B SG PORV, possible safety valve actuation

. RCS pressure/level increase
RCS pressure matches B SG pressure
RCS level increases as PZR bubble condenses (inhibit heater actuation?)

5. Panel operator opens PZR PORV.

Observe:

. Rapid PZR level increase to 100%

. RCS pressure decrease
Possible accumulator discharge
RCS voiding indications
RVLIS indication (if operable)

Loss of subcooling
Increased charging flow due to lower pressure
Possible secondary-to-primary flow

6. Panel operators close PZR PORV, energize PZR heaters, start an RCP (start criteria do not have to be met), and stop charging pumps.

Observe:

. RCP start indications
Current, RCS flow
Th-Tc equalize


. RCS pressure recovers, stabilizes
Indications of bubble collapse
Eventual return of indicated PZR level

7. Verify SR detectors not energized due to IR range undercompensation. Verify that panel operator can manually energize.

IV. Data Collection

1. Record all annunciators of interest and attach to this procedure.
2. Verify standard printouts are collected, marked, and attached to this procedure.
3. Verify all other data specified in Section II is collected, marked, and attached to this procedure.

V. Preliminary Evaluation

1. Compare simulator data with plant data or expected response and note any discrepancies.
2. Attempt to resolve the discrepancies with the plant content expert. Append resolution or recommended follow-up action to this procedure.



GLOSSARY

Baseline data: Data used to evaluate the simulation facility against the reference plant.

Critical parameters: 1) Those parameters that require direct and continuous observation to operate the power plant under manual control. 2) Input parameters to plant safety systems.

HED: Human Engineering Discrepancy.

Inspection: An audit or review of the simulation facility's documentation, hardware, or performance for the purposes of determining its conformance with the requirements of 10 CFR 55.

I&Cs: Instruments and Controls.

LER: Licensee Event Report.

Perfect operator response: The actions required of an operator in conducting the performance tests are assumed to be completed without operator error for the purposes of developing the performance tests. The SFET and the operating crew will also try to achieve this during the actual conduct of the performance tests.

the Rule: 10 CFR Part 55.

SFEP: Simulation Facility Evaluation Procedure.

the Standard: ANSI/ANS 3.5, 1985.



NRC FORM 335 BIBLIOGRAPHIC DATA SHEET

Report Number: NUREG-1258 (Draft)
Title and Subtitle: Evaluation Procedure for Simulation Facilities Certified Under 10 CFR 55, Draft Report
Authors: K. Ronald Laughery, Christopher Plott, Jerry Wachtel
Date Report Completed: March 1987
Date Report Issued: March 1987
Performing Organization: Division of Human Factors Technology, Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC 20555
Sponsoring Organization: Same as above
Type of Report: Draft Technical
Period Covered: 9/85 - 3/87

Abstract

This document describes the process to be followed by the NRC for the inspection of simulation facilities certified by facility licensees in accordance with 10 CFR 55. Such inspections are divided into four major technical areas: performance testing; physical fidelity/human factors; control capabilities; and design, updating, modification and testing. Inspections will be performed by NRC staff with interdisciplinary skills including license examiner, operations specialist and human factors expert.

Inspections may consist of off-site and/or on-site phases. The off-site phase consists of an examination of simulation facility documentation and the identification of those operations that may be considered for use in on-site performance testing. In the on-site phase, the staff will work with the facility licensee to conduct a review of the four technical areas, and to evaluate the results of tests that are conducted.

Findings will be based upon the staff's judgment of the degree of compliance of the simulation facility with 10 CFR 55 in terms of its suitability for the conduct of operating examinations.

Keywords/Descriptors: Simulator, simulation facility, plant-referenced simulator, certification, performance tests, inspection, license examination, operating test.

Availability: Unlimited
Security Classification: Unclassified
