ML20138C035

Forwards SPDS Audit Rept Detailing Results of 850827-28 Audit,For Review.Audit Identified Areas Needing Mod for Compliance W/Regulatory Requirements.Respnse to Each Identified Concern Requested
Person / Time
Site: Hope Creek
Issue date: 10/15/1985
From: Butler W
Office of Nuclear Reactor Regulation
To: Mittl R
Public Service Enterprise Group
References
NUDOCS 8510220164


Text


OCT 15 1985

Docket No. 50-354

Mr. R. L. Mittl, General Manager
Nuclear Assurance and Regulation
Public Service Electric & Gas Company
P. O. Box 570, T22A
Newark, New Jersey 07101

Dear Mr. Mittl:

Subject: HOPE CREEK SAFETY PARAMETER DISPLAY SYSTEM AUDIT

Enclosed for your review is an audit report prepared by our consultant, Lawrence Livermore National Laboratory, detailing the results of the Hope Creek Safety Parameter Display System (SPDS) Audit. The audit was conducted on August 27 and 28, 1985.

The audit identified several areas where the Hope Creek SPDS needs to be modified to be in compliance with the regulatory requirements. These areas do not represent unresolvable problems, but do require further programmatic commitments from you.

Sections 5.1 and 5.2 of the enclosed report summarize our audit findings.

You are requested to respond to each of the identified concerns. Please contact us if you have any questions.

Original Signed by

Walter R. Butler, Chief
Licensing Branch No. 2
Division of Licensing

Enclosure:

As stated cc: See next page

DISTRIBUTION:
Docket File
NRC PDR
Local PDR
NSIC
LB#2 Reading
ACRS (16)
PRC System
JPartlow
BGrimes
EJordan
GLapinsky
MMcCoy
EHylton
JJoyce
BSiegel
OELD

LB#2/DL: WButler 10/15/85

8510220164 851015
PDR ADOCK 05000354
PDR

UNITED STATES
NUCLEAR REGULATORY COMMISSION
WASHINGTON, D. C. 20555


Mr. R. L. Mittl
Public Service Electric & Gas Co.
Hope Creek Generating Station

cc:

Gregory Minor
Richard Hubbard
Dale Bridenbaugh
MHB Technical Associates
1723 Hamilton Avenue, Suite K
San Jose, California 95125

Susan C. Remis
Division of Public Interest Advocacy
New Jersey State Department of the Public Advocate
Richard J. Hughes Justice Complex
CN-850
Trenton, New Jersey 08625

Troy B. Conner, Jr., Esquire
Conner & Wetterhahn
1747 Pennsylvania Avenue N.W.
Washington, D.C. 20006

Office of Legal Counsel
Department of Natural Resources and Environmental Control
89 Kings Highway
P.O. Box 1401
Dover, Delaware 19902

Richard Fryling, Jr., Esquire
Associate General Solicitor
Public Service Electric & Gas Company
P. O. Box 570 T5E
Newark, New Jersey 07101

Mr. K. W. Burrowes, Project Engineer
Bechtel Power Corporation
50 Beale Street
P. O. Box 3965
San Francisco, California 94119

Mr. J. H. Ashley
Senior Licensing Engineer
c/o Public Service Electric & Gas Co.
Bethesda Office Center, Suite 550
4520 East-West Highway
Bethesda, Maryland 20814

Resident Inspector
U.S.N.R.C.
P. O. Box 241
Hancocks Bridge, New Jersey 08038

Richard F. Engel
Deputy Attorney General
Division of Law
Environmental Protection Section
Richard J. Hughes Justice Complex
CN-112
Trenton, New Jersey 08625

Mr. A. E. Giardino
Manager - Quality Assurance E&C
Public Service Electric & Gas Co.
P. O. Box A
Hancocks Bridge, New Jersey 08038

Mr. Robert J. Touhey, Acting Director
DNREC - Division of Environmental Control
89 Kings Highway
P. O. Box 1401
Dover, Delaware 19903

Mr. Anthony J. Pietrofitta, General Manager
Power Production Engineering
Atlantic Electric
1199 Black Horse Pike
Pleasantville, New Jersey 08232

Regional Administrator, Region 1
U. S. Nuclear Regulatory Commission
631 Park Avenue
King of Prussia, Pennsylvania 19406

Mr. R. S. Salvesen
General Manager - Hope Creek Operations
Public Service Electric & Gas Co.
P.O. Box A
Hancocks Bridge, New Jersey 08038


Public Service Electric & Gas Co.
Hope Creek Generating Station

cc:

Mr. B. A. Preston
Public Service Electric & Gas Co.
Hope Creek Site MC12Y
Licensing Trailer 12LI
Foot of Buttonwood Road
Hancock's Bridge, New Jersey 08038

Ms. Rebecca Green
New Jersey Bureau of Radiation Protection
380 Scotch Road
Trenton, New Jersey 08628


AUDIT OF THE SAFETY PARAMETER DISPLAY SYSTEM
FOR
PUBLIC SERVICE ELECTRIC AND GAS COMPANY
HOPE CREEK GENERATING STATION

SEPTEMBER 20, 1985

Gary L. Johnson
Jack W. Savage

Lawrence Livermore National Laboratory
For The United States Nuclear Regulatory Commission

AUDIT OF THE SAFETY PARAMETER DISPLAY SYSTEM FOR PUBLIC SERVICE ELECTRIC AND GAS COMPANY HOPE CREEK GENERATING STATION

1.0 INTRODUCTION

On August 27 and 28, 1985, an audit of the Hope Creek Generating Station (HCGS) Safety Parameter Display System (SPDS) was conducted by the NRC. This NRC audit examined the HCGS Verification and Validation (V&V) program plan and reviewed the operation of the SPDS. Thus the audit specifically addressed the points of both a Design Verification Audit and a Design Validation Audit as described by Section 18.2 of NUREG-0800 [2]. The audit team was composed of one individual from the Nuclear Regulatory Commission Human Factors Engineering Branch and two individuals from the Lawrence Livermore National Laboratory acting as consultants to the NRC.

The audit was based upon the recommended criteria of NUREG-0800, Section 18.2. In accordance with that guidance, up to three separate audit meetings/site visits, as described below, may be arranged.

Design Verification Audit. The purpose of this audit meeting is to obtain additional information required to resolve any outstanding questions about the V&V program, to confirm that the V&V program is being correctly implemented, and to audit the results of the V&V activities to date. At this meeting, the applicant should provide a thorough description of the SPDS design process. Emphasis should be placed on how the applicant is assuring that the implemented SPDS will provide appropriate parameters, be isolated from safety systems, provide reliable and valid data, and incorporate good human engineering practice. To the extent dictated by the completeness of the V&V program plan, the HFEB reviewer will arrange for participation of PSRB and ICSB reviewers at this meeting.

Design Validation Audit. After review of all documentation, an audit may be conducted to review the as-built prototype or installed SPDS. The purpose of this audit is to assure that the results of the applicant/licensee's testing demonstrate that the SPDS meets the functional requirements of the design and to assure that the SPDS exhibits good human engineering practice.

Installation Audit. As necessary, a final audit may be conducted at the site to ascertain that the SPDS has been installed in accordance with the applicant/licensee's plan and is functioning properly. A specific concern is that the data displayed reflect the sensor signal which measures the variable displayed. This audit will be coordinated with and may be conducted by the NRC Resident Inspector.

Based on the advanced state of the HCGS SPDS design, the NRC staff carried out a combined Design Verification and Design Validation audit at the HCGS plant site and the Public Service Electric & Gas (PSE&G) training center.

During the course of this audit the NRC audit team discussed aspects of the HCGS SPDS program with PSE&G. Additionally, the HCGS control room and plant simulator were visited to ascertain the location of SPDS displays in relation to plant control boards.

2.0 SAFETY PARAMETER DISPLAY SYSTEM DESIGN OVERVIEW

The Safety Parameter Display System for the Hope Creek Generating Station is one feature of the existing plant Control Room Integrated Display System (CRIDS). CRIDS receives information from the following sources:

o The General Electric Nuclear Steam Supply System (NSSS) computer,
o Plant process instrumentation,
o The Emergency Response Facility Data Acquisition System (ERFDAS),
o The Radiation Monitoring System (RMS) computer,
o Meteorological instrumentation.

The SPDS is a set of displays available via CRIDS that provide detailed information regarding each plant safety function controlled by the HCGS functional Emergency Operating Procedures (EOPs). These functions are:

o Reactor Power Control,
o Reactor Pressure Vessel (RPV) pressure control,
o RPV water level control,
o Drywell pressure control,
o Drywell temperature control,
o Suppression Pool level control,
o Suppression Pool temperature control,
o Reactor Building temperature control,
o Reactor Building water level control,
o Reactor Building radiation control,
o Offsite radiation release control.


The values of primary SPDS parameters are continuously shown as part of all CRIDS displays to provide an overview of plant status with regard to each primary safety function.

Additional displays may be called up by the operator to provide trend information and indicate the margin between the current value of the parameter and selected EOP limit or action points.

The major hardware components of the computer systems used by SPDS are:

o CRIDS: redundant Honeywell 4500 computers;
o NSSS Computer: a single Honeywell 4500 computer;
o Process Instrumentation Inputs: seven Process Interface Units (PIUs);
o ERFDAS: a single multifunctional controller that collects data for input to the CRIDS;
o RMS Data System: PSE&G plans for this system to include redundant Digital Equipment Corporation (DEC) 11/750 computers fed by redundant DEC PDP-11/44 minicomputers acting as input controllers;
o Meteorological Monitoring Computer: a single PDP-11/23.

The above systems are considered part of the SPDS since portions of SPDS data are processed by each.

3.0 ASSESSMENT OF THE VERIFICATION AND VALIDATION PROGRAM

A Verification and Validation (V&V) Program is concerned with the process of specification, design, fabrication, testing, and installation associated with an overall system's software, hardware, and operation. For the SPDS, verification is the review of the requirements to see that the right problem is being solved and a review of the design to see that it meets the requirements. Validation is the performance of tests of the integrated hardware and software systems to see that all requirements are met.

The purpose of the NRC Design Verification Audit was to obtain information about the HCGS V&V Program, to confirm that the V&V Program is being correctly implemented, and to audit the results of the V&V activities to date. The provisions of NUREG-0737, Supplement 1 [1], and the criteria suggested in NUREG-0800, Section 18.2, Appendix A were used as the basis for this audit. NSAC/39 [3] provided additional guidance to the audit team.

Formal verification and validation was not part of PSE&G's SPDS development process. Although PSE&G believes many of the review and test steps of a V&V program were conducted during system development, these activities were neither adequately documented nor sufficiently independent to totally satisfy the intent of a V&V program as discussed in NUREG-0800 and NSAC/39. To compensate for this fact, PSE&G plans to conduct an "after-the-fact" V&V program that includes the following features:


o Review of system capabilities against the provisions of NRC guidance documents.

o Partial testing of SPDS features. This testing is intended to establish the adequacy of more complete but less independent testing that was performed during system development and installation.

o Performance validation testing at the HCGS simulator to determine if the SPDS provides displays that are effective and understandable by the operators.

PSE&G intends to ensure the independence of the V&V Program by engaging an outside contractor, Eigen Engineering, Inc. (EEI), to devise and execute a V&V Program plan [4]. It was stated that most of the work will be done on the Hope Creek simulator and some will be done in the control room. Some work has begun on the system requirements review and the design review. Simulator schedules, test criteria, and scenarios for analysis and walk/talk-throughs are anticipated to become firm in October 1985. The NRC audit team recommended that PSE&G negotiate directly with PSRB during the selection of the scenarios.

The following five phases of the V&V plan are those recommended in NSAC/39 [3]:

o System Requirements Review
o Design Review
o Performance Validation Test
o Field Verification Test
o Final Report

The remainder of this section presents the NRC Audit Team's observations and assessments of the HCGS V&V Program Plan, which were obtained through discussions and an examination of available documentation.

3.1 System Requirements Review

3.1.1 Audit Team Observations

The HCGS V&V Program is just beginning to be executed by personnel from EEI. The EEI review team appears to be adequately qualified and concerned about fulfilling the NRC V&V requirements. The EEI Program Plan describes what is planned, but there is no formal documentation describing the projected organization and implementation of the plan. Neither does there appear to be a unified HCGS document which delineates system requirements or the basis for the requirements. Instead, it is planned that the V&V program will derive requirements from Section 18.2 of NUREG-0800, NUREG-0737, NUREG-0696 [7], and NSAC/39. NUREG-0835 was listed as a source but should be omitted, as it is now incorporated as Appendix A to Section 18.2 of NUREG-0800. These system requirements will be used to develop the system requirements versus design characteristics matrix of the V&V Program.
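The requirements-versus-design-characteristics matrix described above is, in essence, a traceability table. As a purely illustrative sketch (not taken from the audit report; the requirement IDs, characteristics, and references below are hypothetical), such a matrix can be modeled as a list of rows linking each requirement to the design characteristic that satisfies it, which makes gaps in coverage mechanically detectable:

```python
# Hypothetical requirements derived from regulatory guidance documents.
requirements = {
    "R1": "Display RPV water level continuously",
    "R2": "Isolate SPDS from safety systems",
}

# Each matrix row links a requirement to a design characteristic and to
# the document establishing fulfillment (all entries illustrative).
matrix = [
    {"req": "R1",
     "characteristic": "CFPM shows RPV level on all displays",
     "reference": "design document A"},
]

def unsatisfied(requirements, matrix):
    """Return requirement IDs with no design characteristic recorded."""
    covered = {row["req"] for row in matrix}
    return sorted(set(requirements) - covered)

print(unsatisfied(requirements, matrix))  # -> ['R2']
```

The value of the matrix form is exactly this kind of systematic, exhaustive check: every requirement either has a documented design characteristic and reference, or is flagged as a discrepancy.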

The ten items listed for evaluation on page 8 of NSAC/39 are included for evaluation in the EEI V&V Program Plan. Several of the ten items in the EEI Plan refer to other sections of the V&V Plan and are expanded to describe the evaluation item review steps in greater detail (e.g., display format and content, sensor scan intervals, scale optimization, data validity, SPDS failure recognition, SPDS location criteria, reliability/maintainability, training and personnel, definition of input and output signals, definition of installation, operation, and maintenance requirements). The expanded structure alludes to NUREG-0737/NUREG-0800 guidance and acceptance criteria, but the relationships are not complete or systematically described.

3.1.2 Audit Team Assessment

NSAC/39 states that, "The system requirements are the foundation on which the completed system must be designed, built and accepted. The completed system is validated against the system requirements. The main objective of the independent system requirements review is to determine if the requirements are correct, complete, consistent, feasible, and testable."

Based on our review, the NRC Audit Team concludes that:

o EEI intends to independently identify and list the system requirements and the sources of the requirements, as described in regulatory documents.

o EEI intends to identify design characteristics and to incorporate them into the matrix which relates the design characteristics to the system requirements.

The NSAC/39 statement implies that the acceptability of a list of as-built system requirements can be determined by comparing them with a reference list of system requirements which is known to be correct, complete, consistent, feasible, and testable. PSE&G intends that the requirements identified by EEI in the matrix will be this reference list.

Since SPDS requirements were not formally established and documented during the design process, EEI intends to use an alternative way to review the system requirements: bypassing the comparison methodology described above and directly comparing the as-built equipment and system characteristics with the requirements listed by EEI in the matrix. The NRC audit team concluded that this approach will, if carefully implemented, accomplish the objectives of the system requirements review. The rationale and justification for this approach should be clearly and completely explained in the V&V final report. It is also essential that the methodology used to identify the as-built SPDS characteristics be completely and clearly explained in the report. Great care and ingenuity must be exercised in the design and construction of the matrix (or computerized equivalent) in order that the system requirements review will be systematic, accurate, and complete, and will unerringly identify and tabulate discrepancies to ensure the validity and accuracy of the V&V work.

It is essential that the simulator be an exact duplicate of the control room if it will be used for the comparison, or that any differences be identified, accounted for, and justified.

3.2 Design Verification Review

3.2.1 Audit Team Observations

The Design Verification Review is also just beginning; thus, the EEI review team's V&V execution and documentation were not yet available for the NRC Audit Team to examine. Documents available were:

o EEI SPDS V&V Program Plan (PSE-1210-01, Section IV) [4].

o Operations Engineering, Inc. (OEI) reports:

  - Display Design and Implementation (OEI 8407-1) [9].
  - SPDS Display Feature Development (OEI 8407-2) [10].
  - SPDS Display Function Descriptions (OEI 8407-3) [11].

The OEI reports describe the analyses of needs and the development of the displays chosen to be presented on the SPDS CRT screens. The Design Review sections, Part IV A and B of the EEI V&V Program Plan, are stated to consist, in part, of review and evaluation of existing HCGS system documentation and descriptions, but these are not specifically named or referenced. During the design review EEI plans to complete the requirements/design characteristics matrix by reviewing HCGS SPDS documentation and as-built system characteristics to verify that each requirement in the matrix is met. The reference used to establish fulfillment of each requirement will be documented on the matrix.

A supplemental design "walk-through" is planned to compare actual on-screen display format and content with descriptive display documentation. Deficiencies will be identified and documented for resolution and implementation of corrective actions.

Documents generated by the human factors review during the system development were not named, but should be audited by the V&V review team to ensure that identified deficiencies were properly resolved and appropriate corrective actions implemented in the design of the displays.

Since the computer systems used by SPDS were in existence prior to the genesis of SPDS requirements, PSE&G does not intend to include review of hardware capabilities in the SPDS V&V process. PSE&G believes that successful demonstration that the SPDS performs the intended functions will verify that the existing computer systems adequately support SPDS needs.


A system performance validation test defined in Section V of the Plan will assess system performance.

3.2.2 Audit Team Assessment

NSAC/39 states that, "The review of the hardware and software design is focused on determining if the design is a correct implementation of the requirements."

Many items such as system architecture, input/output interfaces, operating sequences, information flow, testability, human factors engineering, etc., are addressed in Table 2 of NSAC/39. This table includes items which might identify design requirements helpful in completing the requirements matrix of the V&V Program Plan, and in addressing the evaluation guidelines of NUREG-0800, Section 18.2. The general approach outlined for the HCGS SPDS V&V design review is acceptable. Nevertheless, since the specific steps that will be used to implement this approach have not yet been developed, the NRC audit team could not reach a conclusion regarding the expected adequacy of the V&V end product.

The design matrix (or computerized equivalent) is potentially a very complex document which must correlate all design features with system requirements.

PSE&G must assure that this complexity will be adequately addressed, and that a complete and acceptable comparison of system compliance with system requirements will be achieved.

It is important that all aspects of the plan and its execution be completely reported in sufficient detail to demonstrate to the NRC that the intent of the V&V review is accomplished.

While the NRC audit team agrees that design review of the existing computer systems which support the SPDS is unnecessary, the features of these systems that are critical to the operation of the SPDS must be documented, coordinated, and controlled to prevent the installation of future computer system modifications that could impair the operation of the SPDS. It is recommended that the V&V team verify that requirements imposed upon the existing computers by the SPDS are adequately documented and coordinated.

3.3 Validation Test

3.3.1 Audit Team Observations

Hope Creek SPDS validation is planned to include static testing and dynamic testing. Static testing is intended to validate the system's compliance with performance requirements. Dynamic testing is intended to validate that the SPDS is an effective aid to operators in responding to plant transients.


PSE&G stated during the audit that a significant amount of testing was done during SPDS development and installation to insure that the as-built system functions as intended; therefore, complete retesting of all system characteristics need not be performed during the validation phase. The previous testing was not, however, independent or sufficiently documented to completely satisfy the need for validation testing. Thus PSE&G intends to perform validation testing that demonstrates that randomly selected portions of each system design characteristic function as intended. Positive test results for each set of partial tests will be interpreted as verification that the previously conducted testing adequately validated the characteristic under test.

Dynamic testing will consist of simulator drills with HCGS operating crews responding to three plant transient scenarios. Each drill will be run through once without using the SPDS and once while using the SPDS. Operator performance will be noted during each drill, and performance with and without the SPDS will be compared. It is intended to record sufficient data to demonstrate whether the operators can determine if plant conditions warrant entry into an EOP, how the appropriate EOP is selected, and whether the transient is mitigated within an acceptable time frame. It is not clear to the NRC audit team how this will be accomplished and documented. PSE&G plans to interpret improved operator performance with the SPDS vs. without the SPDS as validation of system effectiveness.

The EEI V&V program plan states that validation will focus on demonstrating SPDS "effectiveness". Effectiveness requires "compatibility" and "understandability" which, in turn, require assurance that SPDS displays can be readily "perceived" and "comprehended" by plant operators. It is intended to confirm that the displayed variables are "sufficient" to assess the critical safety functions, and that the SPDS system is suitably isolated from other safety-related systems.

The validation testing will be coordinated with objectives and methodologies described in other sections of the V&V plan as follows:

Subject              V&V Plan Section
Sufficiency          I A, I D, II A, IV
Isolation            I B, II A, IV, II B
Effectiveness        II C
Perception           I D, II D
Comprehension        I D, II D
Compatibility        IV
Data Validity        I C
Requirements Matrix  II A, IV
Acceptance Criteria  IV
Several of the subject terms above are abstract and are not defined in the plan; the terminology should be clarified and made more concrete.


It is stated that acceptance criteria will be developed from the results of the requirements review. The static acceptance tests and criteria described in the plan will consist of a minimum of seven items to be applied depending on applicability to the specific design. The plan does not explain the methodology for selecting acceptance criteria or how it will be determined that they are comprehensive and sufficient to accomplish their goals.

3.3.2 Audit Team Assessment

NSAC/39 states, "After the system has been tested by the developer, separate V&V tests are performed to determine if the completed system meets the design requirements. Test plans and procedures are prepared prior to validation testing. Test execution and results analysis complete the validation testing activity with any identified discrepancies documented for resolution."

The plan to partially test randomly selected portions of each SPDS characteristic during the static testing is acceptable provided PSE&G verifies that the ability of the SPDS to fulfill each requirement outlined in the system requirements matrix was completely tested during the previous developmental and/or installation testing. The audit team recommends that PSE&G use the design characteristics vs. requirements matrix to document the existence of these previous tests. Whenever it cannot be determined that previous testing completely demonstrated the SPDS's ability to fulfill a system requirement, thorough and rigorous testing of that feature must be conducted as part of the validation process.
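The decision rule recommended above can be summarized in a small illustrative sketch (not from the report; the characteristic names and test records below are hypothetical): a characteristic may be spot-checked only if its earlier testing was both documented and complete, and otherwise it must be retested rigorously:

```python
# Hypothetical records of previous developmental/installation testing.
prior_tests = {
    "RPV level trend display": {"documented": True, "complete": True},
    "Class 1E isolation check": {"documented": False, "complete": False},
}

def needs_full_retest(characteristic, prior_tests):
    """Spot-checking is justified only when earlier testing was both
    documented and complete; anything else requires a full retest."""
    record = prior_tests.get(characteristic)
    return record is None or not (record["documented"] and record["complete"])

for name in prior_tests:
    action = "full retest" if needs_full_retest(name, prior_tests) else "spot check"
    print(f"{name}: {action}")
```

Recording the outcome of this rule in the requirements matrix itself, as the audit team recommends, would make the justification for each random spot check auditable.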

The audit team concluded that the general approach outlined for the dynamic testing will provide an acceptable validation of the SPDS's usefulness to the operators. In order to obtain the most benefit from these tests, timely operator feedback must be obtained. PSE&G should develop and implement a structured methodology to obtain candid opinions and recommendations about the SPDS from the operators who participate in the dynamic testing.

The intent to combine multiple failures in the dynamic validation test scenarios is appropriate. PSE&G should be careful, however, to insure that the dynamic test scenarios include events that are more severe than the FSAR design basis events.

While the approach to the V&V process is acceptable, PSE&G has not yet determined exactly how the tests will be performed to meet the validation requirements and how EEI will ensure the completeness and accuracy of the work they intend to do. Very little is said about how the identification and documentation of deficiencies and corrective actions will be made. Consequently, the NRC audit team is not certain whether the V&V plan will be acceptably executed.

This uncertainty is partly due to the use of indefinite rather than precisely stated descriptive terms in the plan (e.g., "as many as possible," "with reasonable assurance," "randomly selected," "at least one") without accompanying assurance that the test details will be completely researched, well planned, and coordinated with the requirements matrix and the auditable documentation which the NRC requires. There is minimal mention of the coordination of vendor/developer tests with on-site V&V tests, or of how the details of the identification and tracking of identified deficiencies and the selection and implementation of corrective actions will be accomplished. The EEI V&V plan does not presently mention most of the items in Tables 3 and 4 of NSAC/39.

It is recommended that these concerns be addressed in the planning, execution and reporting so that NRC reviewers will be satisfied that the V&V validation tests were adequately and completely performed.

3.4 Field Verification

3.4.1 Audit Team Observations

NSAC/39 states, "The objective of this activity is to verify that the validated system was properly installed. As a minimum, field verification consists of verifying that each input is correctly connected and that the signal range is consistent with the design."

EEI plans to review construction, installation, and test specifications of the already-installed SPDS system to ensure that sensors are correctly connected and system power supply transfer schemes are correct. Randomly selected parameters will be tested to verify that the variable being displayed is driven by the correct sensor. The design review "walk-through" will be coordinated with the verification audit of the system, including a randomly selected check of Class 1E isolation devices. Graphic displays will be reviewed to verify that the format and content of the SPDS are the same as those of the simulator. A formally documented, detailed step-by-step execution plan for V&V verification does not now exist.

3.4.2 Audit Team Assessment

The general approach to field verification testing is acceptable. However, the use of random testing must be justified for each feature on the basis that complete testing was performed earlier. The NRC audit team suggests that the design characteristics vs. requirements matrix be used to document the existence of the previous tests and to identify untested features that require thorough and rigorous field verification as part of the V&V program.

As with the validation testing, there is presently not enough information to justify an NRC audit team conclusion as to whether the execution of the plan will be acceptable to the NRC.

4.0 ASSESSMENT OF SPDS DESIGN

The NRC audit team assessed the SPDS system with respect to Supplement 1 to NUREG-0737 and the specific review criteria suggested by NUREG-0800, Section 18.2, Appendix A. This portion of the audit addressed most points of a Design Validation Audit. The following provides a discussion of the HCGS SPDS design features relative to the provisions of Supplement 1 to NUREG-0737, and the corresponding audit team assessment in each area.


4.1 "THE SPDS SHOULD PROVIDE A CONCISE DISPLAY ..."

4.1.1 Audit Team Observations

The information needed to assess the status of the reactivity control, reactor core cooling, reactor coolant system integrity, containment integrity, and radioactivity control functions is presented by a Primary SPDS overview display which shows the current value of each primary SPDS parameter overlaid upon a plant schematic. Whenever this primary display is not called up, the current value of each primary parameter is displayed in the form of a Control Function Parameter Matrix (CFPM) that is a part of all other CRIDS displays.

The CFPM is organized to be consistent with the HCGS EOP control functions rather than with the specific critical safety functions outlined by Supplement 1 to NUREG 0737. Nevertheless, the CFPM correlates with the Supplement 1 critical safety functions, as discussed in Table 2-3 of reference 9.

For each primary SPDS parameter, a second level display may be called up. These second level displays contain trend plots and considerable information relative to the specific operator action points in the EOPs.

4.1.2 Audit Team Assessment

The HCGS SPDS provides the needed concise display of the parameters presently included in the parameter set. The organization of the CFPM is logical and provides a strong tie between the SPDS and Hope Creek's functional EOPs.

4.2 "THE SPDS SHOULD ... DISPLAY ... CRITICAL PLANT VARIABLES"

4.2.1 Audit Team Observations

The parameters displayed on the SPDS overview display and on the Control Function Parameter Matrix are:

o Reactor Pressure Vessel Level,
o Reactor Pressure Vessel Pressure,
o Average Power Range Monitor,
o Drywell Pressure,
o Drywell Temperature,
o Suppression Pool Pressure,
o Suppression Pool Temperature,
o Reactor Building Temperature; all locations normal or any location above normal,


o Reactor Building Level; all locations below 1" or any location above 1",
o Reactor Building Area Radiation; normal or above normal,
o Total Radiation Release Rate at plant vents.

In addition, the secondary displays use parameters that are not on the overview or the CFPM. These parameters are:

o Suppression Chamber Temperature,
o Suppression Chamber Pressure,
o Reactor Recirculation Pump Status,
o Average Power Range Monitor Bypass Status,
o Reactor Protection System Logic Status,
o Drywell Water Level.

PSE&G has stated that this parameter set was based upon information needs identified by the EOP task analysis. NRC audit team review of the task analysis performed to identify desirable display features [10] indicated, however, that there are a number of EOP tasks that are not supported by the current SPDS parameter set. The audit team could not determine PSE&G's basis for selecting the EOP tasks to be supported by the SPDS.

4.2.2 Audit Team Assessment

A complete review of the parameters selected for display on the HCGS SPDS was not within the scope of this audit. The use of EOP task analysis as the basis for parameter selection is commendable. However, in light of the numerous EOP steps that are not supported by the SPDS, it is not clear that the parameter selection analysis has been carried to its logical conclusion. Furthermore, analysis supplementary to the EOP task analysis may be necessary to identify parameters needed to monitor critical safety functions when the plant is not in power operation.

PSE&G must verify the adequacy of the selected parameter set and provide NRC with documentation of this review. The audit team suggests that this verification include a review of EOP tasks not supported by the SPDS and documentation of the basis for omitting from the SPDS parameter set the variables associated with these tasks.


4.3 "THE SPDS SHOULD ... AID THEM (OPERATORS) IN RAPIDLY AND RELIABLY DETERMINING THE SAFETY STATUS OF THE PLANT"

4.3.1 Audit Team Observations

The HCGS SPDS provides real-time display of the magnitudes of the safety parameters input to it. The sampling rate and update interval for the magnitude display of SPDS parameters is approximately one second. Secondary display trend plots have two parts, the time-history and the current-value indication, which are updated differently. The current-value indication is a single bar at the right edge of the time-history plot. The height and color of the single bar are updated every second to correspond to the current value of the parameter. The time-history plot is updated every 15 seconds with the alarm status of the parameter.

Magnitudes of SPDS parameters are generally displayed on the CFPM to the nearest integral value. Numerical magnitudes of SPDS parameters on other SPDS displays are generally indicated to the nearest one-tenth. The time-history plots, however, can resolve the data only to a value that is equivalent to the height of a CRT character on the plot magnitude scale. As a result, the trend plot resolution is very poor. For example, RPV pressure can only be resolved to 125 psi from the trend plots. Any single point on a trend plot may be examined in more detail by manually requesting display of the numerical value of a single time segment of the trend plot.
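The character-cell resolution limit described above can be illustrated with a short arithmetic sketch. The 0-1500 psi scale span and 12-character plot height below are assumed values chosen to reproduce the 125 psi figure quoted in the text; they are not confirmed Hope Creek design data.

```python
# Illustration of trend-plot resolution limited by CRT character height.
# Assumed values: a 0-1500 psi RPV pressure scale drawn on a plot region
# 12 character cells tall (hypothetical; chosen to match the 125 psi figure).

def trend_plot_resolution(scale_span, char_rows):
    """Smallest parameter change resolvable when the plot can only
    place a point to the nearest character cell."""
    return scale_span / char_rows

rpv_span_psi = 1500.0   # assumed full-scale pressure range
plot_rows = 12          # assumed plot height in character cells

print(trend_plot_resolution(rpv_span_psi, plot_rows))  # -> 125.0 psi per cell
```

Under these assumptions, any pressure change smaller than one character cell (125 psi) is invisible on the trend plot, even though the numeric displays resolve to one-tenth of a unit.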

The SPDS displays a single validated value of each SPDS parameter. For most parameters the single value is generated by eliminating individual inputs that indicate outside of the calibrated range of the instrument channel, then averaging all remaining inputs. For sensor inputs that are not redundant, instrument operability checks replace the range checking and averaging. The following parameters do not conform to this general validation methodology.

o RPV Water Level: In addition to the range check and averaging, drywell temperature is used to eliminate instruments that are not calibrated for reference leg temperatures above the current drywell temperature. Also, fuel zone water level indication is eliminated from the average if either recirculation pump is running.

o Average Power Range Monitors: In addition to channels indicating out of range, bypassed channels are eliminated from the average.

o Suppression Pool Temperature: The suppression pool temperature channels are fed into the SPDS and into two averaging modules that are part of the plant process instrumentation system. The average values generated by these two modules are used if the averages are within range. Otherwise, the reading of each temperature sensor associated with the out-of-range average is range checked and averaged by the SPDS. It is not clear if the process instrumentation averaging modules perform range checking of their individual inputs.

o Reactor Building Area Water Levels: The input signals for area water levels come from a pair of level switches in each area; therefore, these inputs are not amenable to the standard validation process. Instead, high water level is indicated if either input is high, and low water level is indicated if both inputs are low. Any other status is indicated as unknown water level.
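The validation logic described above can be sketched as follows. This is an illustrative reconstruction from the report's description, not the actual CRIDS software; the function names and data representations are invented for the sketch.

```python
# Illustrative sketch of the SPDS data validation described in the report.
# Names and representations are hypothetical; this is not plant software.

def validate_parameter(readings, low, high):
    """General method: discard inputs indicating outside the calibrated
    range of the instrument channel, then average the remaining inputs.
    Returns None when no input passes the range check (the parameter
    is then displayed as "unknown")."""
    in_range = [r for r in readings if low <= r <= high]
    if not in_range:
        return None
    return sum(in_range) / len(in_range)

def area_water_level(switch_a, switch_b):
    """Reactor building area water level from a pair of level switches
    (each 'high', 'low', or 'failed'): high if either input is high,
    low if both inputs are low, otherwise unknown."""
    if switch_a == 'high' or switch_b == 'high':
        return 'high'
    if switch_a == 'low' and switch_b == 'low':
        return 'low'
    return 'unknown'

print(validate_parameter([500.0, 510.0, 9999.0], 0.0, 1200.0))  # -> 505.0
print(area_water_level('failed', 'low'))                        # -> unknown
```

The sketch also makes the audit team's later concern concrete: an input that fails but remains in range still enters the average unchallenged, biasing the displayed value.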

The NRC audit team reviewed the values to be used for range checking and found that some were inconsistent with the instrument ranges or with credible parameter values. In addition, the validation algorithm used for RPV water level fails to account for the fact that none of the water level instruments are calibrated to provide valid indication for all plant conditions; i.e., one set of instruments is calibrated for use when the RPV is "hot" and another set is calibrated for use when "cold".

With the exception of reactor building area water levels as noted above, the SPDS indicates that a parameter value is unknown if no inputs for that parameter satisfy the operability or range checking criteria. None of the SPDS displays alert the operator to a loss of less than all of the parameter inputs. The operator can call up a single line display at the bottom of the SPDS screen that shows if an input has been ignored in the calculation of the average. This display does not indicate how many inputs remain as the basis of the average. Furthermore, some parameters are expected to have certain out-of-range inputs under normal operating conditions; therefore, this display will always indicate failed inputs for these parameters.

Operability of the CRIDS computer is indicated by a once per second update of the current-time clock and by a flashing cursor. If there is a computer system fault, the current-time clock stops incrementing and the cursor changes color.

PSE&G is conducting an availability analysis for the CRIDS and for the SPDS functions of the CRIDS. This analysis assumes a mean time to repair (MTTR) of 1 hour for a failure of any component needed for complete operability of the SPDS. Component reliability data were obtained from the hardware vendors, MIL-HDBK-217C, and the SRS data base. Initial results indicate the SPDS is expected to be totally operable more than 99.5% of the time. NRC audit team review of the reliability analysis indicated that three systems, the EFRDAS, the NSSS Data Acquisition System, and the CRIDS Process Interface Units, each have an expected mean time to failure (MTTF) of less than one month. PSE&G stated that a complete set of spare parts will be maintained for each computer system; however, the planned maintenance staffing will not provide for continuous, on-site presence of maintenance technicians qualified to troubleshoot and repair all SPDS components.
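The sensitivity of the availability result to the MTTR assumption, which underlies the audit team's concern, can be illustrated with the standard steady-state formula A = MTTF / (MTTF + MTTR). The numbers below are round illustrative values, not the actual Hope Creek reliability data.

```python
# Steady-state availability of a repairable component: A = MTTF / (MTTF + MTTR).
# The inputs are assumed round numbers for illustration only.

def availability(mttf_hours, mttr_hours):
    """Fraction of time a repairable component is expected to be operable."""
    return mttf_hours / (mttf_hours + mttr_hours)

# A subsystem with a mean time to failure of roughly one month (720 h),
# as noted for the EFRDAS-class systems, repaired in 1 hour vs. 24 hours.
print(availability(720.0, 1.0))    # high availability with the 1-hour MTTR assumption
print(availability(720.0, 24.0))   # noticeably lower if repair actually takes a day
```

As the sketch shows, a short-MTTF subsystem meets a 99.5%-class availability target only if the 1-hour repair assumption holds, which is why the staffing plans matter.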


4.3.2 Audit Team Assessment

The HCGS SPDS does not completely satisfy the provisions of Supplement 1 to NUREG 0737 regarding rapid, reliable display. There are a number of problems, as discussed below, that lead the audit team to this conclusion.

The HCGS SPDS data validation methodology has serious shortcomings. These are:

o The SPDS does not inform the operator when some inputs have been omitted from the calculation of average parameter values.

o The display that the operator can call up to determine if inputs have been omitted from the average does not account for the fact that, with overlapping instrument ranges, some inputs will always be omitted from the average even when functioning normally.

o The operator does not have ready access to a concise display of the individual input values used to calculate the average parameter values.

o It appears that the use of process instrumentation averaging modules to develop average suppression pool temperature values does not provide for range checking of individual instrument channels.

o The validation algorithm does not provide for analysis of, or notification of the operator about, in-range instrument readings that are inconsistent with other inputs.

o Some of the values used in range checking of parameter inputs are unrealistic.

o No provisions are made for selecting between "hot calibrated" and "cold calibrated" level instruments as appropriate for plant conditions.

In addition to the data validation problems, the update interval and the resolution of the secondary display trend plots are inadequate.

Certain portions of the SPDS hardware are expected to have mean times to failure that are quite short. Although the availability analysis discussed at the audit shows an acceptable SPDS availability, the audit team believes the numerical results are based upon a mean time to repair assumption that is inconsistent with maintenance staffing plans.

PSE&G must correct the above mentioned deficiencies in the data validation methodology and must improve the display resolution of the trend plots. A discussion of the system improvements in this regard shall be provided for NRC review. Interim use of the existing system is acceptable provided the plant operators are apprised of the above shortcomings during training and provided that PSE&G establishes a correction schedule that is acceptable to the NRC.


The reliability expectations for SPDS hardware should be reviewed to determine if hardware or planned maintenance staffing levels require modification to assure acceptable system availability. PSE&G should also collect SPDS operating history data and use this information to evaluate the actual reliability of the system and its components. This evaluation should separately identify the unavailability contributions from both hardware and software. Failure data from this history may be used to focus efforts for system reliability improvement and maintenance training.

4.4 "THE PRINCIPAL PURPOSE AND FUNCTION OF THE SPDS IS TO AID THE CONTROL ROOM PERSONNEL DURING ABNORMAL AND EMERGENCY CONDITIONS IN DETERMINING THE SAFETY STATUS OF THE PLANT AND IN ASSESSING WHETHER ABNORMAL CONDITIONS WARRANT CORRECTIVE ACTIONS BY CONTROL ROOM OPERATORS TO AVOID A DEGRADED CORE."

4.4.1 Audit Team Observations

As discussed above, the HCGS SPDS displays magnitude and trends of the SPDS parameters. The values of instrumentation channels related to a parameter are synthesized into a single display of that parameter. Perceptual cues to abnormal conditions are provided by reverse video coloration of the abnormal value. Color changes are also provided on trend plot segments that represent an abnormal value.

In addition to magnitude and trend information the SPDS provides the operator with information regarding selected EOP limit points and operator action points. This information is contained on the secondary displays and is presented in terms of the numerical margin between the parameter and the limit or action point. Color changes are used to alert the operator when the parameter has exceeded each margin or action point. This technique is applied to comparison of parameters with curves from the EOPs as well as comparison of parameters with single valued action points. The second level SPDS also keeps track of the total time that the suppression pool is above 95 degrees Fahrenheit to aid in the determination of suppression pool heat capacity, and identifies which systems have sufficient pump head to add water to the RPV at any given time.
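The margin presentation and the suppression pool timer described above can be sketched as follows. This is an illustrative reconstruction of the described behavior, not the actual display software; the function names and sample values are invented.

```python
# Illustrative sketch of the second-level display aids described in the
# report. All names and sample values are hypothetical.

def margin_to_action_point(value, action_point):
    """Numerical margin between a parameter and an EOP limit or
    operator action point, as presented on the second level displays."""
    return action_point - value

def time_above_limit(samples, limit_deg_f, dt_seconds):
    """Total time the suppression pool has spent above the limit
    (95 degrees F in the report), given periodic temperature samples
    taken every dt_seconds."""
    return sum(dt_seconds for t in samples if t > limit_deg_f)

# Example: pool temperature sampled once per second for six seconds.
temps = [93.0, 94.5, 95.5, 96.0, 95.2, 94.8]
print(time_above_limit(temps, 95.0, 1.0))  # -> 3.0 seconds above 95 F
print(margin_to_action_point(90.0, 110.0))  # -> 20.0 (margin remaining)
```

A color change at the point where the margin reaches zero corresponds to the report's description of alerting the operator when a parameter exceeds a limit or action point.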

4.4.2 Audit Team Assessment

The HCGS SPDS displays reflect considerable thought about operator information needs and the formats that will most effectively fulfill these needs.

Consequently, the audit team concludes that the Hope Creek SPDS thoroughly satisfies the operator aid provision of Supplement 1 to NUREG-0737.


4.5 "(THE) SPDS (SHALL BE) LOCATED CONVENIENT TO THE CONTROL ROOM OPERATORS"

4.5.1 Audit Team Observations

Any SPDS display can be called up on any of the CRIDS CRTs. A CRT is easily visible from every control console and main vertical control board in the control room. Additional CRTs are provided at the chief operator's console and in the shift supervisor's office. All CRTs are capable of independently showing any selected SPDS or other CRIDS display.

4.5.2 Audit Team Assessment

The HCGS clearly fulfills the provisions of Supplement 1 to NUREG 0737 regarding convenient location.

4.6 "THE SPDS SHALL CONTINUOUSLY DISPLAY INFORMATION FROM WHICH THE SAFETY STATUS OF THE PLANT ... CAN BE ASSESSED..."

4.6.1 Audit Team Observations

The Control Function Parameter Matrix is displayed on every control room CRIDS CRT that is not displaying the primary SPDS display.

4.6.2 Audit Team Assessment

The Hope Creek SPDS satisfies the continuous display provision of Supplement 1 to NUREG 0737.


4.7 "THE SPDS SHALL BE SUITABLY ISOLATED FROM ELECTRICAL OR ELECTRONIC INTERFERENCE WITH EQUIPMENT AND SENSORS THAT ARE IN USE FOR SAFETY SYSTEMS"

4.7.1 Audit Team Observations

PSE&G has indicated that Class 1E isolation devices are used at each interface between Class 1E systems and the SPDS. Type test data for the specific isolation devices used is being separately provided to NRC.

4.7.2 Audit Team Assessment

Detailed review of the isolation provisions was not within the scope of this audit.
[

i

i

)

i

~

l 1

l . -- . , - - . . _ . . , -. --.m. - , . , , . . - , . - , . - . . - - , , , - _ _ - . _ , . , . . - . , , . . . . _ , - , , . . . . _ . - . . - _ . - . __,s.---

_ ..,_ . ~ _.. - - ..

I

[ . . .  :

e I

i l 4.8 " PROCEDURES WHICH DESCRIBE THE TIMELY AND CORRECT SAFETY STATUS ASSESSMENT WHEN THE SPDS IS AND IS NOT AVAILABLE WILL BE DEVELOPED BY THE LICENSEE IN PARALLEL WITH THE SPDS. FURTHERMORE, OPERATORS SHOULD BE TRAINED TO RESPOND TO ACCIDENT CONDITIONS BOTH WITH AND WITHOUT THE SPDS AVAILABLE."

, t

! 4.8.1 Audit Team Observations .

l .

! PSE&G does not intend to have separate safety status assessment procedures for j plant operation with and without SPDS available. Instead the HCGS procedures  !

. identify the important safety parameters to be monitored and the operators (

l will be trained to perform this monitoring function both with and without the SPDS. This training will be incorporated into the operator qualification and requalification training. PSE&G has not yet developed procedures to insure 4

that consistency is maintained between the SPDS and HCGS procedures.

t l 4.8.2 Audit Team Assessment

$ PSE&G provisions for _ training appear to fulfill the provisions of Supplement 1 l to NUREG 0737 in this regard. Formal controls must be established to ensure -

that the SPDS, SPDS training and HCGS procedures remain autually compatible. f

. In addition, plant operation without SPDS should be a scenario included in the f j simulator training program.

4.9 'THE SPDS DISPLAY SHALL BE DESIGNED TO INCORPORATE ACCEPTED NUMAN FACTORS PRINCIPLES SO THAT THE DISPLAYED INFORMATION CAN BE READILY PERCEIVED AND t COMPREHENDED BY SPDS USERS." [

i l 1 4.9.1 Audit Team Observations ,

i

} The MCGS SPDS display designs result from a task analysis based upon the plant l

! specific E0Ps. This task analysis identified operator information needs for [

l each control function and, for each need supported by the SPDS parameter set, i identified a number of candidate displays that would provide the needed j information in a readily usable format. These candidate displays were i

reviewed to select the final displays and the displays were grouped together l on the secondary displays related to the applicable control function. Final

! display formats were developed with consideration to the human factors design j criteria developed as part of the Detailed Control Room Design Review (DCRDR)

and EPRI NP-3701, " Computer-Generated Display Systen Guidelines, Volume It j Display Design".

SPDS displays on the plant control boards are addressed by a small, calculator-like keyboard located near each CRT. The SPDS overview display is l called by pressing a single key on the keyboard. The second level displays I may be called up from either the overview display or the CFPM display by

! placing the cursor over the parameter desired and pressing an execute key.

! SPDS displays away from the main control boards have typewriter style l keyboards that allow more flexibility in the use of CRIDS. Access to SPDS l displays using these keyboards is achieved with function keys that are similar

! in function to those on the control board keyboards, t

j L ,, ,. . .

i The SPDS was also reviewed for conformance with the guidance of NUREG-0700 during the Hope Creek DCRDR. Human factors deficiencies noted during this

review are being resolved as part of the DCRDR process.

The NRC audit team observed the SPDS displays as the system was being driven by data from the HCGS simulator during an operator training exercise. Aside from HEDs associated with the problems noted in section 4.3, no other significant HEDs were noted.

4.9.2 Audit Team Assessment ,

> The HCGS SPDS satisfies the provisions of Supplement 1 to NUREG-0737 regarding  ;

incorporation of human factors principles. {

i 5.0

SUMMARY

5.1 VERIFICATION AND VALIDATION PROGRAM The concept and approach of the PSE&G/EEI V&V Program Plan is generally acceptable. However, the plan is not entirely adequate as based on the l

recommendation of NSAC/39. The plan and program lack sufficient formal documentation to justif y sn NRC conclusion whether the execution of the plan will be acceptable to the NhC.

Our concerns are:

o The plan uses certain abstract and non-specific terms to describe some V&V tasks. Before execution, the tasks must be researched. l well planned, coordinated, and documented to ensure that their t execution will result in an acceptable and auditable V&V effort. j o The use in the V&V tests and audits of randomly selected channels  ;

and equipment is acceptable only if it is ensured that previously l l documented information exists to justify the use of random t selections to demonstrate valid V&V for all SPDS items.

l {

i l o The V&V process should be auditably documented to describe ard l demonstrate that the matrix and the execution of the V&V j methodologies will compare requirements with characteristics of -

^

equipment and procedures and identify and correct all significant l discrepancies. l o PSE&G aust ensure the methodology used for selecting acceptance [

criteria will give results that are accurate, comprehensive and ,

i complete, and will satisfy the intent of the V&V requirements. [

l Furthermore, the audit team has a number of recommendations for improving the planned V&V program: i P

' ~

o The constraints imposed by the SPDS upon the design features of supporting computers should be documented.

o Performance validation testing scenarios should include events that are outside of the scope of FASR Chapter 15 evente, o Performance validation testing should include a structured methodology for obtaining candid feedback about the SPDS from operators wh6 participate in the testing.

5.2 SPDS DESIGN The audit team concluded that the Hope Creek SPDS reflects a strong intent to provide an effective tool for the operators to use in monitoring plant safety status and in responding to plant transient conditions. A number of deficiencies were noted, however, that prevent the system, as currently implemented, from totally satisfying all of the provisions or Supplement 1 to NUREG 0737. These shortcocings are listed below.

o The SPDS does not automatically indicate when some inputs have been eliminated from the calculation of an average parameter value.

Furthermore, the operator cannot obtain unambiguous information regarding whether any input is outside of the expected range for the current operating conditions. Ideally the SPDS should automatically indicate if any input instrument is reading outside of its expected range and the operator should have ready access to a concise display of raw input data for each parameter so that the effect of individual instrument failures may be assessed.

o The data validation algoritha does not resolve in-range instrument readings that are inconsistent with other inputs.. Thus, failed in-range instruments may significantly bias average parameter values displayed by the SPDS.

o Some of the range limits used by the data validation algoriths are unrealistic.

o The data validation algoriths makes no provisions for removing RPV level instruments that are not calibrated for the current plant conditions from the calculation of RPV average level.

l o The update interval for parameter time history plots is too long.

! o The parameter magnitude resolution of the time-history plots is

insufficient.

I Although the MCGS SPDS may be used as is on an interia basis, ultimate

! acceptability of the system will depend upon timely and acceptable resolution of these shortcomings.

l

-21~

~

4 PSE&G aust also implement processes thats o Maintain consistency between the Emergency Operating Procedures, operator training and the SPDS.

o Document SPDS operating experience in order to establish the actual reliability of SPDS hardware and software during operation and to focus the application of resources if improvements in SPDS reliability prove necessary.

Finally, the audit team suggests that PSE&G consider the following NRC audit team concerns witP the SPDS design process.

o It .is not clear that the task analysis used to select the SPDS parhaeter set has been completed. PSE&G should review the tasks that are not supported by the SPDS parameter set and ensure that a justifiable basis for not including the parameters needed for these j steps exists and is documented.

The SPDS parameter set selection methodology does not appear to have 3

o given sufficient consideration to monitoring safety function status during operating modes other than full power operation. PSE&G aust verify that the SPDS parameter set is sufficient to monitor safety

function status during all applicable modes of operation.

' o The mean-time-to-repair assumptions used in the SPDS availability l analysis are not consistent with plant maintenance staffing plans.

l PSE&G should. review the effect of more realistic MTTR assumptions on I the availability' calculation and determine if additional actions are needed to achieve acceptable availability.

i r

'l 1

1 l-

.w. . :. < m -

.~ ~- - - - - -

.~- --

'A *

~! . . ,

4:

6.0 REFERENCES

i-a  !

1. U. S. Nuclear Regulatory Commission, NUREG-0737, " Clarification of ,

TMI Action Plan Requirements," No, ember 1980, Supplement 1, December

  • l 1982.

5

2. U. S. Nuclear Regulatory Commission, NUREG-0800, " Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants," Section 18.1, Control Room, Rev. 0 - September 1984 and -

-

~

i ,

3. Verification and validation for Safety rarameter Display Systems, NSAC/39, Science Applications, Ir.c., (December 1981). l 4 Hope Creek Safety Parameter Display System Verification and ,

Validation Program Plan, Eigen Engineering, Inc., August 26, 1985. i

5. U. S. Nuclear Regulatory Commission, NUREG-0700, " Guidelines for Control Room Design Review," September 1981.  !
6. U. S. Nuclear Regulatory Commission, NUREG-0835', " Human Factors  !

Acceptance Criteria for the Safety Parameter Display System" (

7. U. S. Nuclear Regulatory Commission,1NUREG-0696, " Functional Criteria .

for Emergency Response Facilities," February 1981.

8. " Instrumentation for Light-Water Cooled Nuclear Power Plants to  !

Assess Plant and Environs During and Following an Accident",  !

Regulatory Guide 1.97, Rev. 2, Nuclear Regulatory Commission, Office  !

e of Standards Development (December 1980).

i

9. " Safety Analysis for Hope Creek Generating Station Safety Parameter U Display System, Display Design and Implementation", OEI Document  !

8407-1, Rev.1, March 1985. j

10. " Hope Creek Generating Station SPDS Display Feature Development",

OEI Document 8407-2, Draft Revision C, August 1985. >

I

! 11. " Hope Creek Generating Station SPDS Display Functional Descriptions",

OEI Document 8407-3, Draft Revision C, August 1985.  :

i r

e 4

t i r t

t I

I 1

l r

h

_ _ _ _ _ ._ _, _ _ _ . _ - ~ . _ _ _ .