ML20137Z317

Design Verification & Design Validation Audit of SPDS for Louisiana Power & Light Waterford Unit 3
Site: Waterford
Issue date: 10/21/1985
From: Gary L. Johnson and James W. Moore, Lawrence Livermore National Laboratory
To: NRC
Shared Package: ML20137Z214



ENCLOSURE

DESIGN VERIFICATION AND DESIGN VALIDATION AUDIT OF THE SAFETY PARAMETER DISPLAY SYSTEM FOR LOUISIANA POWER AND LIGHT WATERFORD UNIT 3

October 21, 1985

Gary L. Johnson
James W. Moore

Lawrence Livermore National Laboratory
For the United States Nuclear Regulatory Commission

1.0 INTRODUCTION

On September 25 through 27, 1985, an audit of the Waterford Unit 3 Safety Parameter Display System (SPDS) was conducted by the NRC. This NRC audit examined the Waterford 3 verification and validation program and reviewed the operation of the SPDS. Thus the audit specifically addressed the points of both a Design Verification Audit and a Design Validation Audit as described by Section 18.2 of NUREG 0800 [2]. The audit team was composed of one individual from the Nuclear Regulatory Commission Human Factors Engineering Branch and two individuals from the Lawrence Livermore National Laboratory acting as consultants to the NRC.

The audit was based upon the recommended criteria of NUREG 0800 Section 18.2. In accordance with that guidance, up to three separate audit meetings/site visits, as described below, may be arranged.

Design Verification Audit. The purpose of this audit meeting is to obtain additional information required to resolve any outstanding questions about the V&V program, to confirm that the V&V program is being correctly implemented, and to audit the results of the V&V activities to date. At this meeting, the applicant should provide a thorough description of the SPDS design process. Emphasis should be placed on how the applicant is assuring that the implemented SPDS will provide appropriate parameters, be isolated from safety systems, provide reliable and valid data, and incorporate good human engineering practice. To the extent dictated by the completeness of the V&V program plan, the HFEB reviewer will arrange for participation of PSRB and ICSB reviewers at this meeting.

Design Validation Audit. After review of all documentation, an audit may be conducted to review the as-built prototype or installed SPDS. The purpose of this audit is to assure that the results of the applicant/licensee's testing demonstrate that the SPDS meets the functional requirements of the design and to assure that the SPDS exhibits good human engineering practice.

Installation Audit. As necessary, a final audit may be conducted at the site to ascertain that the SPDS has been installed in accordance with the applicant/licensee's plan and is functioning properly. A specific concern is that the data displayed reflect the sensor signal which measures the variable displayed. This audit will be coordinated with and may be conducted by the NRC Resident Inspector.


Based on the advanced state of the Waterford 3 SPDS design, the NRC staff carried out a combined Design Verification and Design Validation audit at the plant site.

During the course of this audit the NRC audit team discussed aspects of the Waterford 3 SPDS program with Louisiana Power and Light (LP&L). Additionally, the Waterford 3 control room was visited to ascertain the location of SPDS displays in relation to plant control boards.

2.0 SAFETY PARAMETER DISPLAY SYSTEM DESIGN OVERVIEW

Safety Parameter Display System information in the Waterford 3 control room is displayed on a pair of dedicated CRTs. The SPDS displays are developed from plant parameter information in the Plant Monitoring Computer (PMC) data base by the SPDS preprocessor/display generator computer. The data in the PMC data base is from one of three sources:

o Directly from plant instrumentation via PMC multiplexers.

o The Combustion Engineering (CE) "Qualified SPDS" (QSPDS), which displays information regarding the status of primary cooling system parameters.

o The Radiation Monitoring System (RMS).

Conversion of input voltage values to engineering units and data validation checks for information obtained directly from plant instrumentation are performed by the PMC. The QSPDS and the RMS perform units conversions and data validation for the parameters that they process and pass the converted values, along with data quality tags, to the PMC. Variables that are monitored by redundant instrumentation are displayed as single values on many of the SPDS displays. These single values may be the maximum, minimum, or average of valid input values, depending on the designer's judgment regarding which value is most useful to the plant operator. Generally the synthesized value is calculated by the SPDS preprocessor/display generator using data that is flagged as "good" in the data base. In some cases, however, the synthesized value is calculated by the QSPDS and transmitted to the PMC.
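The reduction of redundant, quality-tagged channel readings to a single displayed value can be illustrated with a brief sketch. This is an illustration only; the function name, tag strings, and example readings are assumptions made for this report and do not represent LP&L's actual PMC or preprocessor/display generator software.

    from statistics import mean

    def synthesize_display_value(inputs, method="average"):
        """Reduce redundant channel readings to one displayed value.

        inputs is a list of (value, quality_tag) pairs as they might appear
        in the PMC data base; only readings tagged "good" are used.
        """
        good = [value for value, tag in inputs if tag == "good"]
        if not good:
            return None          # no usable data; the display field shows asterisks
        if method == "maximum":
            return max(good)
        if method == "minimum":
            return min(good)
        return mean(good)        # default: average of the valid inputs

    # Example: three redundant hot leg temperature channels, one flagged "bad"
    channels = [(592.4, "good"), (593.1, "good"), (120.0, "bad")]
    print(synthesize_display_value(channels))            # 592.75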

Most of the hardware and much of the software that is used to perform Waterford's SPDS function was developed as part of the PMC and predates the requirement for a SPDS. After NRC requirements for the SPDS were developed, LP&L developed SPDS display formats to run on the PMC, installed dedicated SPDS terminals in the control room, and added an extra CPU to the PMC in order to improve the access time for SPDS displays.


3.0 ASSESSMENT OF THE VERIFICATION AND VALIDATION PROGRAM

3.1 SYSTEM REQUIREMENTS REVIEW

3.1.1 Audit Team Observations

Several SPDS functional requirements documents were developed by LP&L [9, 16, 19]. The audit team determined, however, that these documents contained few technical requirements and mostly pertained to the preprocessor/display generator portion of the SPDS. The preprocessor/display generator only formats information for display on the SPDS CRTs and calculates maximum, minimum, and average values of parameters. The more critical functions of data validation, data scanning, and units conversion are performed by the QSPDS, the RMS, and the main portion of the PMC. Consequently, it appears that no formal documented requirements exist for these SPDS functions.

Functional requirements for the QSPDS were developed by CE in "Functional Design Description for the Qualified Safety Parameter Display System for Waterford Steam Electric Station, Unit Number 3" [26]. The functional requirements contained in this document were not sufficiently specific to form a basis for audit of QSPDS factory validation and LP&L field verification testing. Additionally, a description of and documentation from any CE functional requirements verification activity for the QSPDS was not available for audit.

LP&L did perform a SPDS system review against the overall provisions of Supplement 1 to NUREG-0737 and documented this review in "Safety Parameter Display System (SPDS), Waterford SES Unit 3, Louisiana Power and Light" [22]. Both the scope of and conclusions from this review are somewhat vague. Further, the results of LP&L's overall review differ significantly from the audit team review findings as discussed in Section 4 of this report.

3.1.2 Audit Team Assessment

LP&L has not addressed the intent of NUREG-0800 Section 18.2 or NSAC/39 regarding independent verification of system requirements. Since LP&L did not clearly define overall functional requirements for the SPDS as a whole, a verification review of functional requirements versus regulatory requirements and operator needs could not be performed for the portion of the SPDS that was developed by LP&L. Functional requirements for the QSPDS may have been documented as part of that system's development; however, any verification reviews that were performed as part of the development of the QSPDS were not available.

LP&L attempted to fulfill the intent of a system requirements review by evaluating as-built characteristics against the provisions of Supplement 1 to NUREG-0737. The NRC audit team concluded, however, that this review is both inaccurate and unacceptably limited in scope.


3.2 DESIGN VERIFICATION REVIEW

3.2.1 Audit Team Observations

Design specifications [10, 18, 19] were prepared and independent reviews were conducted as part of the development of the SPDS preprocessor/display generator. As discussed above, however, many important SPDS functions are performed by other portions of the system. Detailed technical design specifications and independent verification reviews of these specifications do not exist for the LP&L portions of the SPDS other than the preprocessor/display generator. Documentation of verification reviews that might have been conducted as part of CE's development of the QSPDS was not available for review.

3.2.2 Audit Team Assessment

LP&L failed to address the provisions of NUREG-0800 Section 18.2 and NSAC/39 regarding verification of design.

3.3 VALIDATION TESTS

3.3.1 Audit Team Observations

Functional testing of the QSPDS and of the preprocessor/display generator software was conducted as part of system development. Additionally, extensive testing of SPDS functions was conducted as part of the field verification testing conducted after installation in the plant [15]. This testing was rigidly controlled and thoroughly documented in accordance with procedures governing the Waterford 3 startup test program.

In order to evaluate the effectiveness of the SPDS as an operator aid, LP&L is planning to conduct man-in-the-loop system validation testing. This testing will consist of operator reviews of the SPDS displays while they are being driven by dynamic data simulating two selected design basis transients [23, 24]. Two plant operators will independently view each transient twice.

On the first run of each transient the operators will not know what transient is being simulated and will be asked to provide a running description of plant safety status as determined from the SPDS. After the first run the operators will complete a questionnaire that provides their assessment regarding the SPDS usefulness in rapid and reliable evaluation of plant critical safety functions. On the second run of each transient, the operator will know what event is being simulated and each operator will be asked to review the organization of displays. After this run the operators will complete a questionnaire regarding the organization of SPDS displays.

The questionnaires to be completed in both cases consist of a short checklist regarding the features to be evaluated by the test, and three essay response questions about features most liked, features most disliked, and recommended changes to the system.


3.3.2 Audit Team Assessment

Extensive system testing was conducted as part of field acceptance tests. Nevertheless, due to the lack of documented system functional requirements, the audit team could not establish the relevance of test acceptance criteria to system requirements. Consequently the audit team could not determine if the field acceptance testing provided adequate validation of hardware or software performance.

The development of a SPDS validation test that solicits operator feedback regarding the usefulness and organization of the SPDS is a positive feature.

There are, however, a number of shortcomings with the planned testing that limit its usefulness.

o The number of operators to be included in the validation testing is too small to provide an accurate indication of the usability of the SPDS by the Waterford 3 operating staff.

o The operator questionnaires to be used are brief and only minimally prompt the operators for feedback.

o The transients to be evaluated are quite limited and do not include consideration of SPDS use in evaluating events that are more severe than the plant design basis.

o The planned testing does not evaluate the effectiveness of the SPDS as an integrated part of the plant control room.

o The goals of the validation program, especially the observations made during the second run through of each transient, are not well defined.

The audit team recommends the following actions to compensate for these deficiencies in the validation testing:

o The operator questionnaires should be improved.

o Feedback should be solicited from a much larger sample of operators.

o A much more extensive validation program that addresses the above points should be conducted using the plant simulator when it becomes available.

3.4 FIELD VERIFICATION TESTS

3.4.1 Audit Team Observations

As discussed above, extensive field verification testing of the SPDS was conducted as part of plant startup testing. These tests had clearly defined acceptance criteria, required careful documentation of test results, and included reviews of test results to identify any acceptance criteria that were not totally met by the system. The test deficiency logs, procedure change documents, and documentation of resolution of test deficiencies provided evidence that the testing was carefully controlled and that deficiencies identified by the tests were tracked to resolution.

3.4.2 Audit Team Assessment

Although the relevance of field acceptance testing to SPDS functional requirements could not be determined by the NRC audit team, the field testing did appear to provide a rigorous checkout of system operability and field installation.

4.0 ASSESSMENT OF SPDS DESIGN

The NRC audit team assessed the SPDS with respect to Supplement 1 to NUREG 0737 and the specific review criteria suggested by NUREG 0800, Section 18.2, Appendix A. This portion of the audit addressed the points of a Design Validation Audit. The following provides a discussion of the Waterford 3 SPDS design features relative to the provisions of Supplement 1 to NUREG 0737, and the corresponding audit team assessment in each area.

4.1 "THE SPDS SHOULD PROVIDE A CONCISE DISPLAY ..."

4.1.1 Audit Team Observations

Many of the plant parameters needed to assess the status of the reactivity control, reactor core cooling, reactor coolant system integrity, containment integrity, and radioactivity control safety functions are presented on a SPDS overview display. However, in order to determine the status of these five safety functions, the operator must access five screens of data. Furthermore, an additional eight screens of data must be viewed to obtain trending information for these parameters. There are no provisions for alerting the operator to a change in safety status due to a change in the value of a parameter not shown on the current display.

The audit team observed that it takes about 20 seconds to change from one display to another. Therefore, reviewing the overall plant critical safety function status takes several minutes and requires considerable operator attention to obtain displays.
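A simple calculation using the observed display switching time (and ignoring the time the operator spends reading each display, which only adds to the total) illustrates why the review takes several minutes:

    # Display switching time observed by the audit team, applied to the
    # five status screens and eight trending screens cited above.
    switch_time_s = 20
    status_screens = 5
    trend_screens = 8

    total_s = (status_screens + trend_screens) * switch_time_s
    print(f"{total_s} s of display switching alone ({total_s / 60:.1f} minutes)")
    # 260 s, about 4.3 minutes, before any time spent reading the displays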

4.1.2 Audit Team Assessment

Status information for all important SPDS parameters is not available on a single display, and obtaining all the displays that contain the needed information is both awkward and time consuming. Therefore, the Waterford 3 SPDS does not satisfy the provisions of Supplement 1 to NUREG 0737 regarding concise display.

d 4.2 "THE SPDS SHOULD . . . DISPLAY ... CRITICAL PLANT VARIABLES" 4 . 2.1 Audit Team Observations The Waterf ord SPDS has an extensive parameter list. The parameters that LP&L considers to be needed to fulfill the SPDS function are listed in Table 1.

The SPDS can also display many other parameters that are not included in this list.

4.2.2 Audit Team Assessment

A complete review of the parameters selected for display on the Waterford 3 SPDS was not within the scope of this audit. To some extent the difficulty in obtaining SPDS parameter information results from the inclusion of many unnecessary parameters on the SPDS displays. While the audit team does not disagree with the concept of using the SPDS to display information that is not directly related to the status of critical safety functions, this information should be organized such that it does not interfere with the primary function of the SPDS. One way to achieve this goal would be to create a few separate displays that contain all the required information pertaining to SPDS parameters but contain only SPDS information.

4.3 "THE SPDS SHOULD ... AID THEM (OPERATORS) IN RAPIDLY AND RELIABLY DETERMINING THE SAFETY STATUS OF THE PLANT" 4.31 Audit Team Observations The Waterford 3 SPDS displays parameter information in several forms, o Individual instrument channel readings. In some cases each of several i

redundant channel readings are shown on top level displays, o Averages of redundant ' inputs.

o The maximum value of redundant inputs.

o The sinimum value of redundant inputs.

The specific form of data display used for a particular input was based upon the designers' judgment about what form would be most useful to the operator . Whether a specific value displayed is an individual input, a maximum, a minimum, or an average is not clearly indicated on the displays.

Trending information is displayed as the numerical value of the parameter rate of change along with a dynamically updated bar that indicates the current relative magnitude of the parameter. -

Data validation in the PMC is conducted by first verifying, for each input, the integrity of the field multiplexers and the communications channel between the multiplexer and the computer. Data arriving via malfunctioning multiplexers or communications links are eliminated. The value of each remaining input is compared to the calibrated range of the input instrument channel, and a "bad" data flag is set if the value is out of range. The inputs are, in some cases, also compared against expected parameter ranges, and an "invalid" data flag is set if the parameter is out of the expected range. LP&L stated that the expected range limits are based upon normal operating conditions. "Bad" data are not used in the calculation of parameter averages, maximums, or minimums. If only "bad" data are available for a given SPDS display field, then a series of asterisks appears in the field. At the present time this is also true for "invalid" data; however, LP&L is implementing modifications that will result in "invalid" data being displayed as questionable. For some single input parameters, data validation also includes status monitoring of the input instrumentation.

The audit team examined the limits used by the PMC in identifying "bad" data. In some cases, particularly temperature instrumentation lower limits, the limits of instrument range are well outside of the credible parameter values. The limits used by the QSPDS and RMS to identify "bad" data and the limits used by the three systems to identify "invalid" data were not readily available for review. The algorithms used to validate data and produce maximums, minimums, and averages were also unavailable.

LP&L stated that the update interval for SPDS display of parameter values may be as long as 60 seconds when the displayed value is the average of several inputs. Updating of SPDS parameter trending information is also performed at one minute intervals. The current update intervals are limited by the processing speed of the SPDS hardware and would require either hardware or software improvements to shorten. At the time of the audit LP&L was attempting improvements in software efficiency to reduce the interval required to update parameter average values. This effort has succeeded in reducing the update interval to 15 seconds for some parameters.

As discussed in Section 4.2, considerable time is required to switch from one SPDS display to another. It would require more than five minutes to review all SPDS parameter information including trend data.

LP&L has conducted an availability analysis for the PMC [20] and concluded that the PMC would be available more than 99.7% of the time. The NRC audit team noted a number of differences between the current SPDS system and the system examined by the reliability analysis:

o The availability analysis only considered the PMC; however, the QSPDS and the RMS must also be available for complete SPDS operability.

o At the time of the availability analysis the PMC consisted of two redundant computer systems with two Central Processor Units (CPUs) each. Each of the two redundant systems now has four CPUs that must be operable.

o The PMC has many displays and control keyboards located throughout the control room. All of these would have to fail to make the PMC inoperable. The SPDS has only two display units and one control keyboard; failure of both displays or the single keyboard would make the SPDS inoperable.
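A minimal series-availability calculation, using the 99.7% PMC figure from the LP&L analysis and hypothetical values for the other elements, shows why these differences matter: each additional element that must work lowers the availability of the SPDS function as a whole.

    # All values except the 0.997 PMC figure [20] are hypothetical, chosen
    # only to show the direction of the effect; they are not LP&L data.
    a_pmc = 0.997       # PMC availability from the LP&L analysis
    a_qspds = 0.995     # hypothetical
    a_rms = 0.995       # hypothetical

    a_chain = a_pmc * a_qspds * a_rms
    print(f"PMC + QSPDS + RMS in series: {a_chain:.3f}")        # about 0.987

    # The single SPDS keyboard is one more series element, unlike the PMC's
    # many redundant display/keyboard stations.
    a_keyboard = 0.99   # hypothetical
    print(f"Including the single keyboard: {a_chain * a_keyboard:.3f}")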

4.3.2 Audit Team Assessment

The Waterford 3 SPDS does not completely satisfy the provisions of Supplement 1 to NUREG 0737 regarding rapid, reliable display. The slow system update rates and the long time and involved control actions required to review SPDS data prevent the system from providing a suitably rapid display of safety parameters. As discussed below, the audit team also could not conclude that the data validity checking provisions are adequate to insure that reliable data are provided by the SPDS.

o Data validity algorithms that rely only on checking for data that is within the instrument channel range can result in erroneous average parameter value displays if an input instrument fails on scale.

o In some cases the off scale limits used for data validity checking are well outside of credible parameter values; thus the validity checking would allow use of clearly erroneous inputs.

o The use of normal operating limits as criteria for determining when inputs are questionable could lead operators to incorrectly disbelieve their instrumentation under plant transient conditions.

o The data validation, averaging, maximum determination, and minimum determination algorithms, and the specific setpoints used to identify "bad" and "invalid" data, are not well documented. Therefore, the proper implementation of the algorithms was not readily auditable. Furthermore, given the lack of such documentation, it would have been difficult for LP&L to conduct the level of independent review necessary to insure the correctness of the data manipulation algorithms.

LP&L has attempted to develop a high availability system with considerable internal operational checking and hardware redundancy. However, the system availability calculations presented at the audit are not totally applicable to the existing SPDS system. Therefore, there is no clear basis for determining if the SPDS hardware can be expected to have a high availability for use by the operators. The provision of a single keyboard in the control room represents a clear single failure point for the SPDS.

4.4 "THE PRINCIPLE PURPOSE AND FUNCTION OF THE SPDS IS TO AID THE CONTROL i ROOM PERSONNEL DURING ABNORMAL AND EMERGENCY CONDITIONS IN DETERMINING THE l SAFETY STATUS OF THE PLANT AND IN ASSESSING WHETHER ABNORMAL CONDITIONS l WARRANT CORRECTIVE ACTIONS BY CONTROL ROOM OPERATORS TO AVOID A DEGRADED CORE." l l

4.4.1 Audit Team Observations

The Waterford 3 SPDS provides perceptual cues to abnormal parameter values by changing the color of characters displaying the value of the abnormal parameter. This feature is effective only for parameters contained in the displays called up on the control room CRTs. No perceptual cues are provided to alert operators to abnormal values of parameters that are not shown on the current displays.

t "9'

, - - - - nr-e-r,- pvn-~ws,-*--~r- ,w-m-+--~~- > v v-www~, - - - - - - , - - - -

wvm-v~~~~---n w +---wwme---. ,-----vw-- c-w-~- -~'- '-* * ~

Discussions with plant operators indicated that they did not find the SPDS useful and that they would be more likely to use the existing PMC displays and the hardwired control board readouts to assess abnormal plant conditions.

4.4.2 Audit Team Assessment

The audit team agrees with the plant operators' assessment that the current SPDS does not provide a particularly useful aid to the control room personnel. Thus the Waterford 3 SPDS does not fulfill the operator aid provisions of Supplement 1 to NUREG 0737 in this regard. This conclusion is primarily based upon the following findings that have been further discussed in other sections of this report:

o The displays require considerable time and operator attention to call up. It is probably both faster and easier to obtain an overview of the critical plant safety parameters from the control boards than from the SPDS.

o The display formats are generally cluttered and difficult to understand.

o The trending capability of the PMC is much more useful than the SPDS's trending capability.

o The update intervals for SPDS data are so long that it is difficult for an operator to dynamically follow the actions of a parameter.

"(THE) SPDS (SHALL BE) LOCATED CONVENIENT TO THE CONTROL ROOM OPERATORS" T' 4.5.1 Audit Team Observations The SPDS CRTs and keyboa'd are located next to the control room supervisors station.

l 4.5.2 Audit Team Assessment The Waterford 3 SPDS satisfies the provisions of Supplement 1 to NUREG 0737 regarding convenient location. A considerable improvement would be achieved by mounting the SPDS CRTs on swivels so that displays could be rotated for observation by operators at the main control boards.

4.6 "THE SPDS SHALL CONTINUOUSLY DISPLAY INFORMATION FROM WHICH THE SAFETY STATUS OF THE PLANT . . . CAN BE ASSESSED. . ."

4.6.1 Audit Team Observations As discussed previously five screens of data must be examined in order to review the parameters which LP&L defines as needed to assess the safety status of the plant. No safety staus overview display is available nor are there any provisions for perceptual cues when a SPDS parameter that is not on the current display goes into an alarm condition.

"10*

, l s- l

. 4.6.2 Audit Team Assessment The Waterford 3 SPDS does not continuously display sufficient information for i the assessment of plant safety status.

4.7 "THE SPDS SHALL BE SUITABLY ISOLATED FROM ELECTRICAL OR ELECTRONIC INTERFERENCE WITH EQUIPMENT AND SENSORS THAT ARE IN USE FOR SAFETY SYSTEMS" 4.7.1 Audit Team Observations LP&L indicated that Class 1E isolation devices are used at each interface between Class 1E systems and the SPDS. Type test data for the specific

. isolation devices has been separately provided to the NRC.

4 4.7.2 Audit Team Assessment Detailed review of the isolation provisions was not within the scope of this audit.

4.8 " PROCEDURES WHICH DESCRIBE THE TIMELY AND CORRECT SAFETY STATUS ASSESSMENT

WHEN THE SPDS IS AND IS NOT AVAILABLE WILL BE DEVELOPED BY THE LICENSEE IN PARALLEL WITH THE SPDS. FURTHERMORE, OPERATORS SHOULD BE TRAINED TO

', RESPOND TO ACCIDENT CONDITIONS BOTH WITH AND WITHOUT THE SPDS AVAILABLE."

i 4.8.1 Audit Team Observations LP&L does not intend to have separate safety status assessment procedures for

plant operation with and without SPDS available. Instead the Waterford 3
- procedures identify the important safety parameters to be monitored and the operators will be trained to perform this monitoring function both with and without the SPDS. This training will be incorporated into the operator qualification and requalification training. NRC audit team discussions with

- LP&L training personnel indicated that confusion exists regarding who among I the operating crew will be considered primary SPDS users.

- LP&L has also not yet developed procedures to insure that consistency is maintained between the SPDS and Waterford 3 procedures.

{ 4.8.2 Audit Team Assessment 4

LP&L provisions for training will fulfill the operator training provisions of Supplement 1 to NUREG 0737 if a clear philosophy about SPDS use is developed 4

and incorporated into the training. Formal controls aust also be established to ensure that the SPDS, SPDS training, and Waterford 3 procedurfs remain mutually compatible.

l 4.9 "THE SPDS DISPLAY SHALL BE DESIGNED TO INCORPORATE ACCEPTED HUMAN FACTORS

{ PRINCIPLES SO THAT THE DISPLAYED INFORMATION CAN BE READILY PERCEIVED AND COMPREHENDED BY SPDS USERS."

i 11 4

--e - +-.%-- ,-,,..c..-~.ww...m.--...--... ,,.,-.w,,--e_,.,,._..,.-,.e,,%weey,,_,.%,.-%.,.+--e__-e.--, y,,---,,,w.m-- . rw e,

~

4.9.1 Audit Team Observations

LP&L indicated that the human factors principles of NUREG 0700 and NUREG 0835 were incorporated into the system design. Additionally, a human factors review of the Waterford 3 PMC, including the SPDS, was conducted and another one is planned as part of the closeout of the DCRDR effort. Nevertheless, the NRC audit team noted that the SPDS is not well integrated into the control room or operator tasks. Additionally, the system design does not conform with good human factors practices in many respects. The specific problems have been discussed in previous sections of this report.

Many of the specific human engineering discrepancies noted by the NRC audit team were also noted by previous human engineering reviews by LP&L and LP&L consultants [21]. LP&L, however, chose not to take action on the items identified by the human engineering reviews.

LP&L did indicate that as part of the DCRDR review of the SPDS, the abbreviations used by the SPDS will be made consistent with the abbreviations list in the Waterford 3 human factors manual.

4.9.2 Audit Team Assessment

The Waterford 3 SPDS does not completely satisfy the provisions of Supplement 1 to NUREG 0737 with regard to incorporation of Human Factors Engineering principles in the system design.

5.0 CONCLUSIONS

The Waterford 3 SPDS meets neither the intent nor the letter of the SPDS provisions of Supplement 1 to NUREG 0737. This conclusion is based upon the many specific deficiencies noted by the NRC audit team. As a minimum, the following deficiencies must be corrected to produce a system that satisfies the intent of Supplement 1.

o The existing system does not provide a concise display of the information needed to assess the safety status of the plant. At least five screens of data must be reviewed by the operators to determine the status of the primary SPDS parameters identified by LP&L. F urthermore ,

'in order to determine the status of containment isolation, the operators must examine hardwired indications located on several control boards, o The Waterford 3 SPDS does not provide continuous indication of information from which plant safety status can be assessed. There is no single display that can be used to determine plant safety status and no cues are provided to alert the operator to changes in the status of critical safety functions that cannot be evaluated using the display currently on the SPDS CRTs.

o The data validation techniques are limited in scope and in some cases incorporate validity checking criteria that would result in readings outside of the credible parameter range being interpreted as valid.

  • 12'

s o The parameter update interval of 60 seconds is far too slow to allow the operator to dynamically follow the actions of a parameter using the SPDS.

o LP&L has not established that the SPDS system may be expected to be available for operator use during a large percentage of plant operati ons .

o The Waterford 3 SPDS does not provide the operator aid intended by Supplement 1 to NUREG 0737. Basically the system is inconvenient to use, the data displayed is not trusted, and the system is too slow both in the updating of data on the screens and in the access of displays.

NRC audit team conversations with plant operators indicated that this assessment is shared by at least some on Waterford's operating staff.

o Good human f actors practices were not consistently applied to the SPDS. Many of the problems noted above fall into this category. In addition:

o The system is not well integrated into the the control room or operator tasks.

o Considerable operator time and attention is required to access displays.

o Many of the SPDS displays are cluttered and difficult to read, o The system seems to confuse data validity flags with alarming of conditions that are outside of the normal operating range.

i' These are not problems that can be solved by simple modifications to the existing system. LP&L needs to completely reevaluate the Waterford 3 SPDS starting with a clear definition of SPDS functional requirements. Further, the problems identified in this report probably do not represent an exhaustive list of the deficiencies with Waterford's SPDS. The NRC effort was only an audit and not an extensive design review that could be expected to develop a complete list of deficiencies.

During the reevaluation process LP&L should be careful not to limit the scope of the evaluation to the functions performed by the SPDS preprocessor / display generator. The PMC, the QSPDS, and the RMS all perform critical portions of the SPDS f unction and thus must be considered as integral parts of the SPDS.

Given the significant shortcomings of the Waterford 3 SPDS design it is clear that LP&Ls design process did not incorporate a sufficiently riforous Verification and Validation program. This finding is further supported by the inability of the NRC audit team to identify specific SPDS functional requirements that had been developed by LP&L and to track these requirements through the design and testing phases of the project. The audit team did find that extensive field testing was conducted for the PMC and SPDS. However, without clearly defined performance requirements, the relevancy of this testing could not be established.

'13'

The NRC audit team does not believe any benefit would be derived from retrofiting a V&V process to the existing SPDS. Instead it is recommended that LP&L SPDS redesign efforts incorporate a thorough and rigorous V&V program in order to insure that an acceptable and useful system is developed. The guidance of NSAC/39 is recommended as a source of criteria for a V&V program, i-O e

W

  • 14-

f . l

. 9

6.0 REFERENCES

6.1 GENERAL REFERENCES

1. U. S. Nuclear Regulatory Commission, NUREG-0737, "Clarification of TMI Action Plan Requirements," November 1980; Supplement 1, December 1982.

2. U. S. Nuclear Regulatory Commission, NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants," Section 18.1, Control Room, Rev. 0.

4. U. S. Nuclear Regulatory Commission, NUREG-0700, "Guidelines for Control Room Design Review," September 1981.

5. U. S. Nuclear Regulatory Commission, NUREG-0835, "Human Factors Acceptance Criteria for the Safety Parameter Display System."

6. U. S. Nuclear Regulatory Commission, NUREG-0696, "Functional Criteria for Emergency Response Facilities," February 1981.

7. "Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant and Environs Conditions During and Following an Accident," Regulatory Guide 1.97, Rev. 2, Nuclear Regulatory Commission, Office of Standards Development, December 1980.

6.2 DOCUMENTS EXAMINED DURING AUDIT

8. 1305-101C, "Control Specification for Emergency Response Facility," May 24, 1982.

9. LP&L Memo, PMC 84-0041, W. M. Alphonso to Distribution, "Design Document for the Plant Monitor Computer System," March 23, 1984.

10. W3-DDS-011, "Safety Parameter Display System Computational Software Detailed Design Specification Data, Data Base Specification," February 21, 1984.

11. PE-3-012, Rev. 0, "Administrative Procedure Computer Software Control Implementation," April 5, 1982.

12. PE-3-015, Rev. 0, "Engineering Procedure Computer Acquisition," November 23, 1982.

13. PE-3-027, Rev. 0, "Plant Engineering Computer Controls Procedure Computer Software Walkdown Review," February 16, 1983.

14. NS20089 SPG, Rev. 4, "LP&L Waterford 3 SES Plant Monitor Computer System Description," September 4, 1985.

15. SPO-50D-001, Rev. 0, "Preoperational Test of Qualified Safety Parameter Display System Including Test Results," November 2, 1983.

16. PML83-0081, "Plant Monitor Computer Safety Parameter Display System Functions," August 11, 1983.

17. OP-902-004, Rev. 2, "Emergency Operating Procedure, Excess Steam Demand Recovery Procedure," March 8, 1985.

18. W3-DBS-011, "Safety Parameter Display System Computational Software Data Base Specification," August 1983.

19. W3-FDS-011, "SPDS Computer Software Functional Design Specification," December 1983.

20. "Availability Studies for Waterford 3 Plant Computer System," E. Gai, R. B. Gounley, J. V. Harrison, W. W. Weinstein, August 1981.

21. "Human Factors Engineering Review of the Waterford 3 SES Process Monitoring Computer," Essex Corporation.

22. "Safety Parameter Display System (SPDS), Waterford SES Unit 3, Louisiana Power and Light Company," Transmittal letter W3P34-1007, dated April 16, 1984.

23. "Validation Test Plan," no date.

24. "Safety Parameter Display System (SPDS) Validation Test Procedure" (Draft), no date.

25. LP&L Memo, W3P85-2433, M. J. Meisner to File, "SPDS Background," dated September 13, 1985.

26. 231-ICE-3218, Rev. 1, "Functional Design Description for the Qualified Safety Parameter Display System for Waterford Steam Electric Station, Unit Number 3," Combustion Engineering, April 11, 1984.

TABLE 1
SAFETY FUNCTION PARAMETERS
WATERFORD GENERATING STATION, UNIT 3

Critical Safety Function: Reactivity Control
o Neutron Flux
o Control Element Assembly Position
o RCS Boron Concentration
o Charging Pump Flow Rate

Critical Safety Function: Reactor Core Cooling and Heat Removal
o Pressurizer Level
o Pressurizer Pressure
o Saturation Margin
o Reactor Vessel Water Level
o Core Exit Temperature
o Hot Leg Temperature
o Reactor Coolant Pump Current
o Safety Injection Flow
o Steam Generator Water Level
o Steam Generator Pressure
o Steam Flow Rate
o Main Feedwater Flow Rate
o Emergency Feedwater Flow Rate
o Shutdown Cooling Heat Exchanger Flow Rate
o Shutdown Cooling Heat Exchanger Temperature

Critical Safety Function: RCS Integrity
o Pressurizer Level
o Pressurizer Pressure
o Containment Sump Level
o Pressurizer to Quench Tank Line Temperature
o Quench Tank Level
o Quench Tank Pressure
o Secondary Side Radiation

Critical Safety Function: Radioactivity Control
o Containment Atmosphere Radiation
o Main Steam Line Radiation
o Condenser Air Ejector Radiation
o Plant Stack Radiation
o Liquid Waste Management Discharge Radiation

Critical Safety Function: Containment Conditions
o Containment Pressure
o Containment Temperature
o Containment Spray Flow
o Containment Fan Cooler Differential Pressure
o Containment Hydrogen Concentration