ML20205B131
| ML20205B131 | |
| Person / Time | |
|---|---|
| Site: | Harris |
| Issue date: | 04/17/1985 |
| From: | O'Connell W, Lawrence Livermore National Laboratory |
| To: | |
| Shared Package | |
| ML18019A449 | List: |
| References | |
| NUDOCS 8508150198 | |
| Download: ML20205B131 (30) | |
Text
TECHNICAL EVALUATION REPORT OF THE SAFETY PARAMETER DISPLAY SYSTEM

Shearon Harris Nuclear Power Plant
Carolina Power and Light Company

William J. O'Connell
Lawrence Livermore National Laboratory
April 17, 1985
1. BACKGROUND

All holders of operating licenses issued by the Nuclear Regulatory Commission (licensees) and applicants for an operating license (OL) must provide a Safety Parameter Display System (SPDS) in the control room of their plant. The Commission-approved requirements for the SPDS are defined in Supplement 1 to NUREG-0737 (Reference 1). The SPDS requirements are specified in Section 4.1 of Supplement 1 to NUREG-0737. Documentation and NRC review of the SPDS are specified in Section 4.2 of Supplement 1 to NUREG-0737. Further guidance on SPDS requirements and on NRC review is provided in NUREG-0800, the Standard Review Plan (SRP), Section 18.2 and Appendix A to Section 18.2 (Reference 2).
The purpose of the SPDS is to provide a concise display of critical plant variables to control room operators to aid them in rapidly and reliably determining the safety status of the plant. NUREG-0737, Supplement 1, requires licensees and applicants to prepare a written safety analysis describing the basis on which the selected parameters are sufficient to assess the safety status of each identified function for a wide range of events, which include symptoms of severe accidents. Licensees and applicants shall also prepare an implementation plan for the SPDS which contains schedules for design, development, installation, and full operation of the SPDS, as well as a design verification and validation plan. The SPDS safety analysis and implementation plan are to be submitted to the NRC for staff review.
The results of the staff's review are to be published in a Safety Evaluation Report (SER). The SER evaluates whether the SPDS meets the requirements in Section 4.1 of NUREG-0737 Supplement 1 and whether the changes to implement the SPDS involve an unreviewed safety question or a change in the technical specifications. The human factors scope of the staff's review is described in Section I of NUREG-0800, Section 18.2. This scope is limited to the principal function of the SPDS, which is to aid control room operators in determining the safety status of the plant as described in Paragraph 4.1.a of NUREG-0737 Supplement 1.
The scope of the SPDS review does not include secondary functions which may be placed in the SPDS, such as presentation of additional data to assist operators with diagnosis of abnormal conditions. Also, the scope of the SPDS review does not include other functions that may be incorporated in an integrated computer system which supports the SPDS function, except to verify that the other functions will not degrade the reliability or performance of the SPDS.
This Technical Evaluation Report (TER) supports the NRC Human Factors Engineering Branch (HFEB) review of the SPDS at the Shearon Harris Nuclear Power Plant (SHNPP). It does not include review areas assigned to other branches of NRC (see Reference 2). This TER includes human factors elements for the following SPDS topics:
1. Implementation plan and activity.

2. Verification and validation (V&V) plan and activity.

3. Data validation.

4. The requirements in Section 4.1 of NUREG-0737 Supplement 1.
2. INTRODUCTION

Carolina Power and Light Company (CP&L) submitted a letter to NRC on April 15, 1983 (Reference 3), responding to NRC Generic Letter 82-33 on requirements for emergency response capability. This CP&L letter submitted brief plans and schedules for emergency response capabilities, including the SPDS, for their Shearon Harris Nuclear Power Plant. The Shearon Harris NPP is now under construction and CP&L plans to load fuel in 1986.
CP&L submitted a safety analysis for the Shearon Harris Safety Parameter Display System on December 2, 1983 (Reference 4). This submittal describes the SPDS with regard to the set of plant process variables selected for the SPDS and the adequacy of the selected variables to assess the status of each critical safety function of the plant. The submittal also provides preliminary information about the top-level and second-level SPDS displays which will convey information to the control room operators on the status of the critical safety functions. In addition, the submittal addresses data validation and the display of data quality status.
The NRC staff conducted an on-site design verification audit of the Shearon Harris SPDS. The NRC audit was carried out on March 5-7, 1985, under the lead of the Human Factors Engineering Branch (HFEB), with participation of staff members from the Instrumentation and Control Systems Branch (ICSB) and the Procedures and Systems Review Branch (PSRB) and a consultant to the NRC from Lawrence Livermore National Laboratory. During the NRC audit, personnel were available from CP&L and from Science Applications International (SAI), CP&L's prime contractor for the SPDS hardware and software system and the related SPDS safety analysis.
3. SPDS IMPLEMENTATION PLAN AND ACTIVITY

The Shearon Harris SPDS implementation plan is in a form which is not easily amenable to internal or external review. The SPDS project is being executed and a prototype of the system has been developed. Thus, part of the implementation plan is completed. During the design verification audit, through discussion and examination of documents, the NRC audit team was able to assess the execution of the implementation plan. Several elements of the plan have been recently expanded or added and are not yet documented.
The implementation plan consists of the following docketed document:

1. Letter from E. E. Utley, CP&L, to Harold R. Denton, NRC, April 15, 1983, on Shearon Harris NPP, Docket Numbers 50-400 and 50-401, Requirements for Emergency Response Capability, including one-third of a page summarizing the SPDS planning and schedule;

and of the following undocketed design documents, available during the design verification audit:

2. "ERFIS (Emergency Response Facility Information System) Functional Requirements" document, by CP&L.

3. "ERFIS Functional Technical Specification," Volume 1, by SAI.

4. A functional technical specification combining (2) and (3) above; in printing and not yet available at the time of the audit.

5. Milestone charts used by CP&L for internal planning and control of the SPDS project.
Because of the recent additions or changes to the implementation plan, we recommend that CP&L submit a revised SPDS Implementation Plan to the NRC. More specific recommendations on its contents are given below in Section 3.4.
3.1 SPDS Displays
During the design verification audit, the NRC audit team was told that the emergency operating procedures (EOPs) development provides the SPDS design program with key inputs:
(1) the critical safety function (CSF) status trees which guide the operators in the sequence of data observations necessary to determine the status of the CSFs and the need for corrective actions, i.e., the EOPs;

(2) the set of parameters needed to check the CSF status; and

(3) the graphical tree format to display the CSF status trees and current status.
The CP&L Operations Department is responsible for specifying the SPDS display formats (content and visual arrangement). This work is being done by an operations person involved with the Westinghouse Owners Group (WOG) development of emergency response guidelines (ERGs) and the CP&L development of EOPs based on the ERGs; a shift supervisor; and a shift technical advisor who is a former operator. The top-level SPDS display is a set of CSF status blocks that are color coded for safety status. The second-level display selected by CP&L for each CSF is the CSF status tree taken from the WOG ERGs Revision 1.
The top-level and second-level SPDS displays are described in the CP&L Shearon Harris SPDS Safety Analysis. Third-level displays are being developed by the CP&L Operations Department team by looking at what parameters, trends, limit values, etc., the operators will need to carry out the procedures. We find that this approach provides good integration of the SPDS with the EOPs.
The human factors element of the displays design and review will be discussed in Section 3.2 below. The verification and validation of the final displays will be discussed in Section 3.3 below.
CP&L stated during the design verification audit that when the set of SPDS displays is finalized and installed, any proposed change to this set of displays will be subject to control by the plant change procedure. This control procedure implies that there will be a final set of SPDS displays that will be reviewed for acceptability on the basis of completeness for operational needs and consistency with human factors criteria. We find it is a good practice to maintain a fixed base of standard SPDS displays that will be compatible with training needs and ease of operation.
We noted, from examining a prototype SPDS at the Shearon Harris plant site, that it is possible to change elements of a display, e.g., the range or the parameter displayed. If operator change of displays is to be allowed, it should (1) be written up in the SPDS operating procedures; (2) be apparent in the display; and (3) apply to that one time only, and not when the display is called up again.
It is also possible to create new displays through a display-keyboard console and to store the display template for later recall. This can be done by calling up an existing template for a bar chart, trend chart, or X-Y plot, and changing the names and numbers of the variables to be displayed, as well as changing the display scales and units. This revised template can be stored under a new storage name for later recall. If this display creation and storage are to be allowed by the SPDS computer software, then we recommend that the following guidelines be followed to preserve the standard set of displays and operations and to avoid confusion:
1. No changes to the program software should be allowed.

2. No changes to or overrides of the stored standard SPDS displays should be allowed. This can be ensured by software and data media protection methods.

3. The method of display recall should be different than that for the standard SPDS displays.

4. The capability to create new displays and the related administrative controls should be written up in the SPDS operating procedures.

5. Storage of additional displays should be controlled by appropriate acceptance standards, which may be different from the SPDS standards.
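The separation recommended in guidelines 2 and 3 above can be illustrated with a short sketch. All class, method, and display names here are hypothetical; the actual ERFIS software was not examined at this level of detail, so this is only a model of the recommended behavior, not the vendor's implementation.

```python
# Illustrative sketch of guidelines 2 and 3: standard SPDS displays are
# held in protected storage with their own recall path, and operator-
# created templates are stored separately and may never shadow them.
# All names are hypothetical.

class DisplayLibrary:
    def __init__(self, standard_templates):
        # Standard displays are loaded once; no method exists to
        # modify or overwrite them afterward (guideline 2).
        self._standard = dict(standard_templates)
        self._user = {}   # operator-created templates, kept separately

    def recall_standard(self, name):
        # Guideline 3: a distinct recall method for standard displays.
        return self._standard[name]

    def store_user_template(self, name, template):
        # Refuse any name that would shadow a standard display.
        if name in self._standard:
            raise ValueError("cannot override a standard SPDS display")
        self._user[name] = template

    def recall_user(self, name):
        return self._user[name]
```

Under this model, an operator can save a revised trend template under a new name for later recall, while the fixed base of standard displays remains intact for training and routine operation.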
3.2 Functional Requirements and Functional Specifications in the SPDS Implementation

The CP&L Functional Requirements document and the SAI Functional Technical Specifications document for the Shearon Harris SPDS address primarily the hardware and software system for the SPDS, including data acquisition and signal conditioning, computer and peripheral hardware, and software. These documents indicate a well-prepared planning and procurement activity for most of these aspects of the SPDS. These documents, however, do not give an adequate level of attention to human factors principles to allow development of design guidance. These documents also do not give adequate specifications for the set of SPDS displays upon which to base development of detailed design and test and acceptance criteria.
The functional requirements and specifications documents for the Shearon Harris SPDS have human factors requirements for several elements of the SPDS, but do not call for a high-level human factors participation in the system design effort and do not specify a set of human factors principles to be used as a basis for developing design guidance and test and acceptance criteria. The documents do specify that the keyboard-visual display terminals shall be easy to use for a diverse set of users. The sit-down consoles have specifications so as not to block the line of sight to the vertical boards of the main control room, along with size and ergonomics considerations. The visual display units have specifications on resolution and pixels per inch, which are relevant to display readability.
Based on the NRC audit team's independent assessment of a prototype SPDS display at the plant site, there appears to have been inadequate attention paid to the human factors discipline in the development of the displays and to display usage via the keyboard. Further details on this matter are in Appendix A of this report.
During the design verification audit, the NRC audit team learned that the displays will be reviewed for human factors as part of the detailed control room design review (DCRDR). We believe that such review will help assure that the final displays' design meets human factors requirements. We recommend that a human factors specialist be brought into the design stage as well as the review stage. This will provide more extensive consideration of human factors issues at the design stage. Attention should be given not only to legibility, internal consistency, and consistency with population stereotypes and conventions, but also to information transfer and cognitive processing needs. If the human factors review is done as part of the DCRDR, then the SPDS Implementation Plan should note this plan and provide reference to the specific DCRDR document which provides for planning the review and resolution of findings.
The functional requirements and specifications documents for the Shearon Harris SPDS specify three levels of SPDS displays, but not in enough detail to establish design guidance or acceptance criteria for a test plan. The documents do not describe the assignment of responsibilities for display definition between SAI and CP&L's Operations Department (see Section 3.1 above).
3.3 Elements Not Included in the Requirements and Specifications Documents

As noted earlier, the CP&L Functional Requirements document and the SAI Functional Technical Specifications document for the SPDS address primarily the hardware and software system for the SPDS. These documents do not specify:

1. Operator training in emergency operations both with and without the SPDS available;

2. Integration of hardware, displays, procedures, and training; and

3. The high-level functionality of the SPDS, i.e., its usefulness to the operators, as required by Paragraphs 4.1.a and 4.1.e of NUREG-0737, Supplement 1.
These items are also important later in V&V, because only goals and functions which have been identified can be included in the plan for verification and validation.
During the design verification audit, the NRC audit team was told that the revised EOPs and the CSF status tree graphical formats were put through a V&V program at CP&L's Robinson NPP in 1983. The EOP flow paths and CSF status tree formats for the Shearon Harris NPP have been through a V&V program at the Shearon Harris simulator. The CSF status trees, in poster format, were used in place of the future computerized visual display format. This simulator program used one operator crew from the Robinson plant and one operator crew from the Shearon Harris plant. We recommend that this program to validate the CSF status tree formats be mentioned in the SPDS Implementation Plan and described in the SPDS V&V plan, or described by reference to available documentation within programs other than SPDS.
During the audit, the NRC audit team learned that CP&L is planning to do integrated tests of their NUREG-0737-related emergency response initiatives on the Shearon Harris plant simulator, which will include an SPDS. If suitably planned and executed, such tests can validate the high-level functional goals of the SPDS while using control room operators in the simulation. We recommend that a human factors specialist be included in the planning and observation of the integrated simulator tests, so that the human factors concerns are explicitly and adequately addressed as part of the tests.
The SPDS Verification and Validation Plan (see Section 4 below) considers feedback, review, and resolution of V&V findings to be part of the development program rather than part of the V&V program. From the implementation planning documentation to date, we are uncertain how the SPDS development will handle this feedback. We recommend that the revised SPDS Implementation Plan call out this review and resolution as an explicit activity in the SPDS implementation.
3.4 Revision of SPDS Implementation Plan

CP&L appears to now have the major activities and verification steps for the Shearon Harris SPDS in progress or planned. Because of the recent changes or additions to CP&L's Shearon Harris SPDS implementation planning, which we learned of during the design verification audit, and because some of these changes have not yet been written down in an overview form, we recommend that CP&L submit a revised SPDS Implementation Plan to the NRC. The revised plan should summarize the goals; the implementation activities in design, development, testing, and installation; and the criteria, or references to criteria, which will be used as guidance for design and for the test plan. Those elements already completed should be described briefly. Those requirements or sub-requirements of an SPDS which will be implemented within other emergency response initiatives, e.g., EOPs or DCRDR, may be documented by reference to the appropriate documents of the other initiatives.
4. VERIFICATION AND VALIDATION PLAN AND ACTIVITY

During the design verification audit, the NRC audit team examined:

"Verification and Validation Plan and Procedures for CP&L Shearon Harris Nuclear Power Plant Emergency Response Facility Information System," by SAI, April 16, 1984,

and discussed V&V with CP&L and SAI staff. The Shearon Harris SPDS V&V plan document covers the hardware and computer software, since that is what is covered in the SPDS system specification document.
Topics outside the scope of, or addressed to only a limited extent in, the specification documents have been noted above in Sections 3.2 and 3.3. The topics of concern are specifically:

1. Human factors principles;

2. Final set of SPDS displays;

3. Operator training in emergency operations both with and without the SPDS available;

4. Integration of hardware, displays, procedures, and training; and

5. The high-level functionality of the SPDS, i.e., its usefulness to the operators.
As noted in Section 3.3, CP&L is planning reviews and validation tests in these areas. These plans should be mentioned in the SPDS Implementation Plan. The plans should be described and submitted to the NRC in appropriate documentation--either in a second volume of the V&V Plan or in the documentation of the initiatives, such as the DCRDR or the EOPs validation, which have the assignment to review and validate these SPDS functional requirements. In the latter case, the submitted SPDS documentation should provide references to the other programs' documentation. We note that the latter approach of assigning parts of the SPDS review to other emergency response initiatives can have a benefit by enhancing the integration of these related programs.
The V&V plan for the hardware and computer software is well structured and documented, and should, if implemented properly, provide adequate V&V for those subsystems.
A point of information to be noted is that the V&V activity does not replace project quality control, quality assurance, or integration testing. Rather, the V&V effort takes an approved product -- a specification, detailed design documents, or implemented product -- and performs an independent review and assessment. Concerning final product testing, validation develops an independent description of testing needs, test plan, test environment, and test contents. Then the V&V effort reviews the developer's integration test plan and observes the integration tests. If the integration test checks all the functional requirements of the integrated system, then this test may suffice as the validation test. If deemed necessary, V&V specifies additional tests. This approach, which maintains independent V&V review, is consistent with the SPDS V&V approach described in NSAC-39 (Reference 5) and is adequate to achieve a good V&V of the final product.
The V&V tasks will result in the following V&V reports from CP&L's contractor SAI to CP&L:

1. SPDS V&V Plan

2. SPDS System Requirements Verification Report

3. SPDS System Design Verification Report

4. SPDS System Validation Test Plan

5. SPDS Final V&V Report

The SPDS final V&V report is to be done after the V&V team can examine the as-built system report and observe the site acceptance test.
The SPDS System Requirements Verification Report was completed at the time of the NRC design verification audit, but was not reviewed as part of the audit. We recommend that this report be reviewed by the NRC at a future design validation audit or installation audit. This review item might be covered without a site visit if all relevant documents are submitted to the NRC for review. We recommend that the NRC review of the SPDS System Requirements Verification Report and of further results of the V&V activity examine:

1. What the V&V review findings are concerning the SPDS system requirements documents reviewed by the V&V activity. The V&V findings may be compared with the findings of the NRC design verification audit as described in this report.

2. Whether changes were made to the SPDS system requirements documents as a result of the V&V findings.

3. How the SPDS program handles the resolution of the V&V findings.

4. How the SPDS program documents the resolution of the V&V findings, both when the resolution involves changes to the SPDS and when the resolution involves no change to the SPDS.
5. DATA VALIDATION AND DISPLAY

CP&L's Safety Analysis Report for the Shearon Harris SPDS (Reference 4) briefly discusses data validation. It notes that the SPDS checks for signals of bad quality, discards them, and uses the remaining redundant signals with a reduced logic or by averaging, as appropriate for the EOPs. If there are no signals of good quality, the SPDS display indicates this with a white color code in the following ways:
1. The status block for that CSF (included in every SPDS display) is shown in white.

2. The current status path through the CSF status tree (if displayed) is carried only as far through the tree as is possible using valid data, i.e., up to the point where the invalid data would be needed in the sequence of decisions on status; and the status path up to that point is displayed in white.

3. The invalidated parameter value (if displayed) is in white.
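The status-path behavior in item 2 can be sketched in a few lines. This is only an illustrative model of the described behavior; the node structure, parameter names, and color values are assumptions, not the actual ERFIS data structures.

```python
# Illustrative sketch (assumed names, not the ERFIS software) of carrying
# the current status path through a CSF status tree only as far as valid
# data allows, and flagging the path white when invalid data is reached.

WHITE = "white"   # color code indicating invalid / unavailable data

def trace_status_path(node, readings):
    """node: nested dict with 'param', 'setpoint', and 'yes'/'no' branches.
    readings: {parameter: (value, valid_flag)}.
    Returns (list of parameters on the displayed path, path color)."""
    path = []
    color = "normal"
    while node is not None:
        value, valid = readings.get(node["param"], (None, False))
        if not valid:
            # Stop where the invalid datum would be needed in the
            # sequence of decisions; the partial path is shown in white.
            color = WHITE
            break
        path.append(node["param"])
        branch = "yes" if value > node["setpoint"] else "no"
        node = node.get(branch)
    return path, color
```

For example, with a valid reactor coolant pressure reading but an invalid pressurizer level, the traced path stops after the pressure node and is returned with the white color code.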
During the design verification audit, the NRC audit team learned some of the checks done within the SPDS to determine the data quality. At the data multiplexer, checks are done to detect:
1. An open thermocouple.

2. Data out of the possible range of the transducer or amplifier/converter.

At the SPDS computer, checks are done as appropriate to determine:

3. Disagreement among redundant sensors.

4. Data outside a specified reasonable range.

5. Data rate of change above a specified reasonable limit.
If some sensors are invalid by any test and other sensors for the variable are valid, this sensor status information is recorded and is available for a data display.
We found that the display indications for invalid data are clear and adequate. Further, the status path part way through the status tree shows the operator where the problem of invalid data begins to affect the sequence of decisions about the plant safety status.
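The checks and the reduction of redundant signals described in this section can be summarized as a two-stage sketch: per-signal range and rate-of-change checks, then averaging of the surviving redundant signals. The function names and limit values are illustrative assumptions, not taken from the ERFIS design documents.

```python
# Hedged sketch of the two-stage data validation described above.
# Limits and names are illustrative assumptions.

def check_signal(value, prev_value, dt, lo, hi, max_rate):
    """Return True if the reading passes the range and rate checks."""
    if not (lo <= value <= hi):
        return False      # outside the specified reasonable range
    if prev_value is not None and abs(value - prev_value) / dt > max_rate:
        return False      # rate of change above the reasonable limit
    return True

def reduce_redundant(readings):
    """Discard signals flagged invalid and average the remainder.
    readings: list of (value, valid) pairs. Returns (value, quality);
    quality 'invalid' would drive the white color code on the display."""
    good = [v for v, ok in readings if ok]
    if not good:
        return None, "invalid"
    return sum(good) / len(good), "valid"
```

A disagreement check among redundant sensors (check 3) would sit between these two stages, flagging outlier channels before the averaging step.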
6. NUREG-0737 SUPPLEMENT 1 REQUIREMENTS

This section reviews the planned Shearon Harris SPDS against the requirements in NUREG-0737 Supplement 1, Section 4.1. The requirements are quoted or summarized in the following discussion. Underlining of key phrases in the requirements is taken from the analysis in the Standard Review Plan, Section 18.2, Appendix A (Reference 2).
Paragraph 4.1.a "The SPDS should provide a concise display of critical plant variables to the control room operators to aid them in rapidly and reliably determining the safety status of the plant. Although the SPDS will be operated during normal operations as well as during abnormal conditions, the principal purpose and function of the SPDS is to aid the control room personnel during abnormal and emergency conditions in determining the safety status of the plant and in assessing whether abnormal conditions warrant corrective action by operators to avoid a degraded core. This can be particularly important during anticipated transients and the initial phase of an accident."
The proposed SPDS does provide a concise display of the selected variables. The Procedures and Systems Review Branch (PSRB) will review whether the set of selected plant variables is sufficient. The selected variables are displayed at one console. In the secondary displays for each CSF, the variables are tabulated and the status, whether above or below set points related to safety, is shown in the status tree. In third-level displays, trends of variables are shown. Thus, the operators can use third-level displays to see whether parameters are approaching set points.
The displayed data are current; data screens once displayed are updated on a two to three second cycle. Reliability of the operator's determination of safety status is aided by data validation. Data validation is performed and invalid status is displayed by a white color code.
The SPDS should aid the control room operators in "rapidly and reliably determining the safety status of the plant" and in "assessing whether abnormal conditions warrant corrective action by operators to avoid a degraded core". In general, the proposed SPDS will do this well. The displays indicate current status and are closely related to the symptom-oriented EOPs.
There are, however, still several open questions:

1. Display content and format. For the operators to fully determine the safety status during symptoms of off-normal conditions and during novel combinations of conditions, certain displays may be needed (e.g., trends, two or three related trends shown together). The displays are not finalized, and acceptance criteria (operations and task-related criteria, human factors criteria) are not yet detailed enough to assure adequate development.
2. Rapidity. In the prototype SPDS it can take up to 30 seconds to get a desired third-level display (see Appendix A of this report); this seems too slow. The system has the capability to provide faster response, 6 to 10 seconds at most. Whether the programming and display construction will realize this is an open question.
3. Aid to the control room operators in determining status. This requirement parallels that of Paragraph 4.1.e and will be discussed under that paragraph below.
Paragraph 4.1.b "Each operating reactor shall be provided with a Safety Parameter Display System that is located convenient to the control room operators. This system will continuously display information from which the plant safety status can be readily and reliably assessed by control room personnel who are responsible for the avoidance of degraded and damaged core events."

Dedicated SPDS displays will be provided in the control room at a sit-down console and at one display on the control board. Another control board display may optionally be used for SPDS display. The dedicated displays provide a continuous display of SPDS information. The Shearon Harris NPP DCRDR will review questions of glare from lighting, obstruction of vision and movement, and other factors of the convenience of the SPDS location. We conclude that the requirement of (4.1.b) will be satisfied.
Paragraph 4.1.c This requirement calls for electrical isolation and for procedures and operator training for assessment of safety status and response to accident conditions both with and without the SPDS available.
Suitability of electrical and electronic interference isolation will be reviewed by the Instrumentation and Control Systems Branch. The NRC audit team learned that the operators will be trained to respond to accident conditions both with and without the SPDS available. CP&L plans tests on the Shearon Harris simulator with SPDS to validate the combined emergency response initiatives, including control room enhancements, SPDS, EOPs, and training. A summary of the test plans and how they relate to validating SPDS functions should be documented in the SPDS V&V Plan and SPDS System Validation Test Plan. The needed documentation may be by reference to documentation of other emergency response initiatives where appropriate.
Paragraphs 4.1.d and 4.1.f These sections relate to the minimum requirements on information and parameters to be displayed by the SPDS. These requirements will be reviewed by PSRB.
Paragraph 4.1.e "The SPDS display shall be designed to incorporate accepted human factors principles so that the displayed information can be readily perceived and comprehended by SPDS users."
1. Human factors principles.

a. A human factors specialist should be included in the design as well as in the review of the displays. The human factors objectives under consideration include information transfer and design for ready comprehension by the SPDS users.
b. CP&L plans to do human factors review of the SPDS displays as part of the DCRDR. We recommend that CP&L document this assignment of review by noting it in the SPDS Implementation Plan and by providing reference to the DCRDR documentation that incorporates the SPDS display review. The references may be provided in the SPDS Implementation Plan or in the SPDS V&V Plan. Human factors review should include the keyboard use, i.e., how displays are called up.
c. Human factors principles for the SPDS hardware, software, and generic display formats should be incorporated in the system specification documents in a more comprehensive way. The corresponding acceptance criteria should be incorporated into the SPDS System Validation Test Plan.
2. Ready perception and comprehension by SPDS users.

a. As just noted in (1.a) above, this goal should be incorporated into the human factors considerations in design of the set of displays.
b. CP&L plans to do integrated tests on their simulator with SPDS, using simulations of emergency conditions and using operator crews. Such tests could validate satisfaction of the requirement that the displayed information can be readily perceived and comprehended by SPDS users.
We recommend that a human factors specialist be included in planning and doing the tests, and that the tests explicitly address the above SPDS requirement. The SPDS Implementation Plan or System Validation Test Plan should document the test goals and planning by summary or by reference to other existing documentation.
REFERENCES

1. U.S. Nuclear Regulatory Commission, NUREG-0737, "Clarification of TMI Action Plan Requirements", November 1980; Supplement 1, "Requirements for Emergency Response Capability", December 1982.
2. U.S. Nuclear Regulatory Commission, NUREG-0800, "Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants", Section 18.2, "Safety Parameter Display System (SPDS)", Rev. 0, November 1984; and Appendix A to SRP Section 18.2, "Human Factors Review Guidelines for the Safety Parameter Display System (SPDS)", Rev. 0, November 1984.
'3.
Letter from E. E. Utley, Carolina Power and Light Co. (CP&L), to H. R.
Denton, NRC, "Shearon Harris NPP; NRC Generic Letter 82-33, Requirements for Emergency Response Capability; Plans and Schedules for Emergency Response Capabilities", April 15, 1983.
4.
Letter to: Harold R. Denton, NRC, from M. R. McDuffie, Carolina Power
and Light Company;
Subject:
Shearon Harris Nuclear Power Plant, Units No. 1 and 2, Docket Numbers 50-400 and 50-401; NRC Generic Letter 82-33,
"Requirements for Emergency Response Capability", December 2, 1983, with attachment:
Safety Analysis of the Shearon Harris Safety Parameter Display System, September 1983.
5.
" Verification and Validation for Safety Parameter Display Systems",
Report NSAC-39, Nuclear Safety Analysis Center, Palo Alto, CA, December 1981.
Appendix A
INDEPENDENT ASSESSMENT OF DISPLAYS IN A PROTOTYPE SPDS
During the design verification audit, the NRC audit team carried out a mini-assessment for human factors on a prototype SPDS. The prototype system was set up in the computer room near the control room.
It used simulated data for testing and had local and remote display terminals. This system was a
prototype and not a final product.
The keyboard is a QWERTY typewriter-style keyboard plus special function keys. One block of dedicated special function keys allows the top SPDS display or any of the secondary displays, one for each CSF, to be called up by a single keystroke. This access is good for speed of response and ease of use. The programmable special function keys are grouped in blocks of six, each block distinguishable by location or color. This is a good human factors point. One group of six keys has meanings which depend on the display; a menu appears near the bottom of the display. This is also a good feature for ease and speed of operation. Some groups of keys have programmable functions which are independent of the current display. These can be assigned to frequently used functions.
In some displays, the menu choices include a callup of another display. This is a fast way to switch among related displays.
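The two-tier key scheme described above (fixed dedicated keys plus display-dependent menu keys) can be sketched as a simple dispatch table. This is an illustrative sketch only, not the actual SPDS software; the key names and display names are hypothetical.

```python
# Hypothetical dispatch sketch for the SPDS keyboard scheme: dedicated
# keys always call up the same display, while one block of six keys is
# remapped per display via an on-screen menu.
DEDICATED_KEYS = {
    "F1": "TOP_LEVEL",        # top SPDS display, one keystroke
    "F2": "SUBCRITICALITY",   # one secondary display per CSF (names illustrative)
    "F3": "CORE_COOLING",
}

def resolve_key(key, current_display, menu_maps):
    """Return the display a keystroke calls up, or None if unassigned.

    menu_maps maps each display name to that display's menu-key
    assignments, modeling keys whose meaning depends on the display.
    """
    if key in DEDICATED_KEYS:
        return DEDICATED_KEYS[key]
    return menu_maps.get(current_display, {}).get(key)
```

Because dedicated keys resolve first, a single keystroke reaches the top-level and CSF displays from anywhere, while menu keys give fast switching among related displays, as the text notes.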
The generally available way to call up a third level display is versatile but rather slow. The steps are:
1.
Press key for "third level".
The system generates a display which lists the available third level displays by name, and asks the user to type in a name.
2.
Type in a name.
The system displays a template for the selected display, i.e., a table showing variable names, limits, colors, etc. The user can change entries if desired.
3.
Press key for the graphical display corresponding to this template.
The system displays the graph, then continues to update the dynamic elements of the display.
The system takes up to 6 or 8 seconds to generate a display including its static elements. The system updates the dynamic elements more rapidly, usually on a 2-second cycle for readability.
The system will not start a newly requested function until it has completed the previous request. This can take up to 6 or 8 seconds.
The system has the latent capability for speedier response to a user request. We recommend consideration of the following changes:
1.
When the final set of displays is developed using operations and human factors inputs, decide which displays are related and are likely to be
requested next.
Include these in the menu of special function keys.
2.
Go to the graphical display rather than displaying its template first.
Leave the template as a next-step option.
3.
The system should not wait until completing a command (e.g., to produce a display) before checking whether a subsequent command has been requested by the user. The system should check frequently, e.g., every two seconds or less, whether a new keystroke command has been entered.
If so, start that command; do not complete the previous display.
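The polling behavior recommended in item 3 can be illustrated with a minimal render-loop sketch. This is hypothetical code under the assumption that display generation can be broken into small chunks; it is not the reviewed system's software.

```python
import queue

def render_display(chunks, keystrokes):
    """Render a display in small chunks, polling for a new keystroke
    command between chunks so a new request can pre-empt the current
    one instead of waiting the full 6-8 seconds.

    Returns (chunks_drawn, interrupting_command_or_None).
    """
    drawn = []
    for chunk in chunks:
        # Poll the keystroke queue before each chunk; per-chunk cost is
        # well under the two-second bound recommended above.
        try:
            new_cmd = keystrokes.get_nowait()
        except queue.Empty:
            new_cmd = None
        if new_cmd is not None:
            # Abandon the partially drawn display; start the new command.
            return drawn, new_cmd
        drawn.append(chunk)  # stand-in for actually painting this chunk
    return drawn, None
```

With no pending keystroke the full display is drawn; with one pending, the previous display is abandoned immediately rather than completed first, which is exactly the change the recommendation asks for.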
The displays' visual designs have many good points for human factors. They also have some human engineering deficiencies which may become problems if carried into the final designs.
Good points we noted include:
1.
The parameter names, units and present values are displayed in characters as well as in the graphical form.
2.
On trend graphs, a "trend line" can be set at any time point of the time axis, e.g., 3.5 minutes before present. The data values at that time are displayed in numerical form.
3.
For multiple parameters in a graph, the graphical and text information for a parameter are color-matched.
4.
The current path through a status tree is shown as a thicker line as well as by color. Thus the current status is visually indicated not only by color and endpoint, but also by a shape coding, i.e., the particular path's shape.
5.
When some data is invalid, the current path (thicker line, in white) extends only part way through the tree. This is an especially visible shape coding.
Some potential human engineering deficiencies are the following:
1.
Red and magenta text are difficult to read on a dark background. Such colors are used for status in Level 1 and 2 displays and for text-graph
matching in Level 3 displays.
2.
Some numbers display too many digits, e.g., a pressure of 1906.89 psig.
3.
On the status trees, the same variable names are sometimes spelled out and sometimes abbreviated. Some abbreviations seem to be on a space-available basis rather than standard abbreviations.
4.
On the trend graphs, current time is at the right and past time is to the left (so far so good; increasing time goes to the right, which is good for estimating whether a trend slope is rising or falling). The time axis values for past time should have negative numbers, e.g., -1 minute.
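Deficiencies 2 and 4 above are both small formatting rules, and can be illustrated with a short sketch. The function names and the three-significant-figure choice are illustrative assumptions, not part of the reviewed system.

```python
import math

def format_value(value, sig_figs=3):
    """Round a displayed value to a few significant figures, avoiding
    excess digits (e.g., 1906.89 -> 1910 at three significant figures)."""
    if value == 0:
        return "0"
    # Number of decimal places needed for the requested significant figures.
    digits = sig_figs - int(math.floor(math.log10(abs(value)))) - 1
    rounded = round(value, digits)
    return str(int(rounded)) if digits <= 0 else str(rounded)

def time_axis_labels(span_minutes, step=1):
    """Label past time on a trend graph with negative numbers, with the
    present at the right-hand end of the axis."""
    return [f"{-t} min" if t else "now" for t in range(span_minutes, -1, -step)]
```

Rounding the displayed value in software, rather than printing the full sensor resolution, keeps operator reading load down without changing the underlying data; the negative axis labels make past time unambiguous at a glance.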