ML20199C854
| document type = TECHNICAL SPECIFICATIONS & TEST REPORTS, TEST REPORT
| page count = 46
| project = TAC:51232
| stage = Other
}}




Final Verification & Validation Rept for Nebraska Public Power District Plant Mgt Info Sys
Person / Time
Site: Cooper
Issue date: 05/30/1986
From: Lexa A, Paul B, Thomas N
SCIENCE APPLICATIONS INTERNATIONAL CORP. (formerly Science Applications, Incorporated)
To:
Shared Package: ML20199C828
References
SAIC-86-1097&264&0, TAC-51232, NUDOCS 8606180274
Download: ML20199C854 (46)


Text

FINAL VERIFICATION AND VALIDATION REPORT
FOR NEBRASKA PUBLIC POWER DISTRICT
PLANT MANAGEMENT INFORMATION SYSTEM

Rev. 0

A. F. Lexa
N. C. Thomas
B. D. Paul
J. H. McCleskey

SAIC-86/1097&264&0

May 30, 1986

Science Applications International Corporation
Post Office Box 4406, Lynchburg, Virginia 24502

RECORD OF REVISIONS

[Table columns: Revision Number | Revision | Date of Issue of Report]

PAGE COUNT SUMMARY

1.0 VERIFICATION AND VALIDATION PROGRAM OVERVIEW: Pages 1-1 through 1-10
2.0 SYSTEM REQUIREMENTS VERIFICATION: Pages 2-1 through 2-4
3.0 DESIGN VERIFICATION: Pages 3-1 through 3-3
4.0 VALIDATION TESTING: Pages 4-1 through 4-5
5.0 FIELD VERIFICATION TESTING: Pages 5-1 through 5-2
6.0 SUMMARY AND CONCLUSIONS: Pages 6-1 through 6-4

TABLE OF CONTENTS

1.0 VERIFICATION AND VALIDATION PROGRAM OVERVIEW 1-1
1.1 Introduction 1-1
1.2 Overview of Verification and Validation Activities 1-1
1.3 Summary of Results 1-3
1.4 Summary of V&V Plan and Procedures 1-4
1.5 Discrepancy Report Generation/Resolution 1-5
1.6 Verification and Validation Report References 1-6
1.7 Developer References 1-7
1.8 List of Abbreviations 1-8
2.0 SYSTEM REQUIREMENTS VERIFICATION 2-1
2.1 Objective 2-1
2.2 Summary of Activities 2-1
2.3 Summary of Results 2-3
2.4 Deficiencies Identified/Resolved 2-3
3.0 DESIGN VERIFICATION 3-1
3.1 Objective 3-1
3.2 Summary of Activities 3-1
3.3 Summary of Results 3-3
3.4 Deficiencies Identified/Resolved 3-3
4.0 VALIDATION TESTING 4-1
4.1 Objective 4-1
4.2 Summary of Test Plan 4-1
4.3 Summary of Test Execution 4-1
4.4 Summary of Test Results 4-4
4.5 Deficiencies Identified/Resolved 4-5
5.0 FIELD VERIFICATION TESTING 5-1
5.1 Objectives 5-1
5.2 Summary of Results 5-1
5.3 Deficiencies Identified/Resolved 5-2
6.0 SUMMARY AND CONCLUSIONS 6-1
6.1 Summary of Discrepancy Reports 6-1
6.2 Conclusions/Recommendations 6-1
APPENDIX A - Conventions Used in V&V Interactive Description (VVID) Diagrams A-1

LIST OF FIGURES

Figure 1-1 Overview of V&V Activity Performance 1-2
Figure 2-1 System Requirements Verification 2-2
Figure 3-1 Design Verification 3-2
Figure 4-1 System Validation 4-2
Figure 4-2 Validation Testing 4-3
Figure A-1 Conventions Used in V&V Activity Diagrams A-2

LIST OF TABLES

Table 6-1 NPPD PMIS Discrepancy Report Summary 6-2

1.0 VERIFICATION AND VALIDATION PROGRAM OVERVIEW

1.1 Introduction

This document is the Final Verification and Validation Report for the Nebraska Public Power District (NPPD) Plant Management Information System (PMIS) installed at the Cooper Nuclear Station.

The purpose of this document is to summarize the verification and validation (V&V) activities performed throughout the project and documented in the individual V&V task reports. Specifically, this report will:

- Provide an overview description of the V&V activities.

- Summarize each of the V&V tasks.

- Provide an overall assessment of the PMIS based on V&V results.

1.2 Overview of Verification and Validation Activities

The V&V Program conducted for the NPPD PMIS closely conforms to the concepts defined in NSAC-39, Verification and Validation for Safety Parameter Display Systems, dated December 1981. The basic V&V tasks performed were:

- V&V Plan and Procedures preparation

- System Requirements Verification

- System Validation
  - Validation Test Plan preparation
  - Validation Testing and Evaluation

- Field Verification

The interaction and sequencing of these activities are shown in Figure 1-1, which uses a format called V&V Interactive Description (VVID) diagrams. The VVID diagrams provided in this report are taken from the V&V Plan. Appendix A provides guidance on the interpretation of VVID diagrams. Each of the above V&V tasks is summarized in this report.

[Figure 1-1: Overview of V&V Activity Performance. VVID diagram showing the sequential V&V tasks, each governed by the V&V Plan and Procedures: perform System Requirements Verification (inputs: system requirements, standards and regulations; output: System Requirements Verification Report); perform Design Verification (input: design documentation; output: Design Verification Report); perform System Validation (input: developer's test documentation; output: Validation Test Report); perform Field Installation Verification (inputs: installation documentation, PMIS; output: Field Installation Verification Report); and develop the V&V Final Report.]

The overall objective of the NPPD PMIS V&V Program was to assist NPPD and the developers in providing a quality system through independent technical review and evaluation. V&V Team independence was maintained by using a division of SAIC (SAIC-Lynchburg) which was organizationally separate from the developers (SAIC-Huntsville). In addition, an independent V&V subcontractor (Pied Piper Engineering) was a member of the V&V Team.

The scope of the V&V program includes the PMIS system with emphasis on the Safety Parameter Display System (SPDS).

1.3 Summary of Results

The V&V process generated a number of discrepancy reports which addressed a wide range of deficiencies. Through the actions of NPPD and the developers, such as testing and documentation, most of the discrepancies were resolved. This section provides a brief summary of the currently unresolved discrepancy reports. Section 1.5 describes the discrepancy report documentation methodology, and Section 6.0 provides a listing of all discrepancy reports.

The current unresolved discrepancy reports for the NPPD PMIS are:

- System Requirements Verification

  The V&V Team considers that the system requirements document did not require a health monitor for the SPDS display.

- Software Design Verification

  The design documentation was considered insufficient to demonstrate compliance with requirements for system response time and memory capacity.

  The human factors work was considered to be insufficiently documented.

  The documentation was considered insufficient to describe the design and design basis for the SPDS display linkage.

- Validation/Field Installation Verification

  The V&V Team considers that additional assurance should be provided that the system will perform adequately under emergency conditions.

1.4 Summary of V&V Plan and Procedures

The V&V Plan for the NPPD PMIS is documented in Reference 1.6.1 and closely follows the methodology of NSAC-39. The V&V Procedures document, Reference 1.6.2, provides added detail on the implementation of the V&V Plan. Some features of the NPPD PMIS V&V Plan are highlighted below:

- Full utilization of the developers' QA program and configuration management system.

- Use of a two-step review process: informal review followed by formal review.

- More rigorous emphasis placed on SPDS and emergency response system capabilities; less rigorous emphasis placed on plant computer system capabilities.

- Full advantage taken of developer system testing.

In general, the V&V Plan and Procedures were closely adhered to during execution of the V&V Program. Deviations are documented and justified in the following sections for each V&V activity.

1.5 Discrepancy Report Generation/Resolution

Discrepancy Reports were generated as part of the V&V process whenever a discrepancy was identified. At the conclusion of each activity, Discrepancy Reports and Reviewer Comments were documented in the individual V&V reports, References 1.6.3 through 1.6.9. These reports document the V&V review process. The V&V process used two vehicles to document V&V reviewer comments, questions or concerns: Reviewer Comments and Discrepancy Reports.

Reviewer Comments were used to raise questions resulting from the initial V&V review or to document minor problems. Following the developer responses and corrective actions, most of the Reviewer Comments were resolved (closed). The Reviewer Comments which remained fall into two categories:

- Reviewer Comments of minor importance that were provided as suggestions to the developers or NPPD. Due to the minor nature of these issues (minor documentation problems, V&V Team suggestions, etc.), final disposition was left to the developers and NPPD with no further tracking by the V&V Team.

- Reviewer Comments of major importance that were not resolved by developer responses. These issues were elevated to the level of Discrepancy Reports and were tracked by the V&V Team.

Discrepancy Reports, as stated above, were used to document important issues. Most Discrepancy Reports started as Reviewer Comments and were later raised to Discrepancy Reports; others were written without a Reviewer Comment origin. The decision of whether a problem or concern was an "important issue" was made solely by the V&V Team and was somewhat subjective in nature. However, the following general guidelines were used: a "no" answer to any of the following questions, indicating a significant problem, qualified the issue as one which should be documented by a Discrepancy Report (see the sketch after this list):

- Does the system meet NRC requirements, i.e., NUREG-0737, Supplement 1?

- Does the system meet the NPPD requirements (SOW), especially in the area of SPDS and related functions?

- Is the system design documentation, especially for the SPDS, clear, consistent and sufficiently detailed to define how the system works?

- Does the system testing, especially for the SPDS, demonstrate compliance with system requirements and design?

- Was the system sufficiently tested following installation to ensure correct operation?
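As a minimal illustration of this screening rule, the logic reduces to: any "no" answer elevates the issue to a Discrepancy Report. The sketch below is hypothetical (the question texts and function name are invented for illustration; the report does not describe any software used by the V&V Team):

```python
# Hypothetical sketch of the Discrepancy Report screening guideline:
# a "no" answer (False) to any guideline question qualifies the issue.

GUIDELINE_QUESTIONS = [
    "Meets NRC requirements (NUREG-0737, Supplement 1)?",
    "Meets NPPD requirements (SOW), especially SPDS functions?",
    "Design documentation clear, consistent, sufficiently detailed?",
    "Testing demonstrates compliance with requirements and design?",
    "Sufficiently tested following installation?",
]

def requires_discrepancy_report(answers: dict[str, bool]) -> bool:
    """Return True if any guideline question is answered 'no'."""
    return any(not answers.get(q, True) for q in GUIDELINE_QUESTIONS)

# Example: an issue failing the documentation question becomes a DR.
answers = {q: True for q in GUIDELINE_QUESTIONS}
answers["Design documentation clear, consistent, sufficiently detailed?"] = False
assert requires_discrepancy_report(answers)
```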

Each Discrepancy Report (DR) is designed to document the original problem identified by the V&V Team along with all subsequent activities to resolve the issue. The dialog between the V&V Team and the developers/NPPD is shown or referenced. As the project progressed, the DR format was changed, but the overall technique was applied to all DRs.

Following the issue of the individual task V&V reports, a number of DRs remained open. The V&V Team, the developers and NPPD continued a process to resolve the open DRs. This primarily consisted of a review of revised documentation and site testing. This activity resulted in the resolution of many DRs. Section 6 of this report presents a summary of the PMIS Discrepancy Reports.

1.6 Verification and Validation Report References

The following is a listing of all V&V reports for the PMIS V&V activities:

1.6.1 Verification and Validation Plan, Revision 0, SAIC document 503-8500000-51, SAI-84/1525-264, dated March 1, 1984.

1.6.2 Verification and Validation Procedures for Nebraska Public Power District Cooper Nuclear Station Plant Management Information System, Revision 1, SAIC document SAIC-85/1024-264-1, dated July 11, 1984.

1.6.3 System Requirements Verification Report for Nebraska Public Power District, SAIC document SAIC-84/1739&264, dated November 6, 1984.

1.6.4 Software Design Verification Report for Nebraska Public Power District Plant Management Information System, SAIC-85/1509&264, dated January 18, 1985.

1.6.5 Hardware Design Verification Report for Nebraska Public Power District Plant Management Information System, Revision 3, SAIC document SAIC-85/1717&264&03, dated December 31, 1985.

1.6.6 Validation Test Plan for Nebraska Public Power District, SAIC document SAIC-85/1567&264, dated March 15, 1985.

1.6.7 Validation Test Report for Nebraska Public Power District Plant Management Information System, Revision 1, SAIC document SAIC-85/1692&264, dated August 7, 1985.

1.6.8 Validation/Field Installation Verification Test Report for Nebraska Public Power District Plant Management Information System, Revision 0, SAIC document SAIC-86/1500&264&0, dated March 12, 1986.

1.6.9 Final Discrepancy Reports for Nebraska Public Power District Plant Management Information Systems, Revision 0, SAIC document SAIC-86/1687&264&0, dated May 5, 1986.

1.7 Developer References

The following is a listing of the primary PMIS system requirements, design and test documentation produced by the developers and evaluated by the V&V Team. The revision levels listed were the revisions available at the conclusion of the V&V effort. It should be noted that this documentation list is only a subset of the total PMIS development documentation package. The individual V&V reports should be referred to for more complete documentation references.

1.7.1 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Statement of Work, SAIC document 1-323-05-766-00, Revision 1, dated January 16, 1985.

1.7.2 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Functional Specification, SAIC document 501-8500109-2b, Revision B, Change Number 3, dated December 30, 1985.

1.7.3 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Detailed Design, SAIC document 502-8500110-01, Volume I - Software, Revision A, Part 1 of 2, Change Number 2, dated December 31, 1985.

1.7.4 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Detailed Design, SAIC document 502-8500110-01, Volume I - Software, Revision A, Part 2 of 2, Change Number 2, dated December 31, 1985.

1.7.5 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Detailed Design, Volume II - Interface, SAIC document 502-8500110-02, Revision B, Change Number 2, dated December 16, 1985.

1.7.6 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Safety Parameter Display System Detailed Description, SAIC document 503-8500000-78, Revision 3, dated February 3, 1986.

1.7.7 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Test Plan, SAIC document 501-8500102-01, dated October 26, 1984.

1.7.8 Nebraska Public Power District Plant Management Information System Cooper Nuclear Station Test Procedures, SAIC document 501-8500102-02, Revision A Final, dated March 25, 1985.

1.8 List of Abbreviations

A/D    Analog to Digital
CCB    Configuration Control Board
CDR    Critical Design Review
CNS    Cooper Nuclear Station
CPI    Computer Products, Inc. (Data Acquisition System)

CPU    Central Processing Unit
CRDR   Control Room Design Review
CRT    Cathode Ray Tube
CVT    Current Value Table
DAS    Data Acquisition System
DR     Discrepancy Report
EOF    Emergency Offsite Facility
ERF    Emergency Response Facilities
ERFIS  Emergency Response Facilities Information System
FAT    Factory Acceptance Test
FMEA   Failure Modes and Effects Analysis
F-SPEC Functional Specification
HPR    Hardware Problem Report
HW     Hardware
IRCU   Intelligent Remote Control Unit
NPPD   Nebraska Public Power District
NRC    Nuclear Regulatory Commission
NSAC   Nuclear Safety Analysis Center
OR     Originating Requirement
PDR    Preliminary Design Review
PMIS   Plant Management Information System
PPE    Pied Piper Engineering
QA     Quality Assurance
RC     Reviewer Comment
RPIS   Rod Position Information System
RWM    Rod Worth Minimizer
SAI    Science Applications, Incorporated
SAIC   Science Applications International Corporation (previous name SAI)
SAIPMS Science Applications, Incorporated Plant Monitoring System
SAT    Site Acceptance Test
SBDC   Site Data Base Change (Request)
SDCR   Site Data Change Request
SER    Software Error Report
SHPR   Site Hardware Problem Report
SOE    Sequence of Events
SOW    Statement of Work (System Requirements Document)

SPDS   Safety Parameter Display System
SR     System Requirement
SSCR   Site Software Change Request
SSER   Site Software Error Report
SSMR   Site Software Move Request
STVR   Site Test Variance Report
SW     Software
TC     Thermocouple
TIP    Traversing Incore Probe
TSC    Technical Support Center
TVR    Test Variance Report
V&V    Verification and Validation
VVID   Verification and Validation Interactive Description (V&V task information flow diagram)

2.0 SYSTEM REQUIREMENTS VERIFICATION

2.1 Objective

The primary objective of the PMIS System Requirements Verification was to ensure that the PMIS met the requirements and guidelines as defined by the NRC. An additional objective was to ensure that the PMIS system requirements were concisely defined to allow accurate implementation of the system by the developers.

2.2 Summary of Activities

The process and methodology used for the PMIS System Requirements Verification are described in Figure 2-1. The System Requirements Verification was accomplished by formulation of the Originating Requirements List and the System Requirements List and their cross-referencing. The Originating Requirements List was derived from various NRC documents, such as NUREG-0737, Supplement 1, NUREG-0696 and other associated NRC documents. The purpose of the Originating Requirements List was to define in tabular form all computer-system-related NRC requirements for the Emergency Response Computer System.

The developers' system requirements document, entitled Statement of Work (SOW), Reference 1.7.1, was reviewed by the V&V Team and organized into a tabular System Requirements List. The System Requirements List represented all system functions as defined by the developers. The Originating Requirements List and System Requirements List were cross-referenced and reviewed using the following criteria (a minimal sketch of this check follows the list below):

- Does each originating requirement have a corresponding system requirement within the developers' scope of supply?

- Are the system requirements clearly defined, and can they be implemented?
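This cross-referencing step reduces to a simple set-difference check. The sketch below is a hypothetical reconstruction (the IDs, texts and structure are invented; the report does not state that the V&V Team automated this work): each Originating Requirement carries the System Requirements said to satisfy it, and any OR with no corresponding SR is flagged.

```python
# Hypothetical sketch: cross-referencing the Originating Requirements List
# (NRC-derived) against the System Requirements List (SOW-derived).
# IDs and contents are invented for illustration.

originating_requirements = {
    "OR-1": {"text": "Provide an SPDS display of safety parameters",
             "satisfied_by": ["SR-12"]},
    "OR-2": {"text": "Provide data validation for SPDS parameters",
             "satisfied_by": []},  # no matching system requirement
}
system_requirements = {"SR-12": "SPDS top-level display"}

# Flag any originating requirement with no corresponding system
# requirement within the developers' scope of supply.
for or_id, req in originating_requirements.items():
    matches = [sr for sr in req["satisfied_by"] if sr in system_requirements]
    if not matches:
        print(f"FLAG {or_id}: no corresponding system requirement - {req['text']}")
```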

[Figure 2-1: System Requirements Verification. VVID diagram: perform informal review of system requirements (inputs: system requirements, V&V Plan and Procedures; output: informal review comments to developers); review and approve requirements (NPPD project team; output: NPPD approval to commence design); perform formal review (inputs: standards and regulations; outputs: System Requirements Verification Report, System Requirements Discrepancy Reports); resolve system requirements discrepancies (development team; output: system requirements discrepancy resolution). The V&V Team includes an NPPD participant.]
The final activity for the PMIS System Requirements Verification was to document, by means of Reviewer Comments and Discrepancy Reports, those areas where the V&V Team felt there were inconsistencies. Through various meetings, correspondence and documentation changes, a number of these discrepancies were resolved. The complete description of the PMIS System Requirements Verification is documented in the System Requirements Verification Report, Reference 1.6.3.

2.3 Summary of Results

After the first pass of the PMIS System Requirements Verification, the V&V Team issued fifteen Reviewer Comments. Through meetings and correspondence with NPPD and the developers, ten of these issues were resolved, with the five remaining items issued as Discrepancy Reports in the System Requirements Verification Report. Subsequent to the issue of the System Requirements Verification Report, documentation changes and testing resolved four of the five items identified in the Discrepancy Reports. Section 6.0 contains a description and final status of all the System Requirements Verification Discrepancy Reports. The V&V Team considers that all the NRC requirements as defined in NUREG-0737, Supplement 1 were adequately covered in the system requirements documentation. The topics in the five Discrepancy Reports were based on NRC guidance documentation.

2.4 Deficiencies Identified/Resolved

As stated above, Section 6.0 presents a summary of all Discrepancy Reports generated as part of System Requirements Verification.

With respect to conformance to the V&V Plan, the PMIS System Requirements Verification process closely followed the methodology outlined in the V&V Plan except in the following areas:

- The V&V Team felt that the Statement of Work document, which was used as the System Requirements Document, was somewhat difficult to use. It was a contractual procurement document, not a concise listing of system functional requirements. This required the V&V Team to significantly "filter" the document to establish the system requirements.

- Ideally, the Statement of Work should have been revised and the V&V discrepancies resolved before commencement of system design. However, due to schedule constraints, the system design commenced before completion of the System Requirements Verification. This did not have a major impact on the system except for those items described in the five Discrepancy Reports.

3.0 DESIGN VERIFICATION

3.1 Objective

The objective of the Design Verification was to ensure that the design is consistent with the system requirements. An additional objective was to evaluate the design documentation for clarity, correctness and consistency.

3.2 Summary of Activities

Design Verification was performed for both the software and hardware designs. These verification efforts were performed separately and are documented in separate reports: Reference 1.6.4 for Software Design Verification and Reference 1.6.5 for Hardware Design Verification. Figure 3-1 shows the Design Verification process.

The primary tool used for Design Verification was the Capabilities Matrix. Each system capability, as derived from the design documentation, was given a unique identification number and listed in matrix form. Then each System Requirement taken from the System Requirements List, generated during System Requirements Verification, was matched with a system capability. Where there was no system capability for a system requirement, a problem was flagged.
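Continuing the hypothetical sketch from Section 2.2 (again with invented IDs; the report does not state that the matrix was automated), the Capabilities Matrix extends the traceability chain one level: every system requirement must map to at least one uniquely numbered design capability. Section 4.3 later extends the same chain to tests, which the sketch anticipates.

```python
# Hypothetical sketch of the Capabilities Matrix check (IDs invented).
# Every system requirement must be matched by a design capability, and
# (per Section 4.3) every used capability by at least one test.

system_requirements = ["SR-12", "SR-13", "SR-14"]

# Capability ID -> system requirements it implements.
capabilities_matrix = {
    "CAP-001": ["SR-12"],           # e.g., SPDS top-level display
    "CAP-002": ["SR-12", "SR-13"],  # e.g., SPDS display linkage
}

# Capability ID -> tests exercising it (the Section 4.3 linkage).
capability_to_tests = {"CAP-001": ["FAT-07"]}

for sr in system_requirements:
    caps = [c for c, srs in capabilities_matrix.items() if sr in srs]
    tests = [t for c in caps for t in capability_to_tests.get(c, [])]
    if not caps:
        print(f"FLAG {sr}: no corresponding system capability")  # design gap
    elif not tests:
        print(f"FLAG {sr}: capabilities {caps} not exercised by any test")
```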

Other checks were performed to ensure consistency and correctness of the design documentation.

The Software Design Verification concentrated on those areas related to the SPDS and emergency response facilities. The primary software design documentation is contained in References 1.7.2 through 1.7.6. The Hardware Design Verification was limited to a system-level review of the PMIS. Since the components which made up the PMIS were for the most part established product lines, proven in application, no detailed Hardware Design Verification was considered necessary. For this reason, the hardware verification activities were confined to an evaluation of the integration of the individual components into a total working PMIS.

[Figure 3-1: Design Verification. VVID diagram: prepare for Design Verification and perform informal review (inputs: design documentation, V&V Plan and Procedures; output: informal review comments to developers); approve design documentation (NPPD; output: NPPD approval to commence coding); perform formal Design Verification (inputs: design documentation, System Requirements Verification Report; outputs: Design Verification Report, Design Discrepancy Reports); resolve design discrepancies (development team). The V&V Team includes an NPPD participant; design documentation includes human factors engineering results.]

3.3 Summary of Results

The first review of the software design documentation by the V&V Team resulted in thirty-seven Reviewer Comments. Through discussions, meetings and documentation changes, a number of open areas were resolved. At the completion of the Software Design Verification, twenty Discrepancy Reports were issued. The majority of these discrepancies were due to the fact that the design documentation was not in final form. Subsequent to the issue of the Software Design Verification Report, the final design documentation was issued. This resolved all but three of the twenty Discrepancy Reports. Section 6.0 presents a discussion and final status of all Software Design Verification Discrepancy Reports.

The Hardware Design Verification resulted in no discrepancies. Five relatively minor comments were documented as Reviewer Comments.

The V&V Team considers that the design documentation, in its final form, is adequate to document the system design and compliance with the system functional requirements.

3.4 Deficiencies Identified/Resolved

As stated above, Section 6.0 presents a summary of all Discrepancy Reports generated as part of Software Design Verification.

The Software Design Verification was accomplished according to the V&V Plan except that it was performed on preliminary design documentation. This decision was made to permit the Development and V&V Teams to remain on schedule. It resulted in the formulation of a number of discrepancies that were later resolved when the final documentation became available. Ideally, the Design Verification should be performed on final documentation in time to allow feedback to the developers so that changes can be implemented before detailed coding commences.

4.0 VALIDATION TESTING

4.1 Objective

The objective of Validation was to provide assurance that the final system complies with the system requirements through a planned test and evaluation process.

4.2 Summary of Test Plan

Figure 4-1 describes the overall System Validation activities. Figure 4-2 provides expanded detail on validation testing.

The first block of Figure 4-1 depicts the preparation of the Validation Test Plan. This activity consisted of a review of the developers' test plan, Reference 1.7.7, and draft test procedures, Reference 1.7.8, to understand the scope and detail of the developers' tests. With this understanding, the V&V Team produced the Validation Test Plan, Reference 1.6.6. The Validation Test Plan presented a methodology and validation test approach with the following features:

- Test emphasis was placed on the SPDS and other emergency response system capabilities.

- The developer testing was fully utilized. The V&V Team considered that a valid, well-designed system test performed by the developer does not need to be repeated by V&V. The V&V Team recommended additional testing only when it was felt that the developer's tests were incomplete or deficient.

4.3 Summary of Test Execution

The majority of the system validation testing was performed at the developers' facility in Huntsville, Alabama during February 1985. The testing was labeled Factory Acceptance Testing (FAT). During this testing the V&V Team witnessed most of the developers' tests, which were under the control of the developers' QA staff.

[Figure 4-1: System Validation. VVID diagram: prepare the Validation Test Plan from the developer's test documentation and the V&V Plan and Procedures; perform system validation testing on the PMIS; prepare the Validation Test Report; resolve validation discrepancies with the development team.]

[Figure 4-2: Validation Testing. VVID diagram providing expanded detail on the validation testing activity of Figure 4-1.]

The V&V Team witnessed the developers' tests and maintained a V&V Test Log. The following checklist was used during witnessing of the developers' tests. Deviations were documented in the test log and/or as a Reviewer Comment or Discrepancy Report.

- Were test procedures being followed?

- Were test anomalies being documented and resolved through the developer's standard procedures?

- Were the PMIS software, test software and files properly controlled by means of a configuration management system?

Following completion of the FAT, the system was disassembled and shipped to Cooper Nuclear Station. However, the validation testing analysis was not completed at this time. After completion of the FAT, the V&V Team obtained from the developers a set of all test documentation: test procedures, test results and test scenario files. This data was reviewed against the Capability Matrix developed in earlier V&V tasks. This process linked together a system requirement, a design capability and a test; where the linkage was incomplete, a problem was flagged.

4.4 Summary of Test Results

Following the FAT and the V&V analysis of test data, the Validation Test Report, Reference 1.6.7, was issued. This report documented the details of the validation tests associated with the FAT. The report contained:

- 19 recommended additional validation tests

- 21 unresolved Reviewer Comments

- 4 unresolved and 1 resolved Discrepancy Reports.

Ideally, the recommended additional validation tests should have been performed at the factory, and all open Reviewer Comments and Discrepancy Reports should have been resolved before the system was shipped. However, due to a narrow time window for system installation, the system was shipped before the above V&V matters were addressed. This resulted in a continuation of validation tests during Site Acceptance Testing (SAT).

4.5 Deficiencies Identified/Resolved

Section 6.0 presents a summary of all of the Discrepancy Reports issued as part of System Validation.

The actual performance of validation testing deviated somewhat from that described in the V&V Plan and the Validation Test Plan. The major areas of deviation are:

- The system was shipped before additional validation testing could be performed and various V&V issues could be resolved.

- The lack of final design documentation, as noted in Design Verification, prevented the V&V Team from establishing design details for some system capabilities. Because of this, the adequacy of some tests could not be determined. Also, the capability matrix could only be partially completed.

- The final versions of most test procedures were issued the day of the test. This prevented V&V from performing a review of the procedure before the test was conducted.

- The V&V Team observed that the test scenario files, which provided the input data values to drive the tests, were not under formal configuration management.

In spite of the above deviations from the plan, the V&V Team considers that the testing performed during the FAT was valid. These tests demonstrated that the PMIS met the system requirements except in the areas noted by the developers and by the V&V Team. The resolution of these open areas was addressed through additional testing at CNS.

The testing of the SPDS was especially thorough and was directly traceable back to the design documents.

5.0 FIELD VERIFICATION TESTING

5.1 Objectives

Block 4 of Figure 1-1 shows the Field Verification activity. The overall objective of Field Verification was to ensure that the validated system is correctly installed in the plant. Only those tests needed to confirm correct installation were considered necessary. Because of this limited scope, no specific Test Plan was prepared. The V&V approach was to monitor a portion of the Site Acceptance Tests (SAT), with emphasis placed on SPDS testing. The Site Acceptance Tests were a subset of the Factory Acceptance Tests. V&V was to witness these tests and conduct additional tests only where felt to be necessary.

5.2 Summary of Results

As presented in Section 4.0, there were a number of open V&V concerns at the conclusion of the Factory Acceptance Testing. In order to resolve these areas, the Validation Testing which was conducted during the FAT had to be continued during the Site Testing. The V&V activities during Site Acceptance Testing consisted of the following:

- Witnessing of developer Site Acceptance Tests

- Execution of V&V tests

- Documentation of test results in the V&V Test Log and/or Reviewer Comments and Discrepancy Reports.

As a result of the continued Validation Testing at the site, the majority of the Reviewer Comments and Discrepancy Reports were resolved. Also, it was determined that NPPD personnel had performed a very thorough wiring check during the installation of the Data Acquisition System. In addition, NPPD planned an SPDS control board comparison study to ensure correct installation through the SPDS displays. The V&V Team considered that these two actions were sufficient for Field Installation Verification. The V&V activities associated with Site Validation and Field Installation Verification are documented in the Validation/Field Installation Verification Test Report, Reference 1.6.8.

5.3 Deficiencies Identified/Resolved

Section 6.0 lists all of the discrepancies generated from the Validation System Testing. Field Installation Verification generated no discrepancies.

At the time of issue of the Validation/Field Installation Verification Test Report, two discrepancies remained open: one pertaining to the adequacy of the testing of the data acquisition system, and the other pertaining to a system load test. Subsequent to the issue of the above report, additional test information and NPPD informal test results allowed the resolution of the Data Acquisition System Discrepancy Report.

Although the ideal situation would have been to resolve all Validation Testing concerns at the factory, schedule constraints required validation to be conducted both at the factory and at the site. V&V considers that, except for the area covered by the System Load Test Discrepancy Report, the PMIS received adequate testing from the Factory and Site Acceptance Tests.

6.0 SUMMARY AND CONCLUSIONS

6.1 Summary of Discrepancy Reports

Table 6-1 summarizes all the Discrepancy Reports generated during the PMIS V&V process. Note there were no discrepancies for Hardware Design Verification or Field Installation Verification. For each discrepancy which is resolved, the V&V Team considers that the area has been resolved to a sufficient level by a combination of test and/or documentation. For the unresolved discrepancies, the V&V Team considers that additional actions are necessary. In some cases, there are planned activities underway which should resolve the V&V concern. In one case (DR #5, System Requirements Verification), there is a technical difference of opinion between the developers and the V&V Team over the level of sophistication for the SPDS health monitor. The "Current Disposition" column in Table 6-1 summarizes the present status of the unresolved discrepancies. For a listing of all Discrepancy Reports, refer to Final Discrepancy Reports, Reference 1.6.9.

6.2 Conclusions/Recommendations

The V&V Team considers that the developers have adequately designed, documented and tested the NPPD PMIS. Although some deviations between the V&V Plan and actual performance did occur, and although some unresolved discrepancies exist, the overall V&V process was successfully applied to the PMIS development and testing. The SPDS and other related emergency response capabilities received added scrutiny and were found to be well designed and tested.

It is recommended that NPPD address each unresolved Discrepancy Report and decide on the necessary corrective actions, if any.

TABLE 6-1: NPPD PMIS DISCREPANCY REPORT SUMMARY

SYSTEM REQUIREMENTS VERIFICATION

DR No. 1 - System Availability. Brief Summary: SOW did not require system availability to include computer HVAC and AC power. Status: Resolved.

DR No. 2 - SPDS/Control Room Display Consistency. Brief Summary: SOW did not require a test program to check the consistency between the SPDS and control room displays. Status: Resolved.

DR No. 3 - ERF System Security. Brief Summary: SOW did not provide separate security requirements for the SPDS portion of the PMIS. Status: Resolved.

DR No. 4 - Data Validation. Brief Summary: SOW did not provide requirements that differentiate between the SPDS and the balance of the PMIS system. Status: Resolved.

DR No. 5 - Training and Recognition of System Health. Brief Summary: SOW did not provide a requirement for a separate simple SPDS health monitor. Status: Not Resolved. Current Disposition: NPPD and the developers agree that the currently available diagnostics are sufficient to determine SPDS operability. The V&V Team recommends an SPDS health monitor which can be easily and quickly used by the reactor operator.

SOFTWARE DESIGN VERIFICATION

DR No. 1 - Health Monitor. Brief Summary: The data acquisition diagnostic and health monitor functions are not adequately described in the design documentation. Status: Resolved.

DR No. 2 - DAS Scan Plan. Brief Summary: The scan plan presented in the Functional Specification is inconsistent with the actual scan plan. Status: Resolved.

DR No. 3 - Missing Module Documentation. Brief Summary: Software design documentation does not provide a description of a number of software modules. Status: Resolved.

DR No. 4 - Rescan Check. Brief Summary: The F-SPEC states that a rescan check is performed; however, other design documentation does not specify a rescan check. Status: Resolved.

DR No. 5 - Detailed Design Omissions. Brief Summary: There are inconsistencies between the software module design documentation and the Detailed Design Volumes 1 and 2. Status: Resolved. Current Disposition: The final documentation resolved the majority of inconsistencies. The developers and NPPD agree on the acceptability of the documentation. The V&V Team recommends additional documentation described in S/W Design Verification RC #6.

DR No. 6 - F-SPEC to Detailed Design Traceability. Brief Summary: A number of areas exist where there is no design traceability between the SOW or the F-SPEC and the detailed design documentation; these are summarized in the unmatched items on the System Capabilities List. Status: Resolved. Current Disposition: Revised documentation is adequate. Further V&V recommendations are provided in S/W Design Verification RC #7.

DR No. 7 - SOW to F-SPEC Inconsistencies. Brief Summary: There are inconsistencies between the SOW and the F-SPEC in relation to software function descriptions. Status: Resolved.

DR No. 8 - Security Levels. Brief Summary: The detailed design documentation description of system security is inadequate. Status: Resolved.

DR No. 9 - Detailed Design Volume 1 Generic Problems. Brief Summary: The detailed design documentation is inadequate in the areas of logic flow diagrams, description of inputs and outputs, listing of important algorithms, and system block diagrams. Status: Resolved. Current Disposition: Revised documentation is adequate. Further V&V recommendations are provided in S/W Design Verification RC #10.

DR No. 10 - SOE Clarification. Brief Summary: The design documentation was inadequate in the area of the time resolution of the SOE points. Status: Resolved.

DR No. 11 - System Response/CPU Capacity. Brief Summary: The design documentation did not provide a system sizing, timing, and load analysis to demonstrate compliance with requirements. Status: Not Resolved. Current Disposition: This DR relates to System Validation DR #5.

DR No. 12 - Human Factors. Brief Summary: Documentation of human factors design in the SPDS displays was not provided. Status: Not Resolved. Current Disposition: The developers are currently documenting human factors work; when completed, this DR should be closed.

DR No. 13 - SPDS in Control Room Design Review. Brief Summary: No evidence could be found that the detailed control room design review and the SPDS design were integrated. Status: Resolved.

DR No. 14 - SPDS Display Linkage. Brief Summary: Insufficient documentation was provided describing the logic behind the SPDS display linkage. Status: Not Resolved. Current Disposition: The V&V Team considers that the SPDS display linkage is important and should be reviewed for acceptability by operations personnel who are familiar with emergency operating procedures.

DR No. 15 - SPDS Data Validation. Brief Summary: The design documentation did not adequately address and relate the PMIS data validation and SPDS data validation techniques. Status: Resolved.

DR No. 16 - Rate of Change Calculation. Brief Summary: The SPDS rate calculation did not automatically account for a change in scan class. Status: Resolved.

DR No. 17 - Incomplete SPDS Software Documentation. Brief Summary: The SPDS documentation was not in final form. Status: Resolved.

DR No. 18 - Availability of Redundant SPDS Inputs on MUX. Brief Summary: No analysis was provided in the design documentation to determine the availability of redundant SPDS inputs with respect to MUX terminations. Status: Resolved.

DR No. 19 - Instrument Errors and SPDS Displays. Brief Summary: The SPDS design documentation did not account for instrument errors in the setting of various alarms and setpoints. Status: Resolved.

DR No. 20 - SPDS Compression Limits. Brief Summary: The design documentation did not provide a justification for the compression limits for the SPDS archive data file. Status: Resolved.

VALIDATION/FIELD INSTALLATION VERIFICATION

DR No. 1 - System Health Monitor/Self Diagnostics. Brief Summary: No system tests were performed to demonstrate the DAS health monitor or the ability to perform self-diagnostics. Status: Resolved.

DR No. 2 - Data Acquisition System. Brief Summary: A number of DAS system capabilities were untested, such as MUX power fail response, DAS self diagnostics, and open thermocouple detection. Status: Resolved.

DR No. 3 - Failover Test. Brief Summary: The primary stand-alone system was not challenged with a failover command. Status: Resolved.

DR No. 4 - SPDS Rate Transform. Brief Summary: SPDS did not exercise the rate transform. Status: Resolved.

DR No. 5 - System Load Test. Brief Summary: No load test was conducted on the as-installed system under realistic heavy load conditions. Status: Not Resolved. Current Disposition: V&V recommends continued system self-monitoring to evaluate the system response under plant upset and transient conditions. This information plus FAT data should provide assurance that the system will perform adequately under emergency conditions.

APPENDIX A

Conventions Used in V&V Interactive Description (VVID) Diagrams

[Figure A-1: Conventions Used in V&V Activity Diagrams. Each process box (task or subtask) is read as follows: inputs enter from the left; outputs (reports, etc.) exit to the right; constraints (external considerations which affect the process) enter from the top; and resources (who or what contributes to the task) enter from the bottom. A curved double-headed arrow denotes interaction (e.g., revision based on review comments), and a process described by another figure is marked with a reference to that figure.]

NPPD Position Regarding Verification and Validation Discrepancy Reports that are Not Resolved

DR No. 5 (System Requirements Verification)

Topic: Training and Recognition of System Health

Brief Summary: SOW did not provide a requirement for a separate simple SPDS health monitor.

Status: Not Resolved

Current Disposition: NPPD and the developers (SAIC) agree that the currently available diagnostics are sufficient to determine SPDS operability. The V&V Team recommends an SPDS health monitor which can be easily and quickly used by the reactor operator.

NPPD Position: While the NPPD PMIS Project Team understands the desire on the part of V&V to have a simple health monitor that will inform the operator as to whether the SPDS is or is not operating correctly at any time, we do not see this as a task that lends itself to being easily accomplished as V&V has suggested. V&V has suggested that each SPDS display include a box or some other visual presentation that would indicate "SPDS Healthy" or "SPDS Not Healthy." While this is possibly very desirable, the NPPD PMIS Project Team does not see how one can combine the approximately two hundred ninety (290) analog and digital field input points into one "high level simple status indication" that accurately depicts SPDS status for all possible plant operating modes. SAIC has implemented data quality and data validation techniques for the SPDS as described in Attachment "A", and we believe that is sufficient to permit evaluation of SPDS data integrity. All PMIS input points are subjected to quality checks, limit checks, and sensor good checks, and since the SPDS points are a subset of the PMIS points, this status carries through to the SPDS displays and will be reflected by the appropriate data quality colors in the SPDS high level display with the five safety function indicators. We believe this is sufficient and do not plan to do more at this time.

DR No. 11 (Software Design Verification)

Topic: System Response/CPU Capacity

Brief Summary: The design documentation did not provide a system sizing, timing, and load analysis to demonstrate compliance with requirements.

Status: Not Resolved

Current Disposition: This DR relates to System Validation DR #5.

NPPD Position: Two separate issues are involved here. Of the two, CPU capacity is not a problem, in that evaluations of the operating PMIS system have demonstrated that there is sufficient spare CPU memory even under heavily loaded conditions. Work is underway by SAIC to improve SPDS display response time by making the SPDS displays CPU memory resident. In addition, NPPD has requested proposals to make data files for other application software CPU memory resident to cut down on system load due to repetitive disk transfers of data. While we concur that the design documentation did not contain a sizing, timing, and load analysis, we believe that it is a moot point, in that evaluations of actual system load have determined that improvements are necessary, and action as described above is underway to improve system response. At this point, it is our opinion that these actions to correct known deficiencies eliminate the need for a theoretical analysis.

DR No. 12 (Software Design Verification)

Topic: Human Factors

Brief Summary: Documentation of human factors design in the SPDS displays was not provided.

Status: Not Resolved

Current Disposition: The developers are currently documenting the human factors work and, when completed, this DR should be closed.

NPPD Position: We agree with the above disposition. SAIC is completing a Human Factors checklist review of the SPDS displays. SAIC will also issue a summary of their Human Factors activities covering the SAIC Human Factors plan, its implementation in the design, and the verification that the plan and guidelines were followed for the PMIS/SPDS displays.

DR No. 14 (Software Design Verification)

Topic: SPDS Display Linkage

Brief Summary: Insufficient documentation was provided describing the logic behind the SPDS display linkage.

Status: Not Resolved

Current Disposition: The V&V Team considers that the SPDS display linkage is important and should be reviewed for acceptability by operations personnel who are familiar with Emergency Operating Procedures (EOPs).

NPPD Position: NPPD concurs that the SPDS display linkage is important and should be under continual review by the operations personnel, and in fact it is at this time under review by personnel familiar with the EOPs. In addition, the District's operations personnel familiar with the EOPs have previously reviewed the display linkage, and changes have been incorporated as a result of these reviews. It is, however, our position that SAIC was obligated only to provide a hierarchical linkage method or tool and an initial SPDS display linkage based upon the BWROG Level I, II and III displays. This they did. NPPD will continue to evaluate the SPDS display linkage and will modify it if warranted. Our main goal was to get a display linkage tool which would permit our modifying things of this nature if warranted, and not to end up with a set of linkages that could not be changed. The linkage logic is basically a tree-type linkage that permits the user to define how the branches relate to each other. The bottom line is that we feel the existing linkage is adequate for now.

DR No. 5 (Validation/Field Installation Verification)

Topic: System Load Test

Brief Summary: No load test was conducted on the as-installed system under realistic heavy load conditions.

Status: Not Resolved

Current Disposition: V&V recommends continued system self-monitoring to evaluate the system response under plant upset and transient conditions. This information plus FAT data should provide assurance that the system will perform adequately under emergency conditions.

NPPD Position: It is NPPD's position that no controlled load test can realistically be performed on the as-installed system with the data acquisition system actually connected to sensors throughout the plant. Even if it were possible to do so, one could not maneuver the plant to cause the number of plant alarms that were simulated during the factory testing. We do, however, concur with the recommendation to continue to evaluate the system response under plant upset and transient conditions. In this light, during a plant scram on February 27, 1986, the PMIS, even though still not officially operable, captured transient scram data and proved invaluable in analyzing the sequence of events that occurred. Also, work is underway per DR #11 above to improve system response.

NPPD Position Regarding Final Documentation and Referenced Reviewer Comments Nos. 6, 7, and 10

In DR Nos. 5, 6, and 9 (Software Design Verification), the V&V Team, as part of their current disposition, recommended additional documentation upgrades. These recommendations are made in the form of Reviewer Comments, which are considered by the V&V Team (and NPPD) to be relatively minor in nature and which in fact may represent differences of opinion regarding what constitutes sufficient software documentation. NPPD personnel, including CNS Computer Applications personnel, have determined that the documentation in question is adequate for the District's use. In most cases, the documentation has gone through two to five revisions to resolve concerns we have had. In addition, we will receive the documentation in question on magnetic media and thus will be able to revise it ourselves if it proves necessary.

503-8500000-78 (Rev. 4)
Attachment "A"
4/18/86

2. DATA QUALITY, DATA VALIDATION AND GENERAL DISPLAY CHARACTERISTICS DESIGN GUIDELINES

2.1 DATA QUALITY

Each time a field input point is sampled by the PMIS, a data quality code is appended to the current value. The PMIS data quality codes are listed in Table 2-1. The quality and limit checks are performed in the order listed in Table 2-1 (i.e., from 00 to 18). If all checks are satisfactory, the point is assigned a quality of GOOD; otherwise it is assigned the quality of the first check that is failed. Sensor and alarm zones that are considered when establishing data quality are illustrated in Figure 2-1. The following information needed to perform the relevant quality checks is specified in the PMIS data base definition of each data point:

- Processing control logicals
- Warning limits (high, low)
- Alarm limits (high, low)
- Engineering limits (high, low)
- Redundant point ID and tolerance (if applicable)
- Initialization data quality (if desired)
- Alarm cutout point and alarm cutout status (if applicable)

The quality code of a calculated data point can be determined by propagating the worst quality code of any of the inputs to the calculation. To the extent practical, the SPDS calculations are performed using "healthy" inputs as described below.
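To make the two mechanisms above concrete, the following illustrative sketch is offered in Python (the PMIS itself was not implemented in Python; the function and variable names are hypothetical). It assigns a field point the quality of the first failed check, and propagates the worst input quality to a calculated point, using the check order of Table 2-1.

    # Quality codes in Table 2-1 check order (00 = UNK through 18 = GOOD).
    # An earlier position means the check runs first and the code is "worse".
    QUALITY_ORDER = [
        "UNK", "DEL", "NCAL", "INVL", "RDER", "OTC", "BAD", "HRL", "LRL",
        "REDU", "HALM", "LALM", "HWRN", "LWRN", "ALM", "SUB", "DALM",
        "INHB", "GOOD",
    ]
    SEVERITY = {code: i for i, code in enumerate(QUALITY_ORDER)}

    def assign_field_quality(checks):
        """checks: (code, passed) pairs in Table 2-1 order.
        A point gets the code of the first failed check, else GOOD."""
        for code, passed in checks:
            if not passed:
                return code
        return "GOOD"

    def propagate_quality(input_codes):
        """A calculated point gets the worst quality among its inputs."""
        return min(input_codes, key=lambda code: SEVERITY[code])

For example, propagate_quality(["GOOD", "HWRN", "GOOD"]) returns "HWRN".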

2.1.1 Definition of "Healthy" Data

In many cases, a valid result can be calculated even when one or more "poor" quality input points are rejected from the calculation. To take advantage of this, the quality code of any rejected input point is not considered in a "healthy" calculation, and the quality code assigned to the result is the worst quality code of the remaining inputs. This approach is taken in the following cases: (a) the "healthy average" and "healthy maximum" pseudo-analog calculations that can be performed by the PMIS, (b) the "healthy OR" and "healthy AND" Boolean operations that can be performed by the PMIS, and (c) special calculations that are performed for external (real) data points used by the SPDS.

An SPDS "healthy" calculation will only include input points whose quality code is one of the following:

    REDU  HALM  LALM  HWRN  LWRN  ALM  SUB  DALM  INHB  GOOD

Points with the following quality codes should be excluded from "healthy" calculations:

    UNK  DEL  NCAL  INVL  RDER  OTC  BAD  HRL  LRL

These quality codes are defined in Table 2-1. Healthy calculations are described in Section 3.2. When a healthy result cannot be calculated because of the unavailability of an adequate number of healthy inputs, a quality of NCAL is assigned to the result.
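A sketch of a "healthy average" consistent with these rules follows (again Python with hypothetical names; it reuses propagate_quality from the previous sketch, and the minimum input count is an assumed parameter). Unhealthy inputs are rejected, the result carries the worst quality among the surviving inputs, and NCAL is returned when too few healthy inputs remain.

    HEALTHY_CODES = {"REDU", "HALM", "LALM", "HWRN", "LWRN",
                     "ALM", "SUB", "DALM", "INHB", "GOOD"}

    def healthy_average(points, min_inputs=1):
        """points: (value, quality_code) pairs.
        Returns (average, quality), or (None, "NCAL") when an adequate
        number of healthy inputs is not available."""
        healthy = [(v, q) for v, q in points if q in HEALTHY_CODES]
        if len(healthy) < min_inputs:
            return None, "NCAL"
        average = sum(v for v, _ in healthy) / len(healthy)
        # Result quality: worst quality code of the remaining healthy inputs.
        return average, propagate_quality([q for _, q in healthy])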

2.1.1.1 Basis for Treating REDU Quality Data as Healthy Data

The PMIS data base allows an analog point to be designated as the redundant counterpart of one other analog point. When these two points differ by more than a specified tolerance, and all prior quality and limit checks have been satisfactory, both points are assigned a quality of REDU. In this case, the SPDS cannot judge which data point, if any, is at fault. For this reason REDU quality data is assumed to be healthy, and further resolution by the operators is needed.

It is possible that a REDU quality will be assigned simply because the PMIS redundant tolerance limit was set too small. Operating experience with the SPDS will identify this type of problem. The corrective actions needed are to: (a) determine more appropriate redundant tolerance limits for the points in question, and (b) update the PMIS data base to reflect the revised tolerances.

A REDU quality also will be assigned when equipment or instrumentation associated with one of the two redundant points experiences excessive drift or a fault of some type. The corrective actions needed in this case are to: (a) determine which point is at fault, and (b) restore the faulted point to normal operation, or (c) delete the faulted point from scan if (b) above cannot be accomplished in a timely manner. When the faulted point is deleted from scan, its quality is set to DEL, and the redundant point check is not performed on the redundant counterpart point.
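A minimal sketch of the redundant point check described in this subsection (Python, hypothetical names; the precondition is simplified to "no earlier check has failed"): both members of a declared pair receive REDU when they disagree by more than the declared tolerance, and the check is skipped when either point already carries a failing code such as DEL.

    def redundant_check(value_a, qual_a, value_b, qual_b, tolerance):
        """Returns updated quality codes for a declared redundant pair."""
        # Simplification: the check applies only when all prior quality
        # and limit checks passed (e.g., a point deleted from scan has
        # quality DEL and its counterpart is not checked).
        if qual_a != "GOOD" or qual_b != "GOOD":
            return qual_a, qual_b
        if abs(value_a - value_b) > tolerance:
            # The SPDS cannot judge which point is at fault; flag both.
            return "REDU", "REDU"
        return qual_a, qual_b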

2.1.1.2 Basis for Treating HALM, LALM, HWRN and LWRN Quality Data as Healthy Data

The quality codes HALM, LALM, HWRN and LWRN are assigned based on a comparison of the current value of an analog, pseudo-analog, transform or external (real) point with warning and alarm limits specified in the PMIS data base. One of these quality codes can be assigned only if all prior quality and limit checks have been satisfactory. It therefore is expected that the point represents healthy data.

2.1.1.3 Basis for Treating ALM Quality Data as Healthy Data

The quality code ALM is assigned based on a comparison of the current value of a digital, Boolean or external (logical) point with alarm states specified in the PMIS data base. This quality code is assigned only if all prior quality and limit checks have been satisfactory. It therefore is expected that the point represents healthy data.

2.1.1.4 Basis for Treating DALM and SUB Quality Data as Healthy Data

All SPDS rate-of-change (ROC) variables are assigned a quality code of DALM because the second processing control logical for these variables has been set to 'N' in the PMIS data base. Quality and limit checking has been suppressed for all SPDS ROC data because of the following considerations:

- No warning or alarm limits are specified in the PMIS data base for any SPDS ROC variables.
- All ROC data is displayed in CYAN (i.e., conventional GREEN, YELLOW, RED color coding is not used).

- All ROC data is displayed in conjunction with the current-value data from which the ROC is computed. This current-value data is subject to PMIS and SPDS validation (see Section 2).
- With only one exception, ROC data are not used in subsequent calculations. (The exception is the calculation of source range monitor reactor period; see Section 8.)

It should be noted that rate-of-change, by itself, is seldom an accurate measure of the "goodness" or "badness" of the current plant state. For example, low but increasing RPV water level may be a "good" situation, while high RPV water level, increasing at the same rate, may be a "bad" situation. This example points to the fact that warning and alarm limits for ROC variables may be a function of the current value of the variable from which the ROC is calculated. This type of dependency cannot be represented in the PMIS data base. Displaying ROC data in CYAN avoids having to consider this type of dependency. Use of the CYAN color code should have no adverse impact on the use of the SPDS because ROC data is provided in the SPDS as supplementary information. The SPDS Safety Analysis did not identify any ROC data that was directly related to safety function status or EOP entry conditions.

A substituted value (quality code SUB) is considered to be healthy because it is assumed that a substitute value will only be assigned for specific, controlled purposes such as: (a) SPDS testing, or (b) when PMIS data acquisition problems render an SPDS data point unavailable, but the value of the data point is known from another, reliable source. In both of these cases, it is necessary to treat a substitute value as a healthy data value in order to use it in any of the SPDS healthy calculations. If a substitute value were treated as not healthy, it would not be usable in determining the value of healthy SPDS composed points.


It is the responsibility of the utility to control the use of the SPDS capabilities to delete a point from quality and limit checking and to assign substitute values. With simple controls, it is expected that the operation and testing of the SPDS actually should be enhanced by considering DALM and SUB quality data as healthy data.

2.1.1.5 Basis for Treating INHB Quality Data as Healthy Data

The quality code INHB is assigned to an analog, pseudo-analog, transform or external (real) point when a digital alarm cutout point changes state as specified in the PMIS data base description of the "analog" point. The digital alarm cutout point provides information on a plant or equipment state that negates the need for warnings and alarms from a particular "analog" point. This quality code can be assigned only if: (a) a digital alarm cutout point has been specified, (b) the alarm cutout point is in the correct state, and (c) all prior quality checks have been satisfactory. It therefore is expected that the data point represents healthy data.

2.1.2 Relationship Between Quality Code and Color Fill

To the extent practical, the SPDS utilizes the PMIS default color assignments listed in Table 2-1 to define color fill based on data quality. For example, a bar chart generally will have a GREEN color fill when the associated data point has a quality code of GOOD or INHB. The bar color fill becomes YELLOW when the quality code is LWRN or HWRN (i.e., the current value is in a warning zone), and becomes RED when the quality code is LALM or HALM (i.e., the current value is in an alarm zone).

The following exceptions to the PMIS default color assignments are made in the SPDS displays: (a) color conventions for indicating pump and valve operating status are dictated by existing control room conventions (i.e., RED = ON/OPEN, GREEN = OFF/CLOSED, as described in Section 2.3), and (b) all rate-of-change data has a quality of DALM and is displayed in CYAN.

It should be noted that quality code color assignments are parameterized in a file named QUALITY.D. The color assignments can easily be changed if SPDS operating experience indicates that there is a better approach to relating data quality to color fill than the approach listed in Table 2-1.
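The default mapping of Table 2-1 can be pictured as a simple lookup, as in the following sketch (Python; QUALITY.D is the actual parameter file, but its format is not reproduced in this document, so this structure is illustrative only). The rate-of-change exception noted above appears as an explicit override.

    # Default color fill per quality code, per Table 2-1.
    DEFAULT_COLOR = {
        "UNK": "WHITE",
        "DEL": "MAGENTA", "NCAL": "MAGENTA", "INVL": "MAGENTA",
        "RDER": "MAGENTA", "OTC": "MAGENTA", "BAD": "MAGENTA",
        "HRL": "MAGENTA", "LRL": "MAGENTA", "REDU": "MAGENTA",
        "HALM": "RED", "LALM": "RED", "ALM": "RED",
        "HWRN": "YELLOW", "LWRN": "YELLOW",
        "SUB": "BLUE",
        "DALM": "GREEN", "INHB": "GREEN", "GOOD": "GREEN",
    }

    def fill_color(quality_code, is_rate_of_change=False):
        """SPDS exception: all rate-of-change data is displayed in CYAN."""
        if is_rate_of_change:
            return "CYAN"
        return DEFAULT_COLOR[quality_code]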


Table 2-1. Data Quality Codes*

PMIS
No.   Code   Description                                              Default Color
00    UNK    Unknown, point not yet processed                         White
01    DEL    Point deleted from processing (first processing          Magenta
             control logical = N)
02    NCAL   Could not calculate a software computed point            Magenta
             (insufficient healthy inputs to calculation)
03    INVL   Data acquisition system front-end hardware error         Magenta
             (assigned by data acquisition system)
04    RDER   Sensor read error (assigned by data acquisition          Magenta
             system)
05    OTC    Open thermocouple detection (assigned by data            Magenta
             acquisition system)
06    BAD    Input counts out of sensor range (assigned by data       Magenta
             acquisition system)
07    HRL    Point above high reasonable limit (EU high) in PMIS      Magenta
             data base
08    LRL    Point below low reasonable limit (EU low) in PMIS        Magenta
             data base
09    REDU   Redundant point check alarm based on redundant point     Magenta
             definition and tolerance in PMIS data base
10    HALM   Point above high alarm limit in PMIS data base           Red
11    LALM   Point below low alarm limit in PMIS data base            Red
12    HWRN   Point above high warning limit in PMIS data base         Yellow
13    LWRN   Point below low warning limit in PMIS data base          Yellow
14    ALM    Logical change-of-state alarm on digital or logical      Red
             points
15    SUB    Substitute value assigned to point (assigned by PMIS     Blue
             based on operator input of substitute value via
             man-machine interface)
16    DALM   Point deleted from alarm processing (second processing   Green
             control logical = N; no quality or limit checks are
             performed on the point)
17    INHB   Alarm inhibited by an alarm cutout point specified in    Green
             PMIS data base (data is good, but the alarm function
             has been inhibited by a prescribed plant or system
             condition that can be defined in terms of the status
             of a digital alarm cutout point)
18    GOOD   Good                                                     Green

* Note that the first 9 quality codes (UNK to LRL) are considered to represent "not-healthy" data. Quality codes from REDU to GOOD are considered to represent "healthy" data, and are used in "healthy" calculations.

Figure 2-1. Sensor and Alarm Zones Considered in Establishing Data Quality [graphic not reproduced]

2.2 DATA VALIDATION

2.2.1 PMIS Data Validation Techniques

A normal function of the PMIS is to check the validity of all data points by performing the quality and limit checks listed in Table 2-1. There are many redundant data points used by the SPDS. These points are identified in Table 2-2. To check the validity of redundant input points, the PMIS uses the technique of comparative analysis. The PMIS data base defines the redundancy relationships between pairs of input points, and specifies the allowable tolerance between their current values. When the allowable tolerance is exceeded, the redundant point check is failed, and each member of the pair of redundant input points is assigned a quality code of "REDU". As listed in Table 2-1, this quality code will cause the respective data points to be displayed in MAGENTA.

Note that Table 2-2 specifies how redundant points should be defined as pairs in the PMIS data base for the purpose of performing the PMIS redundant tolerance checks.

Operating experience with the SPDS may indicate that the redundant tolerance declared in the PMIS data base is too small for some data points and is creating "nuisance" validation failures. If this situation occurs, the redundant tolerance for the affected data points should be reset to a more appropriate value. That will minimize nuisance validation failures while still providing a meaningful check of the consistency between redundant points.
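As an illustration only, the pairings of Table 2-2 might be held in a structure such as the following (Python; the actual PMIS data base format is not shown in this document, and the tolerance values below are placeholders, not plant data). It reuses redundant_check from the sketch in Section 2.1.1.1.

    # Redundant pair definitions in the spirit of Table 2-2.
    REDUNDANT_PAIRS = [
        # (reference point, redundant point, tolerance in engineering units)
        ("N013", "N014", 15.0),   # RPV pressure (hypothetical tolerance)
        ("N017", "N018", 0.2),    # Drywell pressure, NR (hypothetical)
        ("N019", "N020", 0.5),    # Supp pool level, WR (hypothetical)
    ]

    def run_redundant_checks(values, qualities, pairs=REDUNDANT_PAIRS):
        """values/qualities: dicts keyed by point ID; updates qualities."""
        for ref, red, tol in pairs:
            qualities[ref], qualities[red] = redundant_check(
                values[ref], qualities[ref], values[red], qualities[red], tol)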

2.2.2 Supplementary SPDS Data Validation Techniques

In addition to the PMIS redundant tolerance check, the SPDS provides the following supplementary validation for selected plant variables:

- Not-valid indicators (NVIs)
- Downscale indicators (DNSCIs)
- Safety function indicator (SFI) validation
- Equipment status indicator (ESI) validation
- EOP limit status indicator (EOPSI) validation

A summary of the validation criteria for SPDS composed points is presented in Table 2-3, and SPDS data validation techniques are discussed in detail in this section.

2.2.2.1 Not-Valid Indicators

To assist the operator in recognizing a "not-valid" situation, the characters "NV" appear in MAGENTA near the affected bar chart, trend or x-y plot whenever the associated composed point fails to meet its respective validation criteria in Table 2-3. The presence of this Not-Valid Indicator (NVI) provides an unambiguous indication that a question exists regarding the validity of input data. It is expected that the operator will investigate this situation and determine if an input data fault actually exists. Actions that may be taken by an operator include: (a) delete a faulty input data point from processing, (b) substitute a value for the faulty input data point, or (c) determine that no fault exists and continue to operate with the NVI present in the SPDS displays.

With regard to the data validation criteria in Table 2-3, please note the following:

- The average source range monitor (SRM) reading (SPDS0014) is validated by first verifying that the SRMs are in the "inserted" position as indicated by digital point A519. The SRMs must be inserted in order to properly correlate SRM detector output with a source range power level.

- The proximity to the RPV saturation temperature limit, as indicated by composed point SPDS0288, is not explicitly considered in the validation of RPV water level data. If drywell temperature exceeds the RPV saturation temperature limit, flashing may occur in the cold reference leg RPV level instruments. When flashing occurs, the RPV level indication derived from these instruments must be considered unreliable. The saturation temperature limit at normal operating RPV pressure is about 545°F. CNS containment temperature instrumentation used by the SPDS provides a monitoring capability up to 400°F; therefore, the usefulness of the RPV saturation temperature limit is somewhat restricted. The EOP limit status indicator (EOPSI) for the RPV saturation temperature limit is included in all SPDS displays which present RPV water level information. Validation of this EOPSI is described in Section 9.

General guidelines for locating the NVI in the SPDS displays are presented in Section 2.3.

In the PMIS data base, Not-Valid Indicators are defined as external (real) points with a normal value of "0" (i.e., Table 2-3 validation check passed) and a value of "1" when the Table 2-3 validation check is failed. Not-Valid Indicators are assigned eight-character point identifiers beginning with "SPDS" and ending with "NV" (see Section 3 for a listing of NVIs). The Not-Valid Indicators associated with each display are identified in the display descriptions in Sections 7, 8 and 9.
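A sketch of an NVI update following this definition (Python; the SRM check is taken from the Table 2-3 note above, but the function names, the digital-point encoding, and the abbreviated NVI identifier are assumptions; the actual eight-character identifiers are listed in Section 3).

    def update_nvi(nvi_points, nvi_id, validation_passed):
        """NVIs are external (real) points: 0 = check passed,
        1 = check failed (the "NV" characters are shown in MAGENTA)."""
        nvi_points[nvi_id] = 0 if validation_passed else 1

    def validate_average_srm(digital_points, nvi_points):
        """Average SRM reading (SPDS0014) is valid only when the SRMs
        are inserted, as indicated by digital point A519."""
        srms_inserted = digital_points["A519"] == 1   # assumed encoding
        update_nvi(nvi_points, "SPDSxxNV", srms_inserted)  # ID per Section 3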

2.2.2.2 Downscale Indicators

In a bar chart, the current value is shown by means of an appropriate color fill in the bar, and by a digital display of the current value. When a data point is "pegged high", the bar will be completely filled, and the color of the bar will change to MAGENTA because of the quality of the point driving the bar (i.e., a quality of HRL or NCAL). In contrast, when a data point is "pegged low", there is no color fill in the bar, and a quality code of LRL or NCAL cannot cause the bar to change color to identify the existence of a downscale condition.

To assist the operator in recognizing a downscale situation, the characters "DNSC" are displayed in MAGENTA at the "low" end of the affected bar chart whenever the current value of the data point driving the bar reaches the engineering limit low specified in the PMIS data base. General guidelines for locating this Downscale Indicator (DNSCI) in the SPDS displays are presented in Section 2.5.

In the PMIS data base, Downscale Indicators are defined as external (real) points, with a normal value of "0" (i.e., not downscale), and a value of "1" when the current value of the associated data point drops to the engineering limit low (i.e., a downscale condition exists). Downscale Indicators are assigned eight-character point identifiers beginning with "SPDS" and ending with "DS" (see Section 3 for a listing of DNSCIs). The Downscale Indicator associated with each bar chart is identified in the bar chart descriptions in Sections 7 and 8.

Downscale Indicators are not used in trend plots or x-y plots. In these types of displays, a moving cursor will track along one axis of the display during a downscale condition. These displays also include a past history of the associated data point(s), so the operator will be able to see the value of the data point as it approaches, reaches, and recovers from a downscale condition. Adequate information is thus available in these displays for the operator to be alerted to the existence of a downscale condition.
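A minimal sketch of a DNSCI update consistent with this description (Python, hypothetical names):

    def update_dnsci(dnsci_points, dnsci_id, current_value, eng_limit_low):
        """DNSCIs are external (real) points: 0 = not downscale,
        1 = value at or below the engineering limit low, which drives
        the MAGENTA "DNSC" characters at the low end of the bar."""
        dnsci_points[dnsci_id] = 1 if current_value <= eng_limit_low else 0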

2.2.2.3 Safety Function Indicator Validation

An SFI normally will have a GREEN color fill, and will change to YELLOW when a warning condition exists, and to RED when an alarm condition exists. Each SFI is driven by a specific external (real) data point, as listed below:

    Point ID    SFI
    SPDSBOX1    Reactivity control
    SPDSBOX2    Core cooling
    SPDSBOX3    Coolant system integrity
    SPDSBOX4    Containment integrity
    SPDSBOX5    Radioactive release

The validation criteria for each of these external (real) points are defined in Section 6. A validation failure causes the SFI to be displayed in MAGENTA. In spite of a validation failure, it may still be possible for a valid warning or alarm condition to be generated; therefore, these conditions take precedence over a validation failure. As a result, a valid SFI warning condition will cause a YELLOW color fill to replace a GREEN or MAGENTA color fill in the respective SFI block. Similarly, a valid SFI alarm condition will cause a RED color fill to replace a GREEN, MAGENTA, or YELLOW color fill.

The Not-Valid Indicators (NVIs) described previously are not used with safety function indicators.
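The precedence rule above can be summarized in a short sketch (Python, hypothetical names): alarm outranks warning, warning outranks a validation failure, and GREEN is the normal fill.

    def sfi_fill_color(alarm, warning, validation_failed):
        """Resolve the color fill for one SFI block. Valid warning and
        alarm conditions take precedence over a validation failure."""
        if alarm:
            return "RED"
        if warning:
            return "YELLOW"
        if validation_failed:
            return "MAGENTA"
        return "GREEN"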

2.2.2.4 Equipment Status Indicator Validation

In general, ESIs are displayed in MAGENTA when: (a) insufficient healthy input data is available for determining system or equipment status, (b) input points have failed a PMIS redundant point check, or (c) conflicting data exists regarding equipment status. The specific validation criteria for each ESI are integrated with the ESI processing logic described in Section 6. The Not-Valid Indicators (NVIs) described previously are not used with ESIs.

2.2.2.5 EOP Limit Status Indicator Validation

In general, an EOPSI is displayed in MAGENTA when one or both of the input variables needed to drive the cursor in the associated Level 3 x-y plot: (a) is not healthy, or (b) has failed a PMIS redundant point check (i.e., is healthy, but assigned a quality code of REDU). The specific validation criteria for each EOPSI are integrated with the EOPSI processing logic described in Section 9. The Not-Valid Indicators (NVIs) described previously are not used with EOP limit status indicators.
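A sketch of the EOPSI validation rule (Python, hypothetical names; HEALTHY_CODES is from the Section 2.1.1 sketch): the indicator fails, and is shown in MAGENTA, when either cursor-driving input is not healthy or carries a REDU quality code.

    def eopsi_is_valid(quality_x, quality_y):
        """Both inputs driving the Level 3 x-y cursor must be healthy
        and must not have failed the PMIS redundant point check."""
        for q in (quality_x, quality_y):
            if q not in HEALTHY_CODES or q == "REDU":
                return False
        return True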


Table 2-2. Identification of Redundant Input Points Used by the Cooper SPDS(1)

                                  Reference         Redundant         2nd Redundant
Variable(2)                       Input Point(3)    Input Point(3)    Input Point(3)
APRM Flux                         B000              B001
                                  B002              B003
                                  B004              B005
Average APRM                      SPDS0006          SPDS0007
SRM Log Count Rate                N040              N041
                                  N042              N043
RPV Water Level, NR               B021              N011              N012
                 WR               G032              G033
                 FZ               N009              N010
RPV Pressure                      N013              N014
Drywell Pressure, NR              N017              N018
                  WR              F084              F085
Drywell Temperature(4)            M161              M162              M163
                                  N276              N277
Supp Pool Temp, Sector A          N023              N031
                       B          N024              N032
                       C          N025              N033
                       D          N026              N034
                       E          N027              N035
                       F          N028              N036
                       G          N029              N037
                       H          N030              N038
Supp Pool Level, WR               N019              N020
Containment Water Level, WR       N021              N022