ML20117J653

Generic Emergency Response Info Sys (Basic Rtad) Software Validation
Person / Time
Site: 05000447
Issue date: 04/30/1985
From: Shukla J
GENERAL ELECTRIC CO.
To:
Shared Package
ML20117H529 List:
References
NEDC-30885, NUDOCS 8505150168


Text


NOTICE The information contained in this document is not to be used for other than the purposes for which this document is furnished by the General Electric Company, nor is this document (in whole or in part) to be reproduced or furnished to third parties (other than to carry out said purposes) or made public without the prior express written permission of the General Electric Company.

Neither the General Electric Company nor any of the contributors to this document makes any warranty or representation (express or implied) with respect to the accuracy, completeness, or usefulness of the information contained in this document. The General Electric Company assumes no responsibility for liability or damage of any kind which may result from the use of the information in this document.


NEDC-30885 CLASS II

EXECUTIVE SUMMARY

This document, No. NEDC-30885, entitled "Generic ERIS (Basic RTAD) Software Validation", dated April 1985, describes General Electric's program to evaluate and test the integrated software, data bases and command files which, with hardware components, comprise a portion of GE's Emergency Response Information System (ERIS). ERIS is the commercial name for GE's Real Time Analysis and Display system and Transient Analysis and Recording System.

The purpose of this validation program was to demonstrate through static and dynamic testing and analysis that ERIS meets the generic functional, performance and interface design requirements placed on the system by General Electric, which designed and built the ERIS system.

The scope of the validation and verification testing reported here is to establish performance parameters for the ERIS system. Validation of the correctness of the analysis and display of plant parameters is accomplished during other validation activities, including software integration tests, data base validation, pre-operational tests and startup tests. Data was taken during the validation and verification testing to facilitate software problem resolution and to assure that the plant startup validation activities would be successful.

Results of the validation and verification testing are as follows:

1. The time delay between the time a new display is requested and the time the display first becomes available on the display terminal is 3 to 5 seconds, depending on the complexity of the display.
2. Average null time (CPU idle time) during measurement intervals of 10 to 30 minutes is 46% to 50%, depending on the complexity of the test scenario.
3. Average sensor update time (the time delay between the time a sensor changes to the time a display parameter changes) is typically 0.4 to 2.4 seconds, with a maximum of 6.0 seconds.

These results were obtained without performing parameter trend calculations. A Software Problem Report has been written requiring a design change to the method of calculating and storing trend data. The redesigned method will provide trend analysis without degrading the system's performance with respect to the required performance limits.

Analysis of the reliability of the system hardware during the validation and verification testing confirmed that the assumptions used in the original reliability analysis are valid.

In summary, the results of these tests confirm that the ERIS (Basic RTAD) system performs as designed.

iii

This document also responds to Sections C and H of Reference 4 and, therefore, includes the following appendices:

  • Validation matrix


  • Major ERIS verification and validation activities
  • Quality-related practices and procedures GE followed in this validation program
  • Summary of typical software problems GE encountered during this validation program

TABLE OF CONTENTS

Section  Description  Page
1  SCOPE  1
2  DEFINITIONS AND ACRONYMS  2
3  OBJECTIVE  6
4  DESCRIPTION  7
4.1  General  7
4.2  Integration (Static) Test  9
4.3  Validation (Dynamic) Test  10
4.4  Summary of Software Problems and their Resolution  12
4.5  Equipment Failures and their Impact on Reliability Analysis Assumptions  12
4.6  Inspection/Analysis (IA) and Documentation of Compliance Established by IA  13
4.7  Documentation of Compliance Established by the Tests (Static/Dynamic)  13
4.8  Field Verification Tests  14
5  Conclusions  15
6  References and Supplemental Documents  16
APPENDIX A  ERIS SOFTWARE VALIDATION MATRIX
APPENDIX B  MAJOR V&V ACTIVITIES OF THE ERIS PROJECT
APPENDIX C  ENGINEERING OPERATING AND PRODUCT QUALITY PRACTICES AND PROCEDURES
APPENDIX D  SUMMARY OF TYPICAL SOFTWARE PROBLEMS DISCOVERED DURING ERIS SOFTWARE VALIDATION PROGRAM



1. SCOPE

This report documents the software validation of the Basic Real Time Analysis and Display (Basic RTAD) functions of the generic Emergency Response Information System (generic ERIS). The following generic ERIS (Basic RTAD) functions are beyond the scope of this document:

  • Temperature

Generic ERIS software validation is one of several major Verification and Validation (V&V) activities. Other major ERIS V&V activities are identified in Appendix B.


2. DEFINITIONS AND ACRONYMS

The meanings of terms and acronyms as used in this report are defined to be as follows:

ADS - Automatic Depressurization System

Basic RTAD - Basic Real Time Analysis and Display (RTAD) is a computer-based real time analysis and display system. It is designed to display the following "Emergency Operating Procedure" derived displays in the power plant control room:

  • RPV Control Display
  • Containment Control Display
  • Critical Plant Variables
  • Two-Dimensional Plots
  • Trend Plots
  • Validation Status Displays

This system performs the following functions:

  • Display Format Processing
    - Display Format Verification
    - Format Installation
    - Terminal Initialization
    - Format Report
    - Configuration Terminal Status Report
    - Display Parameter List Report
    - Display Directory Report
  • Dynamic Display Processing
    - Display Interface
    - Format Selection
    - Data Accumulation
    - Security Check
    - Historical Data Accumulation
    - Active Point Data Processing
    - Critical Error Processing

Detailed Basic RTAD functions are as defined in Column 1 of the table in Appendix A.

CCI - Configuration Change Information
CRD - Control Rod Drive System
CST - Condensate Storage Tank
DG - Diesel Generators
Dynamic Test - Same as Validation Test
EOF - Emergency Operations Facility
EPG - Emergency Procedure Guidelines
ERIS - Emergency Response Information System
Gen. Temp. Mon. - Generator Temperature Monitoring
Generic ERIS - ERIS system which forms the basis for implementation of plant unique ERIS systems
GCMACC3 - Habitat Display
GCMANIF - Habitat Display
GE%DATA:ASCAN.DAT - Data File for Scan Data
HPCS - High Pressure Core Spray
Habitat - Data Base Management Software
Integration Test - See Section 4.2 of this report
MSIV - Main Steam Isolation Valves
MSL - Main Steam Line
MTBF - Mean Time Between Failure
NDL - Nuclear Data Link
Pantalk - Habitat Command
P&ID - Piping and Instrument Diagram
PDDB - Point Definition Data Base
Plant/Site Unique - An ERIS system which satisfies both regulatory and customer requirements
Repair Class - Category of Mean Time to Repair
RECIRC - Reactor Recirculation System
RCIC - Reactor Core Isolation Cooling System
RHR - Residual Heat Removal System
RPV - Reactor Pressure Vessel
RTAD - Real Time Analysis and Display System
SPMS - Suppression Pool Monitoring System
SPR - Software Problem Report
SPR Punch List - A compilation of major software problems with an action plan to correct the problems
SRV - Safety Relief Valve
Static Test - See Integration Test
Target System - A computer system consisting of software (which is being subjected to test), data bases, command files and computer hardware (which is used to execute the software)
Test Specimen - Software being subjected to test, associated data bases and command files
TEST04 - Test Sample Plan Data Base
Test System - Computer system which is used to generate simulated data and provide inputs to the target system
TLCDRRT - Test aid software for real time data retrieval
TRA - Transient Recording and Analysis
USERDB - User Data Base
V&V - Validation and Verification
V&V Simulator - Software used for generation of simulated input data (resembling plant dynamics with multiple failures and operator actions) to the target system for ERIS validation testing
Validation - Evaluation and test of the integrated hardware and software system (software, data bases and command files) to determine compliance with functional, performance and interface requirements
Validation Test - Dynamic testing performed for validation of the software. (Note: the validation program also includes static testing and evaluation by inspection and analysis.)
Verification - Review of documents and design to ensure that the system requirements are correct and the design meets the requirements


3. OBJECTIVE

The objective of the software validation program is to certify that the test specimen (identified as generic ERIS master software), when integrated correctly with correct data bases and installed on a correctly functioning computer hardware system, will operate in accordance with the design specifications (to the extent defined in Reference 16), with exceptions noted in "The ERIS Software Problem Report" (Reference 15).


4. DESCRIPTION

4.1 GENERAL

Validation is the test and evaluation of the integrated hardware and software system to determine compliance with the functional, performance and interface requirements. The program was conducted in accordance with the ERIS/OMNIBUS V&V Plan (Reference 10), and the validation matrix was prepared to demonstrate compliance with the ERIS Validation Requirements (Reference 1).

An overview of the General Electric ERIS Software Validation Program is shown in Figure 1. The program consists of four parts:

1. Integration (static) test
2. Validation (dynamic) test
3. Inspection and analysis
4. Field verification and tests

The Integration (static) test, the Validation test and the Inspection and Analysis were performed for validation of the generic software. Field verification and tests are conducted to validate the plant unique system and are not necessary for validation of the generic ERIS software. Hence, field verification was not conducted as a part of this program.

A Test Plan and Procedure was prepared for each test which describes how the testing is to be performed, how problems are reported and retested, and which aspects of the design are to be tested. The Test Plan and Procedure clearly documented the conduct of the test and included the following items:

a. Test objectives
b. Hardware/software required and configuration
c. Procedure
d. Acceptance criteria

The test outputs included the following:

1. Test log
2. Test results

The test results were evaluated by the test team and/or an engineering review team. Open items and Software Problem Reports were generated and forwarded to the design team (using the practices and procedures contained in the

Figure 1. Overview of the General Electric ERIS Software Validation Program

General Electric Company Engineering Operating Procedures) for corrective action. Required corrections were made by the design team and modified software was submitted for re-validation. New revisions of the software were re-validated. Re-validation included analysis and partial testing or complete testing based on the nature of the change.

The inputs to the software validation program consist of the generic software validation requirements document and un-validated software executable tapes. The output, in addition to this document, consists of the validated software (tapes) and the software listings which document the software to be validated.

Validated software is identified as "Generic ERIS (Basic RTAD) Master Software - 30885". The software listings (documenting the software), test plans, procedures, logs, results, evaluations and results acceptance are archived.

Major documented engineering operating procedures and product quality practices and procedures which were followed during the V&V Program (to the extent they are applicable) are summarized in Appendix C.

4.2 INTEGRATION (STATIC) TEST

Integration testing was performed to verify that the entire software package (for each functional block) correctly implements the design and satisfies the software requirements. Combinations of modules which had previously been tested were assembled and tested in a step-wise (one at a time) fashion. The entire package was tested after each new integration of modules in the overall structure. At the final stage of integration tests, the individual functional blocks were integrated and tested as a single computer program product.

As the test cases were generated, the actual results of each were checked to verify that the expected results were achieved. Problems identified as a result of integration testing were documented on Software Problem Reports. The test procedures and test case listings were placed in the Design Record File.

Integration testing of each functional block was considered complete when all modules had been incorporated and all tests in the test plan had been completed. Among the criteria for completion of integration testing were:

a. All software requirements are tested over both normal and abnormal input cases.
b. All module invocations are exercised at least once for each possible alternative response.
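The step-wise integration described above (add one module, re-test the entire package, document any failures) can be sketched abstractly. This is an illustrative sketch only; the module names and the `run_full_test_suite` callable are hypothetical stand-ins, not part of the ERIS software.

```python
# Illustrative sketch of step-wise (one-at-a-time) integration testing:
# after each module is added to the assembly, the entire package is
# re-tested, and any failure is recorded the way a Software Problem
# Report entry would be.

def integrate_stepwise(modules, run_full_test_suite):
    """modules: ordered list of module names; run_full_test_suite:
    callable taking the current assembly and returning a list of failures."""
    assembly, problem_reports = [], []
    for module in modules:
        assembly.append(module)            # integrate one module at a time
        failures = run_full_test_suite(assembly)
        for f in failures:                 # document problems found at this step
            problem_reports.append((module, f))
    return assembly, problem_reports

# Hypothetical usage: a fake test suite that flags a failure whenever the
# (hypothetical) "display" module is present in the assembly.
fake_suite = lambda asm: ["timing fault"] if "display" in asm else []
asm, sprs = integrate_stepwise(["scan", "display", "archive"], fake_suite)
```

Integration of a block is complete only when the full module list has been incorporated and the accumulated problem reports have been dispositioned.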


4.3 VALIDATION (DYNAMIC) TEST

This test, in conjunction with the Static (Integration) Test, was used for evaluation of the integrated (hardware and software) system to determine compliance with the functional and performance requirements. Thus, this test provides overall assurance that the required capabilities have been implemented.

The testing was approached from the ultimate user's viewpoint. The objective of the test was to establish that the system, when subjected to realistic plant dynamics over a sustained time period, will continue to provide generally used functions with acceptable performance. Typical tests which were performed as part of dynamic system testing include the following:

a. Demonstrate acceptability of sustained system performance when subjected to the dynamics of the plant as represented by a set of simulated typical power plant transients.
b. Verify that multiple functions can be simultaneously performed and that the system is free of multiple function interaction or timing problems.
c. Verify that system response/execution times and the accuracy of the outputs are acceptable.

The testing activity was guided by the Validation Test Requirements Specification and the Validation Test Plan and Procedures that were prepared during the requirements analysis task. Among the criteria for completion of testing are:

a. All tests are run with fully operational software and hardware.
b. All tests are run per approved and written test procedures. The procedures are based on the End User Manuals and allow verification of the End User Manuals during the test.
c. All test data is archived in the appropriate DRFs. These include inputs, outputs, the test procedures, any pertinent information on the test execution, and information on the hardware and software status prior to test.
d. Test results are accepted by the engineering review process.
The dynamic testing required use of two VAX computer systems. The dynamic test setup is shown in Figure 2. One computer was used to execute the target system software (test specimen). The other computer was used as a power plant simulator. The transient data (with multiple failures and operator actions) was transferred from a power plant simulator to a magnetic tape. The site computer hardware configuration is contained in the site computer Data Acquisition Hardware Data Base and Point Definition Data Base. The two data bases were transferred to another magnetic tape. These magnetic tapes were then processed with the validation test simulator generator

Figure 2. Test Equipment and Computer Configuration for ERIS Validation Test

software and the resulting data was loaded on the validation test simulator disk. The simulator disk was installed on the test system computer. The test computer system provided measured inputs to the target machine in a manner similar to a power plant (for the set of transients contained in the simulator).
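The replay scheme described above (pre-processed transient data delivered to the target machine at plant-like rates) can be sketched as follows. The sample format, point names, and the `send_to_target` callable are assumptions for illustration; the actual ERIS simulator interface is defined in the referenced test documents.

```python
# Sketch: replay time-tagged samples from a prepared data set to a target
# system at (approximately) real-time rates, the way the test computer fed
# measured inputs to the target machine. send_to_target is a stand-in for
# the real data link.
import time

def replay(samples, send_to_target, speed=1.0):
    """samples: list of (t_seconds, point_id, value), sorted by time.
    speed > 1.0 replays faster than real time."""
    start = time.monotonic()
    for t, point, value in samples:
        # Wait until the sample's scheduled (speed-scaled) delivery time.
        delay = t / speed - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send_to_target(point, value)

# Hypothetical usage: deliver two simulated plant points to a recorder.
received = []
replay([(0.0, "RPV.LEVEL", 42.1), (0.01, "RPV.PRESS", 6.9)],
       lambda p, v: received.append((p, v)), speed=100.0)
```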

4.4 SUMMARY OF SOFTWARE PROBLEMS AND THEIR RESOLUTION

During generic ERIS (Basic RTAD) software validation, software errors were discovered. Since validation is primarily a testing activity, the problems were discovered in the static/dynamic test phases as opposed to the inspection and analysis phase. A summary of major typical software problems discovered during the validation program is included in Appendix D.

The methodology for software problem reporting and resolution is identified in Item 6 of Appendix C. In summary, software problems are documented in one or more of the following document types:

  • Software Problem Report
  • Field Deviation Disposition Request
  • Corrective Action Report

The documented problems were forwarded to the design team, which corrected/modified affected portions of the system. The schedule for modification/correction was generally governed by the software punch list. Upon correction of the problem, updated software was submitted for re-validation. If the re-validation test was not successful, the software was returned to the design team for rework. When the test was successful, the upgraded software was archived in the library and is implemented on affected plant sites using field disposition instructions.

4.5 EQUIPMENT FAILURES AND THEIR IMPACT ON RELIABILITY ANALYSIS ASSUMPTIONS

The validity of the assumptions made in Reference 23 for the reliability analysis calculations has been addressed by Reference 24. The following information further substantiates the conservatism of the aforementioned assumptions.

A summary of the various computer system hardware failures which were encountered up to the generic ERIS software validation is contained in Reference 17. Also included in it are the failure rate and repair data based on the interval for which the equipment failures were monitored. A comparison of this failure rate and repair data with those contained in Table 2 of Reference 23 indicates that the assumptions made in Reference 23 have not been invalidated as a result of additional data obtained up to and during the validation test program.
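The comparison described above reduces to checking that the failure rate observed during monitoring is no worse than the rate assumed in the reliability analysis. A minimal sketch of that check follows; the numeric values are placeholders, not data from Reference 17 or Reference 23.

```python
# Sketch: compare an observed MTBF against the MTBF assumed in a
# reliability analysis. All values below are illustrative placeholders.

def observed_mtbf(operating_hours, failures):
    """Point estimate of Mean Time Between Failure from monitored data."""
    return operating_hours / failures if failures else float("inf")

def assumptions_hold(operating_hours, failures, assumed_mtbf_hours):
    """Assumptions remain conservative if the observed MTBF is at least
    as good as the MTBF assumed in the original analysis."""
    return observed_mtbf(operating_hours, failures) >= assumed_mtbf_hours

# Placeholder example: 5000 monitored hours, 2 failures, 2000 h assumed MTBF.
print(assumptions_hold(5000.0, 2, 2000.0))
```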

12

4.6 INSPECTION/ANALYSIS (IA) AND DOCUMENTATION OF COMPLIANCE ESTABLISHED BY IA

It is not practical to validate some of the ERIS design features (functions, capability and performance) by testing. Features in this category, which are included in the ERIS validation requirement specification (Reference 1), were analyzed/inspected to establish their compliance with the design requirements. Simplifying assumptions were made to facilitate the analysis, and generally accepted mathematical and analytical techniques were used.

The signature of the engineer in Column 4 (Compliance Certification) of the Appendix A table indicates the following:

  • All assumptions and observations are contained in the Design Record File.
  • The analysis or inspection data and calculations are included in the Design Record File.
  • Results and conclusions are summarized and their acceptance has been recorded in the Design Record File.
  • Open items (items which have not been resolved and/or accepted) are included in "The ERIS Software Problem Report".

4.7 DOCUMENTATION OF COMPLIANCE ESTABLISHED BY THE TESTS (STATIC/DYNAMIC)

All generic ERIS (Basic RTAD) features (function, performance capability, design parameters) are identified in Column 1 of the Appendix A table. The method used to validate the features is indicated in Column 3.1 of the table. Column 2 refers to the paragraph number of the Validation Test Requirements Document (Reference 1) in which the validation requirements are specified, and provides the interpretation of the validation requirement. The validation approach and acceptance criteria are described in Columns 3.2 and 3.3, respectively, of the table. The signature of the responsible engineer in Column 4 of the table (Compliance Certification) indicates the following:

  • Generic ERIS master software (Reference 13) was used as the test specimen to validate the feature.
  • The tests (static and dynamic) were conducted using documented and approved test procedures, and said procedures are in compliance with the requirements contained in Column 2 of the table.
  • Tests were conducted in a controlled fashion and no changes were made to the test specimen during and after the final run of the tests.
  • The test specimen and corresponding source listings are archived.
  • Records of the tests (test log, test plan and procedures, and test results) have been archived.
  • Open items and software problem reports (with exceptions noted in Reference 15) have been resolved and the test results have been accepted.
  • Records of resolution and acceptance of open items and SPRs have been included in the Design Record File.
  • Those SPRs which were judged to be significant and are not resolved and/or accepted were included in "The ERIS Software Problem Report".
  • Significant test results are included in "The Generic ERIS (Basic RTAD) Functionality and Performance Summary".

4.8 FIELD VERIFICATION TESTS

The objective of this activity is to verify that the system is properly installed and that the total site unique system will function and perform per requirements in the plant environment. These tests are beyond the scope of this report. The installation procedures developed by the suppliers should be reviewed to assess the completeness and thoroughness with which the system is installed and checked. Additional field installation tests should be performed as required to verify the adequacy, accuracy and completeness of the total system installation. A pre-operational test should be performed to ensure that the system is correctly connected to the plant. A startup test should be performed to verify that all site unique data bases are correct and the system performs correctly when interfaced with the power plant.

5. CONCLUSIONS

Results of the validation and verification testing are as follows:

1. The time delay between the time a new display is requested and the time the display first becomes available on the display terminal is 3 to 5 seconds, depending on the complexity of the display.
2. Average null time (CPU idle time) during measurement intervals of 10 to 30 minutes is 46% to 50%, depending on the complexity of the test scenario.
3. Average sensor update time (the time delay between the time a sensor changes to the time a display parameter changes) is typically 0.4 to 2.4 seconds, with a maximum of 6.0 seconds.
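Figures such as these are straightforward reductions of time-tagged test logs. The sketch below illustrates computing the average and maximum sensor update time; the paired-timestamp log format is a hypothetical stand-in for the archived ERIS test records.

```python
# Sketch: reduce paired event timestamps to sensor update time statistics
# of the kind quoted above. The (sensor_time, display_time) log format is
# hypothetical; the real test data is archived in the Design Record File.

def update_time_stats(events):
    """events: list of (sensor_change_time, display_change_time) in seconds.
    Returns (average delay, maximum delay)."""
    delays = [display_t - sensor_t for sensor_t, display_t in events]
    return sum(delays) / len(delays), max(delays)

# Example with three simulated sensor changes and their display updates.
events = [(10.0, 10.4), (25.0, 26.2), (40.0, 42.4)]
avg, worst = update_time_stats(events)
print(f"average update time: {avg:.2f} s, maximum: {worst:.2f} s")
```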

These results were obtained without performing parameter trend calculations. A Software Problem Report has been written requiring a design change to the method of calculating and storing trend data. The redesigned method will provide trend analysis without degrading the system's performance with respect to the required performance limits.

Analysis of the reliability of the system hardware during the validation and verification testing confirmed that the assumptions used in the original reliability analysis are valid.

In summary, the results of these tests confirm that the ERIS (Basic RTAD) system performs as designed.

15


6. REFERENCES AND SUPPLEMENTAL DOCUMENTS

The following documents were used in the preparation of this report. These documents provide detailed information about the applicable topics.
1. ERIS Validation Test Requirements, General Electric Company "Test Specification", 386HA598, Revision 1.
2. U.S. Nuclear Regulatory Commission, "Clarification of TMI Action Plan Requirements", USNRC Report NUREG-0737, February 1981.
3. General Electric, "Emergency Response Information System Licensing Topical Report" (General Electric GESSAR II SPDS), NEDE-30283-P, November 1983.
4. U.S. Nuclear Regulatory Commission, "Draft SER on the Safety Parameter Display System for GESSAR II", USNRC Docket No. 05000447, December 18, 1984.
5. General Electric, "Engineering Operating Procedures", GE Document NEDE-21109.
6. General Electric, Nuclear Services Product Department, Electronic and Computer Products Section, "ERIS Software Management Plan", G-81W-SMP-8430.1-0001, July 1984.
7. General Electric, Nuclear Services Product Department, Electronic and Computer Products Section, "ERIS Configuration Management Plan", G-81W-CMP-8429.5-0001, July 1984.
8. General Electric, "Document Preparation Guide", NEDS-24760.
9. General Electric, "Software Engineering Manual", NEDE-30682.
10. General Electric, "ERIS/Omnibus Validation and Verification Plan", NEDC-30675.
11. General Electric Company, Nuclear Services Products Department, Electronic and Computer Products Section, "ERIS Integration (Static) Test Procedure, ERIS Generic Test Plan Document", V-81W-GTP-8439.7-000, September 1984.
12. General Electric Company, Nuclear Services Products Department, Electronic and Computer Products Section, "Test Plan, Procedures and Report for the Validation Test of the Generic ERIS (Basic RTAD) Software System", (S) NEDC-30885.

16


13. General Electric Company, Nuclear Services Products Department, Electronic and Computer Products Section, "Generic ERIS (Basic RTAD) Master Software-30885", Library Computer Tape, ERIS V and V Test Specimen.
14. General Electric Company, Nuclear Services Products Department, Electronic and Computer Products Section, "Generic ERIS (Basic RTAD) Master Software Source Code Listing", DRF-C95-00102-M.
15. General Electric, Nuclear Services Products Department, Electronic and Computer Products Section, "ERIS Software Problem Report".
16. General Electric, Nuclear Services Products Department, Electronic and Computer Products Support Subsection, "Generic ERIS (Basic RTAD) Functionality and Performance Summary Status", DRF-C95-00102-Q.
17. General Electric Company, Nuclear Energy Business Operation, "ERIS Validation Test Design Record File", DRF-C95-00102.
18. General Electric Company, Nuclear Energy Business Operation, "Generic ERIS (Basic RTAD) Static/Dynamic Test Review", DRF-C95-00102-N.
19. General Electric, "Emergency Response Information System Design Specification", 23A1457, Rev. 1.
20. General Electric, "Emergency Response Information System Application Design Specification", 23A1434, Rev. No. 1.
21. General Electric, "Emergency Response Information System Application Data Specification", 23A1435, Rev. No. 2.
22. General Electric, Nuclear Energy Business Operation, "ERIS V&V Test Simulator", DRF-C95-00102-9.
23. General Electric, Nuclear Energy Business Operation, "Availability Analysis of ERIS Hardware", DRF-C95-00048.
24. General Electric, Nuclear Energy Business Operation, Letter to U.S. Nuclear Regulatory Commission from H.C. Pfefferlen, Subject: "Open Items from Draft SER on GESSAR II SPDS", December 20, 1984 (Ref. MFN-173-84, DRF-C95-00039).

17

APPENDIX A

ERIS SOFTWARE VALIDATION MATRIX


ERIS SOFTWARE VALIDATION MATRIK y,g, gg 2

1 A. APPLICARI.E PARACRAPH No. OF 3 VALIDATION APPROACH - g IDFMTIFICATION OF FRIS THE *ERIS val.!DATICIt TEST FEATUSF.S (FigeCTION REQUIREMENTS" SPECIFICATION 3.1 g)popg,3 PERFORMA81CE CAPARILITY, REVISION 1 VALI- auct 0 OR PARApsETER) TO RE R. INTERPRETATION OF VALIDATION DATION CERTIFI-ITEM VALIDATFD RRQUlBEMENTS NETHOD 3.2 $1504ARY OF VALIDATION APPROACH 3.3 SUlstARY OF ACCEPTANCE GITRBIA CATICII 1 DUMPINC, LISTIDIC A. 4.1.1.1, 4.2.1.1.1, 4.3.1.2 STATIC CosetAND THE COMPUTER SYSTEM 10 THE HARDG)PY OUTPUT DIRRCTLY IDCCING AND FIDTTING ARID 4.4.1.1 TEST PROVIDE DUMP, IDC, PIDT AND LIST CORRESP000DS TO THE 1000451 FILOS.

I FUNCTION OUTPUTS.

4 B. VERIFY THAT THE VAX/VMS IS i OPERATIODIAL AND CAN DUMP, IDC, FIDT AND LIST A FILE.

2 ANAIDC TO DIGITAL A. 4.1.1.2 AND 4.1.1.3 STATIC APPLY IL4N31 AND CAllRRATED IMPUTS VtRIFY THAT THE OUTPUT VALUES ARE

, CONVIRSION AND TEST PER STATIC TEST PROCEDURE PARA. WITHIN +/- 110F THE CORRESPONDl10C RANCE CHECK R. VALIDATE AMAIDC TO DICITAL 6.3.8.1.2 AND 6.3.8.1.5 EXPECTED VALUES DEFINED IN STATIC CONVIRSION ACCURACY. TEST PROCEDURE PARA. 6.3.8.1 AIID 6.3.8.1.5.

OZ 3 DAS DROR DETECTION A. 4.1.1.2 AND 4.2.1.1.3 STATIC REFER TO STATIC TEST PROCEDURE REFER 70 STATIC TEST PROCEDURE h , tri@

AND REPORTING TEST PARA. 6.3.10 AND 6.5. PARA. 6.3.10 AND 6.5. me R. VAI.lDATE THAT THE SYSTEM gy

. CAN DETECT AND REPORT DAS H co

) DRORS. $

4 DAS TIME TACCING A. 4.1.1.2 AND 4.1.1.4 STATIC REFER TO STATIC TEST PROCEDURE VERIFY THAT OUTPUT VALUES ARE CAPARILITY AND TEST PARAGtAPH No. 7.3.7. USE A RN0tti WITHIN +/- 110F THE EXPECTED RFS01 aft 3000 0F DAS R. VAI.!DATE DAS SYSTFM CAPA- RAMP WITH KMoogt START TIME AllD VALUES AT CORRESP008 DING TIMES.

CIOCK RILITY TO TIME TAC THE DATA. 1u00651 SIDPE AS AN INPUT. THESE EXPECTED VAIDES ARE DERIVED USIllC ALTERNATE CALCUIATI0ttS.

REFER TO STATIC TEST PROCEDURE Pamunagl go, 7,3,7.

5 VALIDATION OF DICITAL A. 4.1. l . 3, 4.1.1.5, 4.1.1.6, STATIC REFER TO STATIC TEST PROCEDURE REFER TO STATIC TEST PROCEDURE INPUT STATE, CHANCE OF 4.1.1.21, 4.2.1.1.2 AND TEST PARACRAPil NO. 7.3.7 AND STATIC PARACRAPH No. 7.3.7 AND STATIC STATE DETFSMINATION, 4.2.1.1.8 TEST PROCEDURE PARAGRAPH 100. 7.8. TEST PROCEDURE PARAGRAPH 900. 7.8.

TIME TACCINC OF IIIPUT l DATA SCAN RATE R. val.IDATE THE SYSTFMS CAPA-CAPARILITY OF 1, 2.10 RILITY TO DETFRMINE A DICI-25, 50, 100 AND 250 TAL INPUTS STATE, CHANCE OF FOR ANAIDC AND DICITAL STATE OF DICITAL INPUT, SCAN INPUTS, 500 SAMPLES RATE CAPARILITT OF AMAIDC '

PER SECOND FOR DICITAL AND DICITAL INPUTS, AND 5 INPUTS; AND 5 Milli- MII11-SECOND FVENT RFSOIJJ-SEC0piD EVENT RESOLUTION TION CAPARil.lTY OF THE SEQUENCE OF EVENTS.

I e i

ERIS SOFTWARE VALIDATION MATRIX, PAGE A2

COLUMN KEY:
1 ITEM
2 IDENTIFICATION OF ERIS FEATURES (FUNCTION, PERFORMANCE CAPABILITY, OR PARAMETER) TO BE VALIDATED
A. APPLICABLE PARAGRAPH NO. OF THE "ERIS VALIDATION TEST REQUIREMENTS" SPECIFICATION, REVISION 1
B. INTERPRETATION OF VALIDATION REQUIREMENTS
3 VALIDATION APPROACH: 3.1 VALIDATION METHOD; 3.2 SUMMARY OF VALIDATION APPROACH; 3.3 SUMMARY OF ACCEPTANCE CRITERIA
4 COMPLIANCE CERTIFICATION

ITEM 6: CALENDAR TIME ASSOCIATED WITH OLDEST RECORDED DATA ON THE DISK
A. 4.1.1.11
B. VALIDATE OPERATOR'S CAPABILITY TO OBTAIN A HARD COPY WHICH INCLUDES THE CALENDAR TIME ASSOCIATED WITH OLDEST TRANSIENT DATA ON THE DISK.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.3.8.3.
3.3 VERIFY THAT THE CALENDAR TIME ASSOCIATED WITH OLDEST DATA (RECORDED IN DELTA RECORDING MODE) CORRESPONDS TO THE KNOWN VALUE OF OLDEST DATA.

ITEM 7: DATA ARCHIVING AND RETRIEVAL
A. 4.1.1.12
B. VALIDATE THAT THE NORMAL STARTUP DATA CAN BE STORED ON AND RETRIEVED FROM THE TRA PROCESSOR MAG TAPE DRIVE.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.4.1.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.4.1.

ITEM 8: SENSOR FAILURE REPORTING (FOR DAS FAILURE, REFER TO ITEM 3)
A. 4.1.1.13
B. VALIDATE SYSTEM'S CAPABILITY TO REPORT SENSOR AND DATA ACQUISITION EQUIPMENT FAILURES.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.1.1.2.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.1.1.2.

ITEM 9: CALIBRATION CORRECTION
A. 4.1.1.14
B. VALIDATE THAT THE CALIBRATION COEFFICIENTS OBTAINED BY PERIODIC SENSOR CALIBRATION CAN BE CORRECTLY DETERMINED FOR THE FOLLOWING SENSOR TYPES:
a) LINEAR
b) QUADRATIC
c) CUBIC
d) SQUARE ROOT
e) LOGARITHMIC (BASE E)
f) LOGARITHMIC (BASE 10)
g) ANTILOG (BASE E)
h) ANTILOG (BASE 10)
i) RMS
j) DOUBLE INTEGRATION
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.5.
3.3 VERIFY THAT THE DIFFERENCE BETWEEN THE ANALOG VALUE REPORTED BY THE TEST RESULT AND THE CORRESPONDING SIMULATED INPUT VALUES IS LESS THAN OR EQUAL TO 1% OF THE SIMULATED VALUE.
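As an illustration of how calibration coefficients for the polynomial sensor types above (linear, quadratic, cubic) can be determined from periodic calibration points, here is a minimal least-squares sketch. The calibration data are hypothetical, and this is not the ERIS implementation.

```python
def fit_polynomial(xs, ys, degree):
    """Fit y = c0 + c1*x + ... + cN*x^N by solving the normal equations
    with Gaussian elimination (partial pivoting)."""
    n = degree + 1
    # Build the normal-equation matrix a and right-hand side b
    a = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                       # forward elimination
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        b[col], b[pivot] = b[pivot], b[col]
        for row in range(col + 1, n):
            f = a[row][col] / a[col][col]
            for j in range(col, n):
                a[row][j] -= f * a[col][j]
            b[row] -= f * b[col]
    coeffs = [0.0] * n                         # back substitution
    for row in range(n - 1, -1, -1):
        s = b[row] - sum(a[row][j] * coeffs[j] for j in range(row + 1, n))
        coeffs[row] = s / a[row][row]
    return coeffs

# Hypothetical linear sensor y = 1 + 2x, recovered from calibration points
coeffs = fit_polynomial([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0], 1)
print([round(c, 6) for c in coeffs])  # → [1.0, 2.0]
```

The square-root, logarithmic, and antilog types reduce to the same fit after transforming the inputs (for example, fitting against sqrt(x) or log(x) instead of x).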

ERIS SOFTWARE VALIDATION MATRIX, PAGE A3

ITEM 10: APPLICATION OF FIRST ORDER LOW PASS OR SECOND ORDER BANDPASS FILTERING TO ANALOG INPUTS
A. 4.1.1.15
B. VALIDATE THE CAPABILITY TO SELECT EITHER FIRST ORDER LOW PASS OR SECOND ORDER BANDPASS DIGITAL FILTERING. VALIDATE DIGITAL FILTERING CONSTANTS FOR ANALOG INPUTS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.2.7.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.2.7.
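A first order low-pass digital filter of the kind named above is commonly implemented as an exponential smoother whose constant is derived from the cutoff frequency. This sketch is illustrative only; the cutoff, sample rate, and input are assumptions, not the ERIS filter constants.

```python
import math

def first_order_low_pass(samples, cutoff_hz, sample_rate_hz):
    """First order low-pass (exponential) filter:
    y[n] = y[n-1] + alpha * (x[n] - y[n-1]), alpha set by the cutoff."""
    dt = 1.0 / sample_rate_hz
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y = y + alpha * (x - y)
        out.append(y)
    return out

# A unit step input: early samples are attenuated, output settles toward 1.0
filtered = first_order_low_pass([1.0] * 50, cutoff_hz=1.0, sample_rate_hz=100.0)
print(round(filtered[0], 4), round(filtered[-1], 4))
```

A second order bandpass stage would cascade two such sections (high-pass then low-pass) or use a biquad; the validation comparison against an alternately calculated output is the same either way.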

ITEM 11: ENGINEERING UNITS CONVERSION
A. 4.1.1.16
B. VALIDATE THAT THE ENGINEERING UNIT CONVERSIONS (ITEM 24) CAN BE PERFORMED BY THE SYSTEM.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.5. CALCULATE VARIABLES (USING KNOWN INPUTS) IN THE ENGINEERING UNITS.
3.3 VERIFY THAT THE VALUES ARE WITHIN +/- 1% OF THE CORRESPONDING INPUT VALUES.

ITEM 12: REAL TIME PROCESS DATA (COMPOSED POINT CALCULATION)
A. 4.1.1.17
B. VALIDATE THAT A COMPOSED POINT CAN BE CALCULATED USING A FUNCTION WHICH USES A MAXIMUM OF 16 OF THE FOLLOWING OPERATIONS:
a) DIFFERENCE
b) SUM
c) PRODUCT
d) QUOTIENT
e) SQUARE ROOT
f) DIVIDE BY A CONSTANT
g) MULTIPLY BY A CONSTANT
h) ADD A CONSTANT
i) SUBTRACT A CONSTANT
3.1 STATIC TEST
3.2 CALCULATE COMPOSED VALUE USING REAL TIME PROCESS DATA. REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.1.1.1.
3.3 DIGITAL VALUE OF THE CORE FLOW IS WITHIN +/- 1% OF THAT SHOWN IN THE CORRESPONDING FIGURES OF THE STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.1.1.1.

ITEM 13: USE OF HISTORICAL DATA TO GENERATE COMPOSED PARAMETERS
A. 4.1.1.17
B. VALIDATE THE CAPABILITY TO RETRIEVE PROCESS SIGNAL DATA NECESSARY TO REGENERATE COMPOSED POINT PARAMETERS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NOS. 7.3.5, 7.8.1, 7.4 AND 10.3.
3.3 VERIFY THAT THE VALUE OF COMPOSED POINT CA100 IS WITHIN +/- 1% OF THE ALTERNATIVELY CALCULATED VALUE.
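A composed point built from a bounded sequence of the arithmetic operations listed under Item 12 can be modeled as a small operation list applied to input values. This is an illustrative sketch under assumed point values, not the ERIS implementation.

```python
import math

# Each step is (operation name, operand). Operands are either a constant
# or another input value, mirroring the operation types under Item 12.
OPS = {
    "sum":      lambda acc, v: acc + v,
    "diff":     lambda acc, v: acc - v,
    "product":  lambda acc, v: acc * v,
    "quotient": lambda acc, v: acc / v,
    "sqrt":     lambda acc, _: math.sqrt(acc),
    "add_k":    lambda acc, k: acc + k,
    "sub_k":    lambda acc, k: acc - k,
    "mul_k":    lambda acc, k: acc * k,
    "div_k":    lambda acc, k: acc / k,
}

def composed_point(initial, steps, max_ops=16):
    """Evaluate a composed point; the spec allows at most 16 operations."""
    if len(steps) > max_ops:
        raise ValueError("composed point limited to 16 operations")
    acc = initial
    for op, operand in steps:
        acc = OPS[op](acc, operand)
    return acc

# Hypothetical composition: (a + b) * 0.5, then square root
value = composed_point(40.0, [("sum", 60.0), ("mul_k", 0.5), ("sqrt", None)])
print(value)
```

An independently coded evaluator of this sort is exactly the "alternatively calculated value" the acceptance criterion compares against.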

ITEM 14: CALCULATED VARIABLE CALCULATION
A. 4.1.1.18
B. VALIDATE THAT A CALCULATED VARIABLE CAN BE FORMED FROM COMBINING ANALOG, COMPOSED AND OTHER CALCULATED VALUES.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.3.8.1.4.
3.3 VERIFY THAT THE CALCULATED VALUE PRESENTED AS TEST RESULT IS WITHIN +/- 1% OF THE ALTERNATIVELY CALCULATED VALUE.

ERIS SOFTWARE VALIDATION MATRIX, PAGE A4

ITEM 15: COMPOSED CONTACT CALCULATIONS
A. 4.1.1.19
B. VALIDATE THE CAPABILITY TO GENERATE COMPOSED CONTACT CALCULATIONS OF VALUES BASED UPON UP TO 16 OPERATIONS WHICH INCLUDE AT LEAST ONE OF ALL THE OPERATIONS OF THE FOLLOWING TYPE:
a) OR
b) AND
c) NOT OR
d) NOT AND
e) GREATER THAN
f) LESS THAN
g) EQUAL TO
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.3.8.1.3.
3.3 VERIFY THAT THE COMPOSED CONTACT STATE IS THE SAME AS THE ALTERNATIVELY DERIVED STATE.

ITEM 16: COMPOSED CONTACT CHANGE OF STATE DETERMINATION
A. 4.1.1.21
B. VALIDATE THE CAPABILITY TO DETERMINE THE CHANGE OF STATE FOR A COMPOSED CONTACT.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.3.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.3.
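The boolean operation types under Item 15 and the change-of-state determination under Item 16 can be sketched together. The contact inputs and the threshold below are hypothetical examples, not values from the procedure.

```python
def composed_contact(a, b, x, limit):
    """Example composed contact combining the Item 15 operation types:
    (a AND b) OR (x GREATER THAN limit)."""
    return (a and b) or (x > limit)

def change_of_state(previous, current):
    """Change-of-state determination for a contact (Item 16)."""
    return previous != current

prev = composed_contact(True, False, 95.0, limit=100.0)   # evaluates False
curr = composed_contact(True, False, 105.0, limit=100.0)  # evaluates True
print(prev, curr, change_of_state(prev, curr))  # → False True True
```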

ITEM 17: DELETE POINT FROM SCAN
A. 4.1.1.22
B. VALIDATE THAT A POINT CAN BE DELETED FROM SCAN.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.2.9.
3.3 VERIFY THAT A POINT HAS BEEN DELETED FROM SCAN.

ITEM 18: RESTORE TO SCAN
A. 4.1.1.22
B. VALIDATE THAT A POINT CAN BE RESTORED TO SCAN.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.2.9.
3.3 VERIFY THAT A POINT HAS BEEN RESTORED TO SCAN.

ITEM 19: POINT CALIBRATION
A. 4.1.1.22
B. VALIDATE THAT A POINT CAN BE CALIBRATED.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.5.
3.3 VERIFY THAT THE POINT DEFINITION DATA BASE HAS BEEN UPDATED WITH NEW CONVERSION CONSTANTS.

ITEM 20: DATA COMPRESSION LIMITS
A. 4.1.1.22
B. VALIDATE THAT DATA COMPRESSION LIMITS CAN BE ENTERED.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 6.2.3.8.1.
3.3 VERIFY THAT THE POINT DEFINITION DATA BASE HAS BEEN UPDATED TO INCLUDE ENTERED SIGNIFICANT CHANGE LIMITS.
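Significant-change (delta) recording of the kind these compression limits control can be sketched as a deadband filter: a sample is stored only when it moves by more than the limit from the last stored value. The limit and readings below are hypothetical.

```python
def delta_record(samples, change_limit):
    """Keep only samples that differ from the last stored value by more
    than change_limit (a significant-change / data-compression deadband)."""
    stored = [samples[0]]
    for value in samples[1:]:
        if abs(value - stored[-1]) > change_limit:
            stored.append(value)
    return stored

readings = [100.0, 100.2, 100.4, 101.5, 101.6, 99.0]
print(delta_record(readings, change_limit=1.0))  # → [100.0, 101.5, 99.0]
```

A larger change limit stores fewer samples, which is the trade the delta recording mode mentioned under Item 6 makes between disk usage and trend fidelity.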

ERIS SOFTWARE VALIDATION MATRIX, PAGE A5

ITEM 21: PRINTING OF VALUE/STATUS OF A POINT
A. 4.1.1.22
B. VALIDATE THAT THE VALUE/STATUS OF A POINT CAN BE PRINTED.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.3.2.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.3.2.

ITEM 22: ENTERING OF FILTER CONSTANTS
A. 4.1.1.22
B. VALIDATE THAT THE FILTER CONSTANTS/FREQUENCIES (FOR LOW PASS AND BANDPASS FILTERS) CAN BE ENTERED IN THE POINT DEFINITION DATA BASE AND THAT THEY ARE APPLIED TO POINTS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH 7.4.2.
3.3 VERIFY THAT THE OUTPUTS (USING THE APPLICABLE DIGITAL FILTERS) ARE WITHIN +/- 1% OF THE ALTERNATIVELY CALCULATED VALUE.

ITEM 23: 2D PLOT OF HISTORICAL DATA
A. 4.1.1.28
B. VALIDATE THAT TWO DIMENSIONAL PLOTS OF PROCESS HISTORICAL DATA CAN BE GENERATED. THE PLOTS CONTAIN POINT IDENTIFICATION AND THE TIME INTERVAL.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.6.2.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.6.2.

ITEM 24: 2D PLOT FOR COMPOSED/CALCULATED/TRANSFORMED VARIABLES
A. 4.1.1.28
B. VALIDATE THAT TWO DIMENSIONAL PLOTS OF COMPOSED/CALCULATED/TRANSFORMED HISTORICAL DATA CAN BE GENERATED. THE PLOTS CONTAIN POINT IDENTIFICATION AND THE TIME INTERVAL.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.6.2.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.6.2.

ITEM 25: DATA RECORDING FILE EDIT
A. 4.1.1.29
B. VALIDATE THAT DATA RECORDING FILES CAN BE CREATED AND EDITED BY AN OPERATOR.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NOS. 6.2.9.3, 6.3.4 AND 6.3.6.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NOS. 6.2.9.3, 6.3.4 AND 6.3.6.

ITEM 26: OFFLINE CALIBRATION FILES
A. 4.1.1.29
B. VALIDATE THAT OFF-LINE CALIBRATION FILES CAN BE CREATED AND EDITED BY AN OPERATOR.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.5.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 7.5.

ERIS SOFTWARE VALIDATION MATRIX, PAGE A6

ITEM 27: OPERABILITY OF IDTS
A. 4.2.1.1.1, 4.3.1.2
B. VALIDATE THAT THE OPERATING SYSTEM FUNCTIONS SUPPORTING THE VIDEO DISPLAY ARE OPERATIONAL.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.

ITEM 28: OPERABILITY OF VIDEO COPIER (VERSATEC)
A. 4.2.1.1.1
B. VALIDATE THAT THE OPERATING SYSTEM FUNCTIONS SUPPORTING HARDCOPY DEVICES ARE OPERATIONAL.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.

ITEM 29: RANGE CHECK AND PARAMETER VALIDATION STATUS REPORTING
A. 4.2.1.1.3, 4.2.1.1.4
B. VALIDATE THAT THE RANGE CHECK INDICES GENERATE APPROPRIATE ERROR MESSAGES FOR OUT-OF-SENSOR-RANGE PROCESS INPUTS. VALIDATE THAT THE RTAD SUB-SYSTEM SHOWS THE VALIDATION STATE.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.4.1.1.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.4.1.1.

ITEM 30: USE OF PARAMETER VALIDATION STATUS FOR DISPLAY GENERATION
A. 4.2.1.1.4
B. VALIDATE THAT THE PARAMETER STATUS IS USED CORRECTLY FOR DISPLAY GENERATION.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPHS NO. 10.4.3.2 AND 10.4.3.4.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPHS NO. 10.4.3.2 AND 10.4.3.4.

ITEM 31: APPLICATION OF INPUT POINT COMPENSATION TO DISPLAYED VARIABLES
A. 4.2.1.1.5
B. FOR DISPLAYED VARIABLES REQUIRING COMPENSATION, VALIDATE THAT ANALOG OR COMPUTED VARIABLES REPRESENTING UNCORRECTED FLOWS AND LEVELS ARE COMPENSATED AS A FUNCTION OF FLUID TEMPERATURE AND PRESSURE.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.1.2.1.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.1.2.1.

ERIS SOFTWARE VALIDATION MATRIX, PAGE A7

ITEM 32: CALCULATION OF DERIVED RPV/CONTAINMENT PARAMETERS
A. 4.2.1.1.6
B. VALIDATE THAT FOR PROCESS VARIABLES NOT DIRECTLY MEASURED, e.g., BULK POOL AND RPV TEMPERATURES AND RV LEVEL, THE RTAD SUBSYSTEM CAN CORRECTLY DERIVE THESE VARIABLES FROM OTHER MEASURED VARIABLES.
3.1 STATIC TEST
3.2 USING SIMULATED INPUTS AND/OR DATA BASE, PROVIDE ALL NECESSARY INPUTS TO CALCULATE POOL TEMP. AND RPV TEMP.
3.3 USING APPROPRIATE DISPLAYS, VERIFY THAT THE DISPLAYED OUTPUT IS WITHIN +/- 1% OF THE ALTERNATIVELY CALCULATED VALUES.

ITEM 33: LIMIT CHECKING OF VARIABLES
A. 4.2.1.1.7
B. VALIDATE THAT THE SYSTEM CAN CHECK ANY ANALOG INPUT AGAINST MULTIPLE LIMITS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.6.1.1.3.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.6.1.1.3.
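Checking a single analog input against multiple limits, as Item 33 requires, can be sketched as classifying the value against an ordered set of setpoints. The limit names and setpoints here are hypothetical.

```python
def classify_against_limits(value, limits):
    """Return the names of all high limits the value exceeds.

    limits maps a limit name to its setpoint, e.g. several
    caution/alarm levels checked against one analog input.
    """
    return [name
            for name, setpoint in sorted(limits.items(), key=lambda kv: kv[1])
            if value > setpoint]

limits = {"CAUTION": 100.0, "ALARM": 120.0}
print(classify_against_limits(125.0, limits))  # → ['CAUTION', 'ALARM']
print(classify_against_limits(110.0, limits))  # → ['CAUTION']
print(classify_against_limits(50.0, limits))   # → []
```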

ITEM 34: DETERMINATION OF THE VALIDATION STATUS OF CRITICAL PLANT VARIABLES
A. 4.2.1.1.9
B. VALIDATE THAT THE RTAD SUB-SYSTEM IS ABLE TO CORRECTLY DETERMINE THE VALIDATION STATUS (NOT MEASURED, NOT VALIDATED, VALIDATED) OF THE MEASURED/DERIVED VALUE OF THE CRITICAL PLANT VARIABLES (CONTROL PARAMETERS). ALSO VALIDATE THAT THE STATUS IS USED CORRECTLY FOR DISPLAY GENERATION.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NOS. 10.4.3.1.1 AND 10.4.3.1.2.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NOS. 10.4.3.1.1 AND 10.4.3.1.2.

ITEM 35: RPV/CONTAINMENT ALARM INDICATION CHECK
A. 4.2.1.1.10
B. VALIDATE THAT THE "RPV ALARM" AND/OR "CONTAINMENT ALARM" INDICATIONS, AS THEY APPEAR ON EACH DISPLAY, REFLECT THE PROPER STATUS (INACTIVE, CAUTION, OR ALARM) AS GOVERNED BY THE RPV AND CONTAINMENT CONTROL DISPLAY LIMIT TAGS AND EVENT INDICATIONS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.3.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.3.

ERIS SOFTWARE VALIDATION MATRIX, PAGE A8

ITEM 36: VALIDATE RTAD DATA STORAGE
A. 4.2.1.1.16
B. VALIDATE THAT THE RTAD SUB-SYSTEM CAN STORE 30 MINUTES WORTH OF DATA FOR ALL CONTROL PARAMETERS AND THEIR COMPONENTS. ALSO VALIDATE THAT THIS DATA IS RETRIEVABLE FOR DISPLAY GENERATION.
3.1 STATIC TEST
3.2 CREATE A SCENARIO THAT ALLOWS 30 MINUTES STORAGE OF DATA FOR ALL CONTROL PARAMETERS AND THEIR COMPONENTS, AND UPDATES THE DISPLAY FOR 30 MINUTES BASED ON THE SCENARIO INPUTS.
3.3 VERIFY THAT THE SCREEN DISPLAYS ARE DYNAMICALLY UPDATED FOR THE FULL 30 MINUTES.

ITEM 37: PROVISION FOR OPERATOR TO CALL DISPLAYS/MENU
A. 4.2.1.1.17
B. VALIDATE THAT FUNCTION KEYS ALONE OR FUNCTION KEYS AND ALPHANUMERIC KEYS (COLLECTIVELY KNOWN AS OPERATOR'S KEYBOARD) ARE PROVIDED AND CAN CALL UP ANY DISPLAY OR MENU.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.
3.3 ACTIVATION PROVIDES REQUESTED DISPLAYS/MENU.

ITEM 38: PROVISION FOR OPERATOR TO CALL FOR HARDCOPY OF SCREEN DISPLAY
A. 4.2.1.1.17
B. VALIDATE THAT THE OPERATOR CAN REQUEST A HARDCOPY OUTPUT OF A FORMAT USING THE KEYBOARD.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.4.3.
3.3 A HARDCOPY OF THE DISPLAY FORMAT IS PROVIDED.

ITEM 39: PROVISION FOR OPERATOR TO CALL FOR VERTICAL SCALE CHANGE OF SCREEN DISPLAY
A. 4.2.1.1.17
B. VALIDATE THAT THE OPERATOR CAN REQUEST VERTICAL SCALE CHANGE OF SELECTED CONTROL PARAMETERS ON EPG DISPLAYS OR TREND PLOTS USING THE KEYBOARD.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3 (CHANGE VERTICAL SCALES USING PAGE FORWARD/PAGE BACKWARD KEYS).
3.3 ACTIVATION PROVIDES REQUESTED VERTICAL SCALE CHANGE.

ITEM 40: PROVISION FOR PAGING FORWARD OR BACKWARD THROUGH MENU FORMATS
A. 4.2.1.1.17, 4.2.1.2.1
B. VALIDATE THAT THE OPERATOR CAN PAGE FORWARD AND BACKWARD THROUGH MENU FORMATS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3.
3.3 ACTIVATION CHANGES DISPLAY PAGES IN REQUESTED FASHION.

ERIS SOFTWARE VALIDATION MATRIX, PAGE A9

ITEM 41: KEYBOARD REQUESTS AND RESPONSES ARE DISPLAYED WITH INSTRUCTION FOR OPERATOR
A. 4.2.1.1.17
B. VALIDATE THAT KEYBOARD REQUESTS AND RESPONSES ARE DISPLAYED ON THE CRT WITH INSTRUCTIONS FOR OPERATOR CHOICES AND INFORMATION REQUIRED.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3, STEP 2B.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3, STEP 2B.

ITEM 42: PROVISION FOR PREVENTION OF INADVERTENT EXECUTION OF PROGRAMS WHICH MAY ALTER STORED DATA
A. 4.2.1.1.17
B. VALIDATE THAT CAPABILITY EXISTS TO PREVENT INADVERTENT EXECUTION OF PROGRAMS SO STORED DATA MAY NOT BE ALTERED.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3, STEPS 4A-4E, 2, 3, 4, 8, AND 9.
3.3 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3, STEPS 2A-2C, 4A-4E, 8A, 8B, AND 9A-9C.

ITEM 43: PROVISION FOR DISABLING AND ENABLING OF KEYBOARD SELECTABLE FUNCTIONS
A. 4.2.1.1.17
B. VALIDATE THAT ANY SET OF KEYBOARD SELECTABLE FUNCTIONS CAN BE DISABLED ON ONE OR MORE SYSTEM KEYBOARDS.
3.1 STATIC TEST
3.2 REFER TO STATIC TEST PROCEDURE PARAGRAPH NO. 10.3, STEP 5A THROUGH 5E.
3.3 SEE STEP 5A THROUGH 5E.

ITEM 44: PROVISION FOR REASSIGNING OF FUNCTION SWITCHES UNDER SOFTWARE CONTROL
A. 4.2.1.1.17
B. VALIDATE THAT FUNCTION SWITCHES ARE REASSIGNABLE UNDER SOFTWARE CONTROL.
3.1 STATIC TEST
3.2 ASSIGN FUNCTION KEY TO A SELECTED FORMAT AND SEE IF ACTIVATION OF FUNCTION KEY BRINGS UP THIS FORMAT. STEP 6B.
3.3 ACTIVATION OF FUNCTION KEY BRINGS UP RIGHT FORMAT.

ITEM 45: TABULAR FORMAT OF MENU DISPLAYS
A. 4.2.1.2.1
B. VALIDATE THAT THE SYSTEM MENU DISPLAY IS PRESENTED IN TABULAR FORMAT.
3.1 STATIC TEST
3.2 USING CDC, CALL THE SYSTEM MENU AND ANALYZE THAT THE SYSTEM MENU IS DISPLAYED IN TABULAR FORMAT.
3.3 IT IS CONCLUDED BY OBSERVATION THAT THE SYSTEM MENU DISPLAYS ARE PRESENTED IN TABULAR FORMAT.

ITEM 46: HORIZONTAL SCALE FOR RPV CONTROL PARAMETER (EXCEPT RPV TEMP.) IS THE MOST RECENT PAST 10 MINUTES; THE SCALE FOR RPV TEMP IS 30 MINUTES
A. 4.2.1.2.2A
B. VALIDATE THAT THE HORIZONTAL SCALE FOR EACH CONTROL PARAMETER REPRESENTS THE MOST RECENT TEN MINUTES, EXCEPT FOR THE RPV TEMPERATURE, IN WHICH CASE THE MOST RECENT 30 MINUTES SHOULD BE REPRESENTED.
3.1 STATIC TEST
3.2 USING CDC CONTROLS, CALL RPV CONTROL DISPLAY(S). APPLY KNOWN SIMULATED INPUTS FOR CONTROL PARAMETERS FOR 30 MINUTES (PARA. NO. 10.4.3.1.2).
3.3 LAST 10 MINUTES WORTH OF TREND IS PRESENTED FOR ALL CONTROL PARAMETERS. LAST 30 MINUTES OF TREND IS PRESENTED FOR THE RPV TEMPERATURE.

NEDC-30885, CLASS II

[THE MATRIX PAGE COVERING ITEMS 47 THROUGH 49 IS ILLEGIBLE IN THE SOURCE SCAN; ITEM 49 CONTINUES ON PAGE A11 BELOW.]
- 8 BIS SOFTWARE CALIDATION MATRIX

' Page C11 2 - . _ , _ _ _ . . . . . _ , _ . _ . _

1 A. APPLICABLE PARACRAPH NO. OF lDENTIFICATION OF ERIS THE 'ERIS VALIDATION TEST 3 FEATURES (PUNCTION VAtthATION APPROACM REQUIREMENTS

  • SPECIFICATION 3.1 4 PERPORMANCE CAPASILITT, REVISION 1 O VAL 1- COMPL1-OR PARAMETER) TO BE 3. INTERPRETATION OF VALIDATION ANCE ITEM DATICII VALIDATED REQUIRE 3EstTS METHOD 3.2 SineIARY OF VALIDATION APPROACM CERTIF1-49 3.3 Sl2004RY OF ACCEPTANCE G lTERIA CATION (Goet)

SCREEN AIID THE TIME ASSOCIATED WITN FIACEMENT OF THE MOST RSCENT VALUE OF PLACEMENT ON THE TRES DISPLAY.

50 TRACKIIIC OF VARIABLE A. 4.2.1.2.2C LIMIT VAIEES USIIIC

  • STATIC USING ENOISI SIMUIATED INPUTS THE DISPIAYED VARIABLE LIMIT TEST -

LIMIT LINES B. VALIDATE THAT FOR THE RPV AND/OR DATABASE VARY THE VALUES (PRESENTED SY ASSOCIATED LIMIT AND VERIFY THE REQUIRED TRACKING LINE) IS WITHIN +/- II 0F THE CON 1ROL DISPIATS THE LIMIT A800 COLOR CODINC.

LINES TRAG THEIR ASSOCIATED ALTERNATIVELY CALCUIATED VAlEE OF PROCESS LIMIT VALUES WITHIN THE VARIABIE LIMIT AT ANY CIVEN THE VERTICAL SCALE. VALIDATE INSTANT IN TIME, A80D THE COIER THAT THE COLOR CODIIIG IS CODE OF THE LIMIT TAC IS CONSIS-CONSISTENT WITH THE CURRE- TENT WITN THE VALUE OF THE SPONDING LIMIT TAC. PARAMETER.

51 NORITONTAL SCALE FOR A. 4.2.1.2.2A

' All CONTAll#EIIT CONTROL STATIC USINC CDC CONTROLS CALL EACH PARAMETERS IN TREND TEST CONTAINMENT CONTROL DISPIAY, IAST 10 MINUTES OF 1 BEND IS PRE- f O%

Pl#tS IS MOST RECENT B. VALIDATE THAT THE NORIZOIITAL SCALE FOR EACH PARAMETER APPLY EMotal SIMUIATED INPUTS SENTED FOR AIL CONTAllSIENT CONTROL O PARAMETERS.

ht2 M

FOR CONTROL PARAMETERS FOR PAST 10 MINUTES.

REPRESENTS THE MOST RECENT 10 N!NUTES. "w TEN MINUTES.

52 OPERATOR SELECTABLE A. 4.2.1.2.25 [ ]O AND DEFAULT VERTICAL STATIC Call AIL EPC CONTAINMENT CONTROL TEST THE OPERATOR SELECTABLE AND DEFAULT SCALES FOR CONTAIISIENT CONTROL PARAMETERS

3. VALIDATE THAT THE OPERATOR DISPIAYS ON IDr COIISOLES AND REVIEW SELECTABLE AND DEFAULT VERTICAL S N M ARE TNE SAME AS C[

SELECTAtlE AND DEFAULT THOSE SHOIAI IN THE SYSTEM DESIGN VERTICAL SCALES POIt All SPECIFICATIONS C95-4020.

VERTICAL PIET SCAL >3 FOR

" CONTAINMENT COIITR0l

  • CONTROL i THE CONTBot. PARAMETFRS ARE PARAMETERS. i AS SHOWN IN 095-4020 DOCUMENT. (

l 53 CORRECT PRESENTATICII A. 4.2.1.2.2D OF THE LIMIT TACS AND STATIC Call All ColffAlleIENT CONTROLS EPC THE LIMIT TAC NAMES ARE THE SAME AS C,Y TEST DISPLAYS AND VERIFY THAT THE RACE-CORRECT COIIIIRCTICII 0F B. VALIDATE THAT THE DYNAMIC TIIDSE 5800NN IN THE DESICII SPECIF1-THE LIMIT TAC TAILS 70 CROUIID INFO CORRESPOIIDS 70 THAT IN CATIOIIS. THE LIMIT TAC TA!!S AND/OR STATIC PROCESS LIMITS THE DESIGN SPECIFICAT1001. USIIIC TNE RAR CRAPMS FOR THE FOR EACH QNITROL PARAMETER COIINECT TO TNE SAR CRAPH AT THE COIITAIISENT C0001ROL SIIRilATED IIIPUTS AND OR DATABASE POSITION SPECIFIED IN THE DESIGN POR CONTAIIDENT COIF 1ROL) ARE PAAAMETERS VERIFY THAT THE ColER CODES ARE SPECIFICATION, AIID COIER CODE OF AS SPECIFIED IN C95-4020.

COIISISTENT WITH THE VALUE OF THE THE LIMIT TAC IS COIISISTENT WITH VALIDATE THAT THE LIMIT TAC VARIABLE.

  • TAILS" COIstECT TO THE SAR THE VAIEE OF THE PARAMETER.

CRAPH AT THE CORRECT Poleff WHICN CORRESPOIIDS TO THE PROCESS LIMIT A310 ARE I SMOIAI IN 095-4020. .

e ERIS SOFTWARE VALIDATION MATRIK Page A12 2 -

1 A. APPLICABLE PARAGRAPH NO. OF IDENTIFICATION OF ERIS THE "ERIS VALIDATION TEST 3 VALIDATION APPROACH 4 FEATURES (FUNCTION REOUIREMENTS" SPECIFICATION 3.1 PERPORMANCE CAPABILITY, REVISIC 1 1 COMPLI-  :

VALI- ANCE O OR PARAMETER) TO BE B. INTERPRETATION OF VALIDATION DATION ITEN VALIDATED CERTIF1-REQUIREPENTS METNGD 3.2 S199tARY OF VALIDATION APPROACH

_ 3.3 SurB4ARY OF ACCEPTANCE CRITERIA CATION 54 ACCURACY OF CONTAllelENT A. 4.2.1.2.2F CONTROL PARAMETER TREND TREND LINE COLOR STATIC USING StriUIATED INPUTS AND/OR TEST DATABASE, VARY THE VAlEE OF THE THE VAIEE PRESENTED BY TREND LINES IS WITHIN +/-!! 0F THE CORRESPOIS-M B. VALIDATE THAT THE TREND CONTAINMENT CONTROL PARAMETER OVER CODING, FIACEPENT OF ING RNOWN INPUT VAIEE. THE C0lmt LINES TRACK THE CONTAINMENT A RANCE OF -101 70 +1101. AT OF THE TREND LINE CORRESPONDS TO TREND LINE FOR UNDER/ CON 1ROL PARAMETERS. VALI- LEAST 4 SETS OF INPUT VAlEES THE VAlEE OF THE PARAMETER DIS-OVER SCALE AND DISPLAY DATE THAT THE LINE CotDR SHOUIA BE SIMULATEb IN THE DISPIAY FIAYED (ON BAR CRAPH) AT ANY TIME OF AVERACE VALUE IN CODING 15 CONSISTEh? WITH RESOIETION INTERVAL.

CASE OF MULTIPLE INSTANT. THE TREND LINE IS DIS-THE PARAMETER STATUS IN BAR FIAYED IN THE BOTTOM OF THE TREND SAMPLES DURING THE CRAPH. VALIDATE THAT TREND P14T WHEN Tite PARAMETER val)3ES ARE DISPLAY RES0!# TION LINES APPEAR AT THE TOP OR BETWLEN O AND -102 0F THE SELECTED INTERVAL BUTTOM OF THE P! JOT WHEN CON-SCALE. THE TREND LiliE IS DISPIAYED PARMETER EXCEEDS THE VERTI- IN THE TOP OF THE TREND PIAT WHEN CAL SCALE IN THE CORRESPOND- THE PARAMETDt VAISES ARE BETWEEN ING DIRECTION. VALIDATE THE 1001 AND 110% OF THE SELECTED WEIGHTED AVERACE VALI'E IS SCALE. THE VAX.UE PRESENTED BY THE DISPIAYED WHEN MORE THAN ONE TREND PLOT AT ANY CIVEN INSTANT, SAMPLE IS TAREN BY DAS IN OR THE WEICHTED AVERACE OF ALL THE THE DISPLAY REsclET10H TIME CORRESPONDING SIMUIATED INPUT INTERVAL.

VALUES WHICH OCCUR BETWEEN THE O$

TIME WHEN IAST TREND PLOT VALUE h t3 WAS FIACED ON THE SCRED6 AND THE Mh TIME ASSOCIATED WITH THE PLACEMENT M

y 0F THE MOST RECENT VALUE OF oo PARAMETER ON THE TREND DISPIAY.

55 TRACKING OF VARIABLE A. 4.2.1.2.2C STATIC USING KNOWN SIMUIATED INPUTS AND/ THE DISPIAYED VARIABLE LIMIT LIMIT VA!EES USING TEST OR DATARASE, VARY THE VALUE OF (PRESENTED BY ASSOCIATED LIMIT LlHlf LINES B. LIMIT l.INES ARE APPLICABI F. VAMI ABI.E PROCESS LIMIT (S) ASSO- LINE) IS WITHIN +/-110F THE TO CON!KUL PARAMETt*S ONLY CIATED WITH THE CONTROL ALTERNATELY CALCULATED VALUE OF AND SHALL BE VALIDATED PARAMETERS. THE VARIABLE AT ANY CIVEN INSTANT ACCURDING TO PARAGRAPil IN TIME, AND THE COIOR CODE OF THE 4.2.1.2.2.C FOR INDIVIDUAL LIMIT TAC IS CONSISTENT WITH THE TREND PLOTS AS WELL AS THE VALUE OF THE PARAMETER. '

PIATS ON THE EPG DISPIAYS.

54 PRESENTATION OF A. 4.2.1.2.3 STATIC USING CDC Call THE TREND PIET EACH CONTROL PARAMETERS IDENTIFIED CONTROL PARAMETERS TEST IENU. USING FORMAT NUMBERS IN THE SYSTDI DESIGN SPECIFICATION IN TREND FIAT FORM B. VALIDATE THAT All CONTROL IDENTIFIED IN THE MENU CAIL EACH HAS A FORMAT NUMBER ASSIGNED TO IT PARAMETERS AND THEIR COM- 0F THE FDRMATS LISTED IN THE MENU, IN THE TREND PIDT MENU. THE BACE-PONDtTS CAN BE DISPLAYED AND VFAIFY STATEMENTS CONTAINED CROUND INFORMATION ASSOCIATED WITH ON TREND PIETS AND THAT IN COLUMN 2 0F THIS MATRIR EACH FORMAT CORRESPONDS DlkECTLY THE DISPLAYS CONFORM TO ACAINST C95-4020. USING RNOWN TO THE SAME INFORMATION PRESDfTED THE FOLIEWING SIMULATED INPUTS AN0/OR DATABASE, IN THE DESIGN SPFLIFICATION. THE a) HORIZONTAL PIET SCALEt VARY THE VALUE OF THE PROCESS DICITAL VA!EES CORRESPOND DIRECTLY VALIDATE THAT THE DATE VARIABlA AND VRIFY THE PRESEN- TO THE KNOWN INPUTS AND BAR GRAPHS

. .g 9

ERIS SOFTWARE VALIDATION MATRIK

.- Page A13 2

1 A. APPLICARLE PARAGRAPH NO. OF 3 DENT 1FICAT10N OF BIS THE "215 VALIDATION TEST 3 FEATWES (FUNCTION VALIDATION APPROACM REQUIREMENTS

  • SPECIFICATION 3.1 4 PERPORMANCE CAPAtiLITT. REVISION 1 OR PARAMETER) TO SE VALI- COMPL1-EM R. INTERPRETATION OF VALIDATION DATION ANCE VALIDATED REQUIRE 9ENTS l METHOD CRETIF1-3.2 SIsotARY OF WALIDATION APPROACH b 3.3 StnetARY OF ACCEPTANCE CRITERIA CATION mt) TNAT TNE NORIZONTAL SCAL 2 POR EACH PLOT TATION OF BAR GRAPMS AND DIGITAL VAIEES PER REQUIRFJW.NTS INCIEDED ARE ACCURATE TO WITHIN THE RES0l#-

REPRRSElrFS THE MOST TION OF THE IDT CRT.

KECENT TNIRTT MINUTES. IN COLU196 2 0F THIS MATRIX.

b) VRITICAL PICT SCAL.Es VALIDATE THAT THE VERTI-CAL USED POR EACH PIAT REPRESE8tTS THE INSTRINENT RANGE OF THE PROCESS INPUT BEING PIATTED EXCEPT FOR CON 1ROL PARAM-ETERS WKICH MAY HAVE MULTIPLE SCALES (SEE PARAGRAPH A.2.1.2.2.5 FOR VALIDATION OF CON 1ROL PARAMETER TREND FIATS).

c) RAR CRAru AND DIGITAL READOUT VALIDATE THAT THE BAR GRAPil AND DICITAL READOUT CORRECTLY REFLECT V. l THE SINGLE PROCESS INPUT VAISE AND STATUS (OUT OF OQ hn RANGE, IN RANCE) AS APPLICARLE To EACH INPUT.

$b O FOR VALIDATION OF CONTROL H co l M 00 j PARAMETERS SEE PARACRAPH

  • 4.2.1.2.2.C.

d) LIMIT TACS LIMIT TACS ARE i

APPLICARI.E ONI.Y TO CONTROL t PARAMETERS AND SHALL RE VALIDATED ACCORDING 70 I PARAGRAPH 4.2.1.2.2.D FOR THE It4DIVIDUAL PIATS AS WElJ. AS FOR THE PIETS ON .

THE EPC DISPIAYS.

e) TREhD LINES: VALIDATE THE TREND LINES FOR PROCESS INPUTS AS WElJ. AS CONTROL PARAMETERS ACCORDING TO

8) PARACRAPH 4.2.1.2.2.F.

LIMIT LINES: LIMIT LINES I ARE APPLICARLE TO CONTROL PARANETERS ONLY AND SHAIJ.

DE VALIDATED ACCORDIMC TO PARAGRAPH 4.2.1.2.2.C FOR INDIVIDUAL TREND PLOTS AS

  • WE!J. AS THE PIETS ON THE EPC DISPIAYS.

1

E !!S SOFTWARE VALIDATION MATRIX Page A14 2

1 A. APPLICAnLE PARAGRAPN No. OF IDENTIFICATION OF B IS THE *ERIS VALIDATION TEST 3 VALIDATION APPROACM g FEATURES (FUNCTION REQUIREMENTS" SPECIFICATION 3.1 PERFORMANCE CAPASILITT, REVISION 1 COMPLI-VALI- As0CE OR PARANETER) TO BE B. INTERPRETATION OF VALIDATION DATION 0 VALIDATED CERT 1P1-REQUIRfsEllTS NETHOD 3.2 SIRetART OP VALIDATION APPFOACH 3.3 S140tARY OF ACCEPTANCE CRITERIA CATION SRV OPEN STATUS EVENT A. 4.2.1.2.4.1 STATIC SEE STATIC TEST PROCEDURE PARA.

!! DICA 10R ON EPC E1 SPLAT B. FOR A REPRESENTATIVE NIStBER TEST NO. 10.4.3.2.1.

SEE STATIC TEST PROCEDURE PARA.

No. 10.4.3.2.1. M.

OF INPUT COMBINATIONS OF SRV OPEN C0000 ANDS AND POSITION INDICATI005. VALIDATE THAT THE SRV OPEN STATUS INDICA-T1086, VALIDATE THAT THE SRV OPEN STATUS INDICATIONS ON DISP!AYS CORRECTLT REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED CotDR CODING AND IADELS.

l NSIV SHUT STATUS EVENT A. 4.2.1.2.4.2 STATIC SEE STATIC TEST PROCEDURE PARA. SEE STATIC TEST PROCEDURE PARA.

131CA1DR ON EPG TEST No. 10.4.3.2.2.

DISPLAY NO. 10.4.3.2.2.

5. FOR A REPRESENTATIVE NtStBER OF INPUT COMBINATIONS OF NSIV ISOLATION C0094 ANDS, Z n

TIME SINCE RECEIPT OF CON-NAND, AND VALVE POSITIONS, VALIDATE THAT THE NSIV SHUT

{b in ?

03 t.a STATUS INDICATION CORRFITLY H

REFLECTS THE APPROPRIATE STATES AND THEIR ASSOCIATED H 0@0 COIDE CODING AND IABELS.

CROUP IS01ATED STATUS A. 4.2.1.2.4.3 STATIC SEE STATIC TEST PROCEDURE PARA. SEE STATIC TEST PROCEDURE PARA 3 EVENT 1%DICATUR ON TLST No. 10.4.3.2.3. NO. 10.4.3.2.3 EPG DISPLAY 3. FOR A REPRESENTATIVE NUMBER OP INPUT COMBINATIONS OF ISOLATION Comt4NDS. TIME SINCE RECEIPT OF ComtAND, VALVE POSITIONS, FIDW PATHS, AND IS01Afl0N 200PS, VAL 1- '

DATE THAT THE CROUP !solATED STATUS INDICATIONS CORRECTLY i l

REFLECT THE APPROPRIATE SYSTDI STATES AND THEIR ASSOCIATED COIDE CODING AND ,

LABELS.

4 e .,

  • ER15 SOFTWARE VALIDATION MA1RIE Page A15 ,

2 1 A. APPLICABLE PARAGRAPH NO. OF IDENTIFICATION OF ERIS THE "ERIS VALIDATIUN TEST 3 VALIDATION APEROACM 4 F::ATURES (FUNCTION REQUIREMENTS

  • SPECIFICATION 3.1 PERFURMANCE CAPAsiLITY, REVISION 1 COMPL1-VAL 1-3 OR PARAMETER) TO DE 3. INTERPRETATION OF VALIDATION ANCE DATION TEM VALIDATED REQUIREfENTS CERTIFI-METHOD 3.2 SteetARY OF VALIDATION APPROACH 3.3 SImetARY OF ACCEPTANCE CRITERIA CATION 60 SCRAM STATUS EVENT A. 4.2.1.2.4.4 STATIC SEE STATIC TEST PROCEDURE PARA. SEE STATIC TEST PROCEDURE PARA. -

IZICATOR ON EPC TEST NO. 10.4.3.2.4 DISPLAY No. 10.4.3.2.4.

5. FOR A REPRESENTATIVE NIStRER OF INPUT COMBINATIONS OF SCRAM C0petANDS. TIME SINCE RECEIPT OF C000END AND ROD POSITION INDICATION, VALI-DATE THAT THE SCRAM STATUS INDICATIONS CARECTLY REFLFIT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COIDR CODING AND IABEIS .

61 D.G. OPERATION STATUS EVENT INDICATOR ON EPG DISPLAY
A. 4.2.1.2.4.5
B. FOR A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF DG INITIATION COMMAND, TIME SINCE RECEIPT OF COMMAND, AND DG OPERATIONAL STATUS INFORMATION, VALIDATE THAT THE DIESEL GENERATOR OPERATION STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATE AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.2.5.

62 SPMS SYSTEM STATUS EVENT INDICATOR ON EPG DISPLAY
A. 4.2.1.2.4.6
B. FOR A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF SPMS INITIATION COMMAND, TIME SINCE RECEIPT OF COMMAND, AND VALVE POSITIONS, VALIDATE THAT THE SPMS STATUS INDICATION CORRECTLY REFLECTS THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.2.6.

ERIS SOFTWARE VALIDATION MATRIX Page A16

63 MSL RADIATION/RADIATION INDICATION EVENT INDICATOR ON EPG DISPLAY
A. 4.2.1.2.4.8
B. FOR A REPRESENTATIVE NUMBER OF MSL RADIATION AND GENERAL RADIATION MONITOR INPUT COMBINATIONS, VALIDATE THAT THE STATUS OF THE RADIATION INDICATION IS CORRECT. VERIFY THAT THE STATUS WILL INDICATE ALARM WHEN BOTH ALARM AND BAD DATA STATES EXIST. WHEN BOTH CAUTION AND BAD DATA STATES EXIST, VERIFY THAT THE BAD DATA STATUS IS INDICATED. VERIFY THAT THE DEFAULT STATUS IS INACTIVE.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.6.1.2.4.

64 ADS OPEN STATUS EVENT INDICATOR ON EPG DISPLAY
A. 4.2.1.2.4.13
B. FOR A REPRESENTATIVE NUMBER OF ADS INITIATION COMMANDS AND SRV POSITIONS, VERIFY THAT WHEN A COMMAND IS INITIATED, A SAFE STATUS IS INDICATED FOR EACH OPEN VALVE AND AN ALARM STATUS IS INDICATED FOR EACH CLOSED VALVE. FOR ANY VALVE WHOSE POSITION IS NOT KNOWN OR BAD DATA, VERIFY THAT BAD DATA IS INDICATED. WHEN NO INITIATION COMMAND HAS BEEN RECEIVED, VERIFY THAT THE INACTIVE STATUS IS REFLECTED.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.6.1.2.3.

65 SRV OPEN QUANTITY EVENT INDICATOR ON EPG DISPLAY
A. 4.2.1.2.4.14
B. FOR A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF SRV OPEN AND STUCK OPEN POSITIONS, VALIDATE THAT THE TOTAL QUANTITY OF OPEN VALVES IS INDICATED.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO.'S 10.6.1.2.1 AND 10.6.1.2.2.

ERIS SOFTWARE VALIDATION MATRIX Page A17

65 (cont) VALIDATE THAT THE ALARM STATUS IS INDICATED IF AT LEAST ONE VALVE IS STUCK OPEN. VERIFY THAT BAD DATA IS INDICATED ONLY IN THE EVENT THAT NO SRVS ARE OPEN OR STUCK OPEN AND THAT AT LEAST ONE OF THE INPUT VALVE POSITIONS IS BAD DATA.
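The SRV open quantity rules stated here (show the total open count; alarm if at least one valve is stuck open; bad data only when no SRVs are open or stuck open and at least one input position is bad) can be captured as a small decision function. The sketch below is a plain restatement of those acceptance criteria; the value encodings are assumptions for illustration.

```python
def srv_open_quantity_status(valves):
    """valves: per-valve readings, each 'OPEN', 'STUCK_OPEN', 'SHUT', or 'BAD'.
    Returns (open_count, status) per the stated acceptance criteria."""
    open_count = sum(v in ("OPEN", "STUCK_OPEN") for v in valves)
    if any(v == "STUCK_OPEN" for v in valves):
        status = "ALARM"                 # at least one valve stuck open
    elif open_count == 0 and any(v == "BAD" for v in valves):
        status = "BAD_DATA"              # nothing open and some input is bad
    else:
        status = "NORMAL"
    return open_count, status
```

Note the precedence: an open valve suppresses the bad-data indication, matching the criterion that bad data is shown only when no SRVs are open or stuck open.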

66 WATER AVAILABLE STATUS SYSTEM STATES ON EPG DISPLAYS
A. 4.2.1.2.5.1
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF WATER SOURCE QUANTITY, WATER SOURCE LEVEL, AND MINIMUM PUMP OPERATING LEVEL, VALIDATE THAT THE WATER AVAILABLE STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.

67 COOLING AVAILABLE STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.2
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF COOLING FLOW, COOLING PUMP RUNNING STATUS, VALVE LINE UP STATUS, AND COOLING WATER AND SYSTEM INLET TEMPERATURES, VALIDATE THAT THE COOLING AVAILABLE STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.

ERIS SOFTWARE VALIDATION MATRIX Page A18

68 MAIN CONDENSER VACUUM STATUS SYSTEM STATES ON EPG DISPLAYS
A. 4.2.1.2.5.3
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF CONDENSER PRESSURE AND CONDENSER TRIP SETPOINT PRESSURE, VALIDATE THAT THE MAIN CONDENSER VACUUM STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. REFER TO STATIC TEST PROCEDURE PARA. NO.'S 10.4.3.4.5 AND 10.4.3.4.6.

69 LIQUID AVAILABLE STATUS STATES ON EPG DISPLAY
A. 4.2.1.2.5.4
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF LIQUID SOURCE QUANTITY, LIQUID SOURCE LEVEL, AND MINIMUM PUMP OPERATING LEVEL, VALIDATE THAT THE LIQUID AVAILABLE STATUS INDICATION CORRECTLY REFLECTS THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.8.

70 RPV PRESSURE STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.5
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF SYSTEM PUMP PRESSURES, RPV PRESSURES, AND PUMP QUANTITIES, VALIDATE THAT THE RPV PRESSURE STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE 10.4.3.4.

ERIS SOFTWARE VALIDATION MATRIX Page A19
(The matrix entries on this page, items 71 and 72, are illegible in the source scan.)

ERIS SOFTWARE VALIDATION MATRIX Page A20

73 HYDRAULIC POWER AVAILABLE STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.8
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF HYDRAULIC PRESSURE AND MINIMUM OPERATING HYDRAULIC PRESSURE, VALIDATE THAT THE HYDRAULIC POWER AVAILABLE STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO.'S 10.4.3.4.5 AND 10.4.3.4.6.

74 SYSTEM VALVE POWER AVAILABLE STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.9
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF FLOW PATHS AND VALVE POWER INFORMATION, VALIDATE THAT THE VALVE POWER AVAILABLE STATUS INDICATION CORRECTLY REFLECTS THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.7.

75 FEEDWATER - CONDENSATE SYSTEM PUMP RUN STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.10
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF PUMP GROUPS, FLOW PATHS AND PUMP RUNNING INFORMATION, VALIDATE THAT THE FEEDWATER/CONDENSATE SYSTEM PUMP RUN STATUS INDICATION CORRECTLY REFLECTS THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.1.

ERIS SOFTWARE VALIDATION MATRIX Page A21

76 SYSTEM PUMP/FAN RUN STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.11
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF FLOW PATHS, PUMPS, AND PUMP RUNNING INFORMATION, VALIDATE THAT THE SYSTEM PUMP/FAN RUN STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.

77 VALVE OPEN STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.12
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF VALVE POSITIONS AND MINIMUM VALVE POSITIONS, VALIDATE THAT THE VALVE OPEN STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.

78 VALVE LINE UP STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.13
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF FLOW PATHS AND VALVE POSITIONS, VALIDATE THAT THE VALVE LINE-UP STATUS INDICATIONS CORRECTLY REFLECT THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.

ERIS SOFTWARE VALIDATION MATRIX Page A22

79 CONTAINMENT PRESSURE STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.14
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF CONTAINMENT OR DRYWELL PRESSURES AND SPRAY INITIATION PRESSURE LIMITS, VALIDATE THAT THE CONTAINMENT PRESSURE STATUS INDICATION CORRECTLY REFLECTS THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.4.

80 POOL LEVEL STATUS SYSTEM STATES ON EPG DISPLAY
A. 4.2.1.2.5.15
B. FOR EACH APPLICABLE SYSTEM AND A REPRESENTATIVE NUMBER OF INPUT COMBINATIONS OF SUPPRESSION POOL LEVELS AND SPRAY NOZZLE ELEVATIONS, VALIDATE THAT THE POOL LEVEL STATUS INDICATION CORRECTLY REFLECTS THE APPROPRIATE SYSTEM STATES AND THEIR ASSOCIATED COLOR CODING AND LABELS.
STATIC TEST. REFER TO STATIC TEST PROCEDURE PARA. NO. 10.4.3.1.1.6.

81 2-D PLOTS
A. 4.2.1.2.6
STATIC TEST. SEE STATIC TEST PROCEDURE PARA. NO. 10.4.3.5.
B. VALIDATE THE CAPABILITY TO GENERATE REAL-TIME COLOR GRAPHIC 2-D PLOTS FROM PROCESS VARIABLES, COMPOSED POINTS, CALCULATED VARIABLES, AND TRANSFORMED VARIABLES. VALIDATE THAT THESE PLOTS REPRESENT THE CURRENT VALUES OF THE TWO PARAMETERS AS A DISTINCT ENTITY AND THAT THE CURVE CONTINUOUSLY TRACKS THE VALUES OF THE TWO PARAMETERS. ALSO VALIDATE

ERIS SOFTWARE VALIDATION MATRIX Page A23

81 (cont) THAT WHEN THE CURRENT VALUES FALL WITHIN THE FORBIDDEN (LIMIT) REGION, THE VALUES AND THEIR TRACE ARE CLEARLY DISTINGUISHED FROM THE FORBIDDEN REGION.

82 CRITICAL PLANT VARIABLE DISPLAYS
A. 4.2.1.2.7
B. VALIDATE THAT THE CONTROL PARAMETERS, LIMITS AND EVENT INDICATIONS ON THE CRITICAL PLANT VARIABLES DISPLAY RESPOND IDENTICALLY TO THE CORRESPONDING DIGITAL READOUT, LIMIT TAG OR EVENT INDICATION ON THE RPV CONTROL OR CONTAINMENT CONTROL DISPLAY. (SEE STATIC TEST PROCEDURE PARA. NO.'S 10.4.3.2, 10.4.3.1, AND 10.6.1.1 FOR REFERENCE ONLY.)
STATIC TEST. WHILE RUNNING APPLICABLE TESTS DEFINED IN ITEMS 86 THRU 100, CALL UP THE CRITICAL PLANT VARIABLE DISPLAY. BACKGROUND INFORMATION CONTAINED IN THE CRITICAL PLANT VARIABLE DISPLAY CORRESPONDS DIRECTLY TO THAT CONTAINED IN THE DESIGN SPECIFICATION. THE VALUE OF CONTROL PARAMETERS, LIMIT TAG INDICATION AND STATUS OF EVENTS SHOWN IN THE CRITICAL PLANT VARIABLES DISPLAY CORRESPONDS DIRECTLY TO THOSE SHOWN IN THE EPG DISPLAY DURING THE TESTS IDENTIFIED IN ITEMS 86 THRU 100.

83 DYNAMIC DATA UPDATE TIME
A. 4.2.1.1.12
B. VALIDATE THAT THE DYNAMIC DATA UPDATE TIME FOR CRT FORMATS IS LESS THAN FOUR SECONDS WITH MAXIMUM EXPECTED PROCESSOR LOADING.
DYNAMIC TEST. WITH THE RTAD SYSTEM RUNNING, ENTER SIMULATED DATA FOR A TYPICAL BWR TRANSIENT VIA FORMATTER INPUT CONSOLE PORTS. MEASURE THE TIME BETWEEN THE ENTRY OF A SET OF DATA (ASSOCIATED WITH A SINGLE POINT IN TIME) AND THE DISPLAY OF THAT DATA ON THE GRAPHICS DISPLAY CONSOLE. VERIFY THAT THE MEASURED TIME IS LESS THAN 4 SECONDS.

84 BACKGROUND DATA UPDATE TIME
A. 4.2.1.1.13
B. VALIDATE THAT CONTROL ROOM CRT BACKGROUND DATA UPDATE TIME IS LESS THAN FOUR SECONDS.
DYNAMIC TEST. WITH THE RTAD SYSTEM RUNNING AND RESPONDING TO SIMULATED DATA FOR A TYPICAL BWR TRANSIENT, MEASURE THE TIME BETWEEN THE OPERATOR REQUEST FOR A PRE-FORMATTED PAGE DISPLAY AND PRESENTATION OF THE SAME DISPLAY ON THE GRAPHIC DISPLAY CONSOLE'S CRT. VERIFY THAT THE MEASURED TIME IS LESS THAN 4 SECONDS.
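The two timing tests above share the same shape: trigger an update (data entry or a page request), then measure the delay until the display reflects it, against the four-second criterion. A sketch of such a measurement loop; the trigger and display-readback callables are stand-ins for the real data-injection and console interfaces, which the report does not detail.

```python
import time

UPDATE_TIME_LIMIT_S = 4.0  # acceptance criterion: less than 4 seconds

def measure_update_time(trigger, display_updated, poll_s=0.01, timeout_s=10.0):
    """Time from triggering an update to the display reflecting it.

    trigger: callable that injects data or requests the page (assumed stub).
    display_updated: callable returning True once the display shows the
    new data (assumed stub). Returns the elapsed time in seconds."""
    start = time.monotonic()
    trigger()
    while not display_updated():
        if time.monotonic() - start > timeout_s:
            raise TimeoutError("display never updated")
        time.sleep(poll_s)
    return time.monotonic() - start
```

In a validation run, the returned value would then be compared against `UPDATE_TIME_LIMIT_S`.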

ERIS SOFTWARE VALIDATION MATRIX Page A24

85 TREND PLOT DISPLAY AND VALIDATION STATUS OF CONTROL PARAMETERS IN RPV DISPLAYS
A. 4.2.1.2.2
B. VALIDATE THAT ALL CONTROL PARAMETERS FOR RPV CONTROL DISPLAYS ARE AVAILABLE IN TREND PLOT FORM ON EPG DISPLAYS, AND THEY REFLECT THE VALIDATION STATUS SHOWN IN THE CORRESPONDING PARAMETER VALIDATION DISPLAYS.
DYNAMIC TEST. REFER TO SCENARIO Z OF THE DYNAMIC TEST PROCEDURE. REFER TO APPLICABLE EXPECTED TEST RESULTS CONTAINED IN THE DYNAMIC TEST PROCEDURE.

86 PRESENTATION OF VALUE AND THE VALIDATION STATUS OF RPV CONTROL PARAMETERS IN RPV CONTROL DISPLAYS
A. 4.2.1.2.2C
B. VALIDATE THAT THE VALUE AND STATUS (NOT MEASURED, NOT VALIDATED, VALIDATED) OF THE RPV CONTROL PARAMETERS IN THE BAR GRAPHS AND DIGITAL READOUTS IS CORRECT.
DYNAMIC TEST. REFER TO SCENARIO Z OF THE DYNAMIC TEST PROCEDURE.

87 TREND PLOT DISPLAY AND VALIDATION STATUS OF CONTROL PARAMETERS IN CONTAINMENT CONTROL DISPLAYS
A. 4.2.1.2.2
B. VALIDATE THAT ALL CONTROL PARAMETERS FOR CONTAINMENT CONTROL DISPLAYS ARE AVAILABLE IN TREND PLOT FORM ON EPG DISPLAYS, AND THEY REFLECT THE VALIDATION STATUS SHOWN IN THE CORRESPONDING PARAMETER VALIDATION DISPLAYS.
DYNAMIC TEST. REFER TO SCENARIO Z OF THE DYNAMIC TEST PROCEDURE.

88 PRESENTATION OF VALIDATION STATUS OF CONTAINMENT CONTROL PARAMETERS IN CONTAINMENT CONTROL DISPLAYS
A. 4.2.1.2.2C
B. VALIDATE THAT THE VALUE AND STATUS (NOT MEASURED, NOT VALIDATED, VALIDATED) OF THE CONTAINMENT CONTROL PARAMETERS IN THE BAR GRAPHS AND DIGITAL READOUTS ARE CORRECTLY PRESENTED.
DYNAMIC TEST. REFER TO SCENARIO Z OF THE DYNAMIC TEST PROCEDURE.

ERIS SOFTWARE VALIDATION MATRIX Page A25

89 SYSTEM PERFORMANCE DURING POWER TRANSIENT AND PROCESSOR LOADING
A. NONE
B. DEMONSTRATE ACCEPTABILITY OF SUSTAINED SYSTEM PERFORMANCE WHEN SUBJECTED TO THE DYNAMICS OF THE PLANT AS REPRESENTED BY A SET OF TYPICAL BWR PLANT TRANSIENTS. VERIFY THAT MULTIPLE FUNCTIONS CAN BE SIMULTANEOUSLY PERFORMED. ALSO VERIFY THAT THE PROCESSOR LOADING IS IN ACCORDANCE WITH THE REQUIREMENTS, AND THE SYSTEM IS FREE FROM SYSTEM TIMING PROBLEMS, MULTIPLE FUNCTION INTERACTION PROBLEMS AND RACE CONDITIONS.
DYNAMIC TEST. ENTER SIMULATED AND/OR MEASURED DATA PER THE CRITERIA CONTAINED IN THE DYNAMIC TEST PROCEDURE AND CAPTURE THE DATA REQUIRED BY THE PROCEDURE. DESIGN REVIEW AND ANALYSIS INDICATES THAT ALL REQUESTED FUNCTIONS ARE PERFORMED, AND THE RESPONSE/EXECUTION TIME AND ACCURACY OF THE OUTPUTS ARE ACCEPTABLE. THE PROCESSOR LOADING IS AT OR BELOW WHAT IS CONTAINED IN THE REQUIREMENTS.

90 RTAD SYSTEM CAPABILITY TO ACCOMMODATE 100 PREFORMATTED PAGES
A. 4.2.1.1.11
B. VALIDATE THAT THE RTAD SYSTEM CAN DISPLAY A MINIMUM OF 100 PREFORMATTED PAGE FORMATS.
INSPECTION/ANALYSIS. ANALYZE THE SYSTEM SOFTWARE TO VERIFY THAT 100 PREFORMATTED PAGES CAN BE ACCOMMODATED BY THE RTAD SYSTEM. ANALYSIS PROVES THAT STATEMENT IN COLUMN 1 OF THIS MATRIX IS CORRECT.

91 CAPABILITY OF A DISPLAY FORMAT TO ACCOMMODATE 75 DYNAMIC VARIABLES
A. 4.2.1.1.11
B. VALIDATE THAT A FORMAT PAGE CAN ACCOMMODATE A MINIMUM OF 75 DYNAMIC VARIABLES INCLUDING DIGITAL VALUES, TREND PLOTS, 2-D PLOTS, AND BAR GRAPHS.
INSPECTION/ANALYSIS. REVIEW RELATIVELY BUSY RTAD DISPLAYS AND BY ANALYSIS VERIFY THAT 75 DYNAMIC VARIABLES WILL NOT CLUTTER A PREFORMATTED PAGE DISPLAY. ANALYSIS PROVES THAT STATEMENT IN COLUMN 1 OF THIS MATRIX IS CORRECT.

92 DEMONSTRATION OF DYNAMIC DISPLAY TYPES
A. 4.2.1.1.11
B. VALIDATE THAT THE RTAD SYSTEM CAN DISPLAY DYNAMIC DATA TYPES AS FOLLOWS: BAR GRAPHS, TREND PLOTS, DIGITAL VALUES, 2-D PLOTS, COLOR CHANGES, SHAPE CHANGES, ALPHANUMERIC CHARACTERS AND COLOR BLINKING.
INSPECTION/ANALYSIS. REVIEW RTAD DISPLAYS AND VERIFY THAT THE DISPLAY TYPES INDICATED ABOVE ARE INCLUDED IN THE PREFORMATTED PAGE DISPLAYS. ANALYSIS PROVES THAT STATEMENT IN COLUMN 1 OF THIS MATRIX IS CORRECT.

ERIS SOFTWARE VALIDATION MATRIX Page A26

93 VERIFY THAT SYSTEM CAN DEFINE APPROPRIATE ALPHANUMERIC/GRAPHIC SYMBOLS
A. 4.2.1.1.11
B. VALIDATE THAT THE SYSTEM CAN DEFINE ALPHANUMERIC AND GRAPHIC SYMBOLS FOR THE FOLLOWING FUNCTIONS: LABELS/IDENTIFIERS, ABBREVIATED DESCRIPTIONS, UNITS OF MEASURE, POINTS OF REFERENCE, DEVICE ENCODING, AREA DELINEATORS, CONNECTORS AND LINE DIAGRAMS.
INSPECTION/ANALYSIS. ANALYZE THE PREFORMATTED RTAD DISPLAYS AND VERIFY THAT ALPHANUMERIC/GRAPHIC SYMBOLS HAVE BEEN DEFINED FOR THE ABOVE FUNCTIONS. ANALYSIS PROVES THAT STATEMENTS IN COLUMN 1 OF THIS MATRIX ARE CORRECT.

94 GENERATION OF THE COPY OF THE DISPLAY
A. 4.2.1.1.14
B. VALIDATE THAT BLACK AND WHITE HARD COPY OF ANY ERIS DISPLAY WHICH IS CURRENTLY DISPLAYED ON AN SIC GDC CAN BE GENERATED.
INSPECTION/ANALYSIS. PRESS HARDCOPY FUNCTION ON IDT TO SEE IF REQUESTED COPY OF SCREEN IS GENERATED. THE GENERATED COPY IS THE BLACK AND WHITE COPY OF THE DISPLAY ON THE IDT CONSOLE.

95 VIDEO FORMAT DATA AND TIME, VALIDATION COLOR GUN STATUS
A. 4.2.1.1.15
B. VALIDATE THAT THE COLOR GUN STATUS (RED, GREEN AND BLUE) AND THE DATE AND TIME APPEAR IN THE LOWER RIGHT HAND CORNER OF EACH DISPLAY AND THAT THE TIME IS EXPRESSED TO THE NEAREST SECOND.
INSPECTION/ANALYSIS. OBSERVE IF THE LOWER RIGHT HAND CORNER OF EACH DISPLAY SHOWS THE COLOR GUN STATUS (R/G/B). ALSO OBSERVE IF DATE AND TIME APPEAR IN LOWER RIGHT HAND CORNER AND TIME EXPRESSED TO NEAREST SECOND. THE REVIEW INDICATES THAT THE STATEMENT INCLUDED IN COLUMN 2B OF THIS MATRIX IS TRUE.

96 VERIFY THAT THE NUMBER OF IDT TERMINALS (CRTS) WHICH CAN BE DRIVEN BY THE RTAD SYSTEM WITHOUT SYSTEM PERFORMANCE DEGRADATION (BEYOND THE LIMIT SPECIFIED IN THE SYSTEM DESIGN SPECIFICATION) IS GREATER THAN OR EQUAL TO THOSE CONTAINED IN THE REQUIREMENTS
A. NONE
B. REFER TO COLUMN 1 OF THIS MATRIX.
INSPECTION/ANALYSIS. USING THE CPU NULL TIME DATA COLLECTED DURING THE DYNAMIC TEST, ANALYZE THE SYSTEM SOFTWARE AND PROJECT THE NUMBER OF IDT TERMINALS WHICH CAN BE DRIVEN WITHOUT VIOLATING CONSTRAINTS CONTAINED IN COLUMN 1 OF THIS MATRIX. THE ANALYSIS INDICATES THAT THE NUMBER OF IDT TERMINALS (CRT'S) IS GREATER THAN OR EQUAL TO THOSE CONTAINED IN THE SYSTEM REQUIREMENTS.

ERIS SOFTWARE VALIDATION MATRIX Page A27

97 VERIFY THAT THE CALCULATED ERIS SYSTEM AVAILABILITY IS IN ACCORDANCE WITH THE REQUIREMENTS
A. NONE
B. REFER TO COLUMN 1 OF THIS MATRIX.
INSPECTION/ANALYSIS. CALCULATE THE ERIS SYSTEM RELIABILITY BASED ON TYPICAL USE AND SPARE PARTS. THE SYSTEM AVAILABILITY IS GREATER THAN OR EQUAL TO THAT CONTAINED IN THE REQUIREMENTS.

98 VALIDATE THAT THE TRA SYSTEM IS CAPABLE OF STORING THE AMOUNT OF TRANSIENT DELTA DATA CONTAINED IN THE REQUIREMENTS
A. NONE
B. REFER TO COLUMN 1 OF THIS MATRIX.
INSPECTION/ANALYSIS. PERFORM THE ANALYSIS FOR A TYPICAL ERIS SYSTEM AND VERIFY THE NUMBER OF MEGABYTES OF STORAGE CAPABILITY AVAILABLE FOR RECORDING OF THE DELTA DATA IS GREATER THAN OR EQUAL TO THAT CONTAINED IN THE REQUIREMENTS. REFER TO COLUMN 3.2 OF THIS MATRIX.
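The availability calculation in item 97 is not detailed in the report; the conventional model is the steady-state formula A = MTBF / (MTBF + MTTR), with a product combination for units that must all operate. A sketch under those textbook assumptions:

```python
def steady_state_availability(mtbf_hours, mttr_hours):
    """Steady-state availability of one unit: MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(unit_availabilities):
    """Availability of a chain of units that must all operate:
    the product of the individual availabilities."""
    total = 1.0
    for a in unit_availabilities:
        total *= a
    return total
```

For example, a unit with a 1000-hour MTBF and a 10-hour MTTR has availability 1000/1010, and two such units in series multiply their availabilities together.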


NEDC-30885 CLASS II

APPENDIX B

MAJOR V&V ACTIVITIES OF THE ERIS PROJECT


This Appendix describes the major V&V activities of the ERIS Project. Throughout the project, validation and verification activities were performed. The major V&V activities are:

1. Regulatory requirement compliance verification
2. Safety evaluation of the basic RTAD subsystem
3. System document verification
4. Computer system hardware design document verification
5. Verification of software requirements and design specifications and software source listings
6. Generic software validation
7. System field verification

B1. REGULATORY REQUIREMENT COMPLIANCE VERIFICATION

The system regulatory requirements are identified by the document identified as Reference 2. This document in turn invokes detailed requirements by referring to other regulatory and industrial standards documents. GESSAR II (Reference 3) verifies and demonstrates compliance to these requirements.

B2. SAFETY EVALUATION OF THE BASIC RTAD SUBSYSTEM

This evaluation is detailed in the document identified in Reference 4. This evaluation documents acceptance of the following by the US NRC:

1. Software engineering standards, conventions and practices
2. Software verification and validation plan
3. Correct application of human factors engineering program in the design
4. Use of emergency procedure guidelines as a basis for parameter selection and correct selection of the parameters
5. Adequacy of the parameter validation algorithms
6. Adequacy of electrical and electronic isolation
7. System reliability

B3. SYSTEM DOCUMENT VERIFICATION

System documents establish design requirements for the system. Major system documents are:

1. System design specification
2. System application specification
3. System hardware configuration (instrument electrical diagram)
4. System application data specification
5. Plant interface document
6. Validation test requirement specification
7. Pre-operational test requirement specification
8. Startup test requirement specification

These documents were independently verified, by design review and/or engineering review memorandum, for the following:

1. Compliance to regulatory requirements and industrial standards
2. Compliance to the contract
3. Design accuracy and completeness
4. Design adequacy
5. Safety and reliability (if applicable)
6. Interface compatibility
7. System / plant application
B4. COMPUTER SYSTEM HARDWARE DESIGN DOCUMENT VERIFICATION

Major documents in this category include the following:
1. System elementary diagram and parts list
2. Hardware purchase specifications
3. Input/output list instruction

These documents, in conjunction with the hardware vendors' (operation, maintenance, installation and test) documents, the pre-operational test instructions and the BWR systems elementary diagram, are used to procure the system hardware, install it at the site, test the hardware, interface the system to the plant and check the correctness of the installation.

The above documents were verified by design review and/or engineering review memorandum.

B5. VERIFICATION OF SOFTWARE REQUIREMENTS AND DESIGN SPECIFICATIONS AND SOFTWARE SOURCE LISTINGS

The Software Requirement Specifications (SRS) were verified for compliance to the system design documents. The Software Design Specifications (SDS) were verified for compliance to the SRS. The software source listings were verified for compliance to the SDS. The software source listings were also verified/tested to ensure that the executable code is in accordance with the source listings. The method of verification was design review and/or engineering review memorandum.

B6. GENERIC SOFTWARE VALIDATION

During the software development phase, individual functions were tested for verification of compliance to the software requirement specifications. This test is identified as the integration (static) test. The test was based on a documented test plan and procedures. The integration test consists of a series of tests. Each test verifies one function at a time for operability and accuracy. The test specimen included multiple functions, but only one function was executed at a given time. The static-tested generic ERIS software was integrated with the lead plant data base and the resulting software was dynamically tested. Realistic plant transients were simulated, and a separate computer (acting as a power plant) provided the plant inputs to the system being tested. During the test the results were logged and recorded. The results were subsequently reviewed for acceptance. The tested system has been archived and the test log/results are contained in the design record file.
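The one-function-at-a-time procedure described above can be sketched as a loop that exercises each function in isolation, compares its result against an expected value, and logs everything for later review. The function table and pass criterion here are illustrative assumptions, not the actual ERIS test procedures.

```python
def run_static_tests(functions, log):
    """functions: iterable of (name, callable, expected) triples.

    Executes exactly one function at a time, appends each result to the
    log (for the design record file), and returns a name -> pass/fail map."""
    results = {}
    for name, func, expected in functions:
        actual = func()                      # only this function runs now
        passed = (actual == expected)
        log.append((name, actual, expected, passed))
        results[name] = passed
    return results
```

A dynamic test then follows the same pattern but drives the integrated software with simulated transient inputs instead of isolated calls.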

B7. FIELD VERIFICATION AND TESTS

The following verifications and tests are recommended to be performed in the field:

1. Computer hardware operability verification: Tests specified by the hardware vendor should be performed to check the accuracy of system installation and intrasystem connections.
2. Pre-operational test: This test should be used to verify the connection of the system to other plant systems.


3. Integration test: This test should be used to verify that the delivered software can perform the required functions.
4. System/startup test: This test should be used to verify plant-unique system data bases and total system operability in the plant environment.


APPENDIX C

ENGINEERING OPERATING AND PRODUCT QUALITY PRACTICES AND PROCEDURES


ENGINEERING OPERATING AND PRODUCT QUALITY PRACTICES AND PROCEDURES MATRIX

V&V Practices and Procedures / Description / Reference(s)

1. Project Management and Responsibility Matrices: Organization and responsibilities for tasks associated with applicable life cycles of the software.
  • Engineering Operating Procedures (NEDE-21109)
  • Software Management Plan (G-81W-SMP-8430.1-0001)
  • Configuration Management Plan (G-81W-CMP-8429.5-0001)
2. Standards, Practices, and Conventions: Documentation standards, logic structure standards, code standards, and commentary standards which must be followed during the design process.
  • Document Preparation Guide (NEDS-24760)
  • Software Engineering Manual (NEDE-30682)

3. Independent Design Verification and Validation: Independent review and audit practices for all designs, documentation, and records. Reviewed material types include requirement specifications, design specifications, purchase specifications, data bases, test procedures, test results, etc. Practices include design reviews, engineering review memorandums, independent design verifications, independent validation test, etc.
  • Engineering Operating Procedures (NEDE-21109)
  • Software Engineering Manual (NEDE-30682)
  • Software Management Plan (G-81W-SMP-8430.1-0001)
  • Configuration Management Plan (G-81W-CMP-8429.5-0001)
  • Verification and Validation Plan (NEDC-30675)

' ENGINEERING OPERATING AND PRODUCT QUALITY PRACTICES AND PROCEDURES NATRIX (Continued) i V&V Practices and Procedures Description Reference (s)

4. Software Configuration Software item identification, control, Management
  • Configuration Management Plan

! change implementation, and status (G-81W-8429.5-0001)

] re por ting.

4

5. Engineering Change Requirements for engineering document-Control
  • Engineering Operating Procedures tation and configuration change (NEDE-21109) control to ensure traceability.
6. Problem Reporting and Requirements and procedures for Corrective Actions
  • Engineering Operating Procedures identifying, tracking, resolving, (NEDE-21109) n n:

and documenting problems with n Sof tware Problem Reports. C@

"

  • Software Management Plan $i Corrective Action Reporta, Field (G-81W-SNP-8430.1-0001) s8 2

i Deviation Disposition Requests, and "E" Field Disposition Instructions, etc.

  • Configuration Management Plan (G-81W-CMP-8429.5-0001)

j 7. Engineering Records Traceability and long term retention

  • Engineering Operating Procedures

! Retention of key design documentation and

! (NEDE-21109) software media.

) Configuration Management Plan i

(G-81W-CMP-8429.5-0001) i 8. Software Tools, Tools, techniques, and methodologies

  • Software Management Plan l Techniques, and employed on the specific project.

, Methodologies (G-81W-SNP-8430.1-0001) j

APPENDIX D

SUMMARY OF TYPICAL SOFTWARE PROBLEMS* DISCOVERED DURING THE ERIS SOFTWARE VALIDATION PROGRAM

*The method of software problem tracking and resolution is described in References 5, 6, 7 and 9.

D1. DATE: 11-MAR-1985 20:30:00:00    SPR NUMBER: VAL 8

PROBLEM STATEMENT:

It is necessary to change the working set size of A_ACT_PT_DP to 1000 to improve performance and minimize page faulting.

DATE: 11-MAR-1985 21:30:00:00

RESOLUTION:

The DAS control executable and startup file with the needed change are now available from the library.


D2. DATE: 12-MAR-1985    SPR NUMBER: VAL 12

PROBLEM STATEMENT:

Two separate composed points are required for the APRM DOWNSCALE limit tag (Toshiba tristate) and tail (analog). LRAPRMDN composes only the tristate tag point for the tag. No code or instructions exist for the analog tail point.

Operator A31 is correct for the tristate tag. A new module is needed to translate tag status to tail per the attached.

Required relationship between tristate tag and analog tail:

    Tristate state, status        Analog value, status
    TRI_INACTIVE, PSGOOD          APRM_DNSCL_LIM, PSGOOD
    TRI_CAUTION,  PSPALRM         APRM_DNSCL_LIM, PSPALRM
    TRI_SAFE,     PSGOOD          APRM_DNSCL_LIM, PSGOOD
    TRI_INACTIVE, PSALARM         APRM_DNSCL_LIM, PSALARM

APRM_DNSCL_LIM is an input constant value.

DATE: BLANK

RESOLUTION:

Corrections made as identified below:

APRM Downscale Limit Status Input List (C95-4020 Paragraph 20.3.2.1.a, Data Table 3.2.2.2.1.c)

1. Point Name: APRM Downscale Limit Status
   Mnemonic  : APRM_DNSCL_LIM_STAT (tag)
   Type      : Composed Analog
   Rate      : 1
   Offset    : 0

   Point Description            Hist  Constant  Oprtr
   SCRAM_STAT
   APRM_DECAY_TIME
   APRM_DNSCL_LIM
   VLDT_PAR (of RX_PWR)                         A31

2. Point Name: APRM Downscale Limit State
   Mnemonic  : APRM_DNSCL_LIM_STAT (tail)
   Type      : Composed Analog
   Rate      : 1
   Offset    : 0

   Point Description            Hist  Constant  Oprtr
   APRM_DNSCL_LIM_STAT (tag)
   APRM_DNSCL_LIM                               A95
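The required relationship between the tristate tag and the analog tail is a simple pass-through: the tail always carries the constant limit value, and its status mirrors the tag's status. As a rough illustration (the function and variable names below are ours, not taken from the ERIS code), the new translation module amounts to:

```python
# Hypothetical sketch of the tag-to-tail translation required by SPR VAL 12.
# The analog tail point always takes the input constant (APRM_DNSCL_LIM) as
# its value and simply inherits the point status of the tristate tag.

def compose_analog_tail(tag_status: str, limit_constant: float):
    """Return the (value, status) pair for the analog tail point.

    Per the required relationship table, the value is the constant limit
    regardless of tristate state, and the status mirrors the tag's status.
    """
    return (limit_constant, tag_status)

# A tag in alarm yields the limit value with alarm status.
value, status = compose_analog_tail("PSALARM", 5.0)
```

This matches every row of the relationship table: the analog value column is constant, and only the status column varies with the tag.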


D3. DATE: 12-MAR-1985    SPR NUMBER: VAL 16

PROBLEM STATEMENT:

A composed point on display requires an analog limit value (like 995 psig) to be supplied. The code module outputs only a value of zero and has no provision for inputting the limit value.

DATE: BLANK

RESOLUTION:

Corrections made as depicted below:

100% Bypass Valve Limit Status Input List (C95-4020 Paragraph 20.3.2.1.b, Data Table 3.2.2.2.1.b)

1. Point Name: 100% Bypass Valve Limit Status
   Mnemonic  : 100%_BPV_LIM_STAT
   Type      : Composed Analog
   Rate      : 1
   Offset    : 0

   Point Description            Hist  Constant  Oprtr
   VLDT_PAR (of RPV_PRESS)
   100%_BPV_TIME
   SRV_OPEN_STAT                                A30
   PROC_LIM (100% BPV LIM)                      A95
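The defect and its correction can be shown schematically (all names below are hypothetical, not from the ERIS code): the composed-point module must accept the limit as an input constant, as the PROC_LIM entry does, rather than emitting a hard-coded zero.

```python
# Hypothetical before/after sketch of SPR VAL 16. The original module had no
# input for the limit and always output zero; the correction supplies the
# limit as an input constant.

def compose_limit_point_broken() -> float:
    """Defective behavior: no provision for the limit value, output is zero."""
    return 0.0

def compose_limit_point_fixed(proc_lim: float) -> float:
    """Corrected behavior: the configured limit (e.g. 995.0 psig) is an input
    constant and is passed through as the composed point's value."""
    return proc_lim
```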
