ML20010C774 - Nonproprietary Version of CPC/CEAC System Phase II Software Verification Test Report

Site: San Onofre (Southern California Edison)
Issue date: 08/05/1981
From: ABB COMBUSTION ENGINEERING NUCLEAR FUEL (FORMERLY ...)
Shared Package: ML13308B934
References: CEN-173(S)-NP, NUDOCS 8108200405
Download: ML20010C774 (26)


SAN ONOFRE - UNIT 2
DOCKET 50-361 AND 362
CEN-173(S)-NP REVISION 00

CPC/CEAC SYSTEM PHASE II SOFTWARE VERIFICATION TEST REPORT

AUGUST 5, 1981

Combustion Engineering, Inc.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut

LEGAL NOTICE

This response was prepared as an account of work sponsored by Combustion Engineering, Inc. Neither Combustion Engineering nor any person acting on its behalf:


a. Makes any warranty or representation, express or implied, including the warranties of fitness for a particular purpose or merchantability, with respect to the accuracy, completeness, or usefulness of the information contained in this response, or that the use of any information, apparatus, method, or process disclosed in this response may not infringe privately owned rights; or

b. Assumes any liabilities with respect to the use of, or for damages resulting from the use of, any information, apparatus, method or process disclosed in this response.


ABSTRACT

Phase II Testing is performed on the CPC/CEAC System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

This report presents the Phase II test results for the Southern California Edison SONGS-2 plant CPC/CEAC Rev. 00 software.

The Phase II Testing was performed according to previously issued procedures (Reference 1). The test results indicate that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and hardware and that the operation of the integrated system as modified is consistent with the performance predicted by design analyses.

TABLE OF CONTENTS

Section  Title                                              Page No.

1.0      INTRODUCTION                                       4
1.1      Objectives                                         4
1.2      Description of Phase II Testing                    5
1.3      Applicability                                      5
2.0      CPC/CEAC INPUT SWEEP TESTS                         6
2.1      CPC Input Sweep Test Case Selection                6
2.1.1    CPC Processor Uncertainty Results                  6
2.1.2    Analysis of CPC Input Sweep Test Results           7
2.2      CEAC Input Sweep Test Case Selection               9
2.2.1    CEAC Processor Uncertainty Results                 9
2.2.2    Analysis of CEAC Input Sweep Test Results          9
3.0      DYNAMIC SOFTWARE VERIFICATION TEST                 10
3.1      DSVT Case Selection                                10
3.2      Generation of DSVT Acceptance Criteria             11
3.3      DSVT Results                                       17
3.4      Analysis of DSVT Results                           18
4.0      LIVE INPUT SINGLE PARAMETER TEST                   20
4.1      LISP Test Case Selection                           20
4.2      Generation of LISP Acceptance Criteria             20
4.3      LISP Test Results                                  21
5.0      PHASE II TEST RESULTS SUMMARY                      23
6.0      REFERENCES                                         24

1.0 INTRODUCTION

The verification of software modifications of the CPC/CEAC System consists of several steps which address two major areas of the modification process:

(1) Specification of software modifications
(2) Implementation of software modifications

The specification of software modifications is documented in the CPC and CEAC Functional Descriptions and the Data Base Document and is verified by design analyses contained in recorded calculations. The implementation of software modifications is documented in Software Design Specifications and assembly listings. The verification process for the modified software implementation includes Phase I and Phase II software verification tests.

The requirements of the Phase II software verification testing are based on the fact that the Phase I testing has been previously performed. Successful completion of Phase I testing verifies the correct implementation of the modified software. Phase II testing completes the software modification process by verifying that the integrated CPC System responds as expected.

This document contains the test results and conclusions for the Phase II software verification test.

1.1 Objectives

The primary objective of Phase II testing is to verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware. In addition, Phase II testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses. These objectives are achieved by comparing the response of the integrated system to the response predicted by the CPC/CEAC FORTRAN simulation code. This comparison is performed for a selected range of simulated static and dynamic input conditions.
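In practical terms, the comparison described above reduces to pairing, case by case, the DNBR and LPD values produced by the FORTRAN simulation with those produced by the integrated single-channel system and recording the differences. The sketch below illustrates that bookkeeping only; the record layout and function names are assumptions for illustration and are not part of the test software described in this report.

```python
# Illustrative sketch of the case-by-case comparison described above.
# The data layout and names are assumptions; the actual comparison used
# the certified CPC/CEAC FORTRAN simulation code and the Single Channel
# facility results.
from dataclasses import dataclass

@dataclass
class CaseResult:
    case_id: str
    dnbr: float   # calculated DNBR (DNBR units)
    lpd: float    # calculated local power density (kw/ft)

def case_differences(simulated, measured):
    """Pair FORTRAN-simulation and single-channel results by test case ID
    and return the DNBR and LPD differences used in the later statistics."""
    measured_by_id = {r.case_id: r for r in measured}
    diffs = []
    for sim in simulated:
        meas = measured_by_id[sim.case_id]
        diffs.append({
            "case_id": sim.case_id,
            "dnbr_diff": meas.dnbr - sim.dnbr,
            "lpd_diff": meas.lpd - sim.lpd,
        })
    return diffs
```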

1.2 Description of Phase II Testing

Phase II testing consists of the following tests:

(1) Input Sweep Test,
(2) Dynamic Software Verification Test, and
(3) Live Input Single Parameter Test.

These tests are performed on a single channel CPC/CEAC System with integrated software that has undergone successful Phase I testing.

1.3 Applicability

This report applies to the Phase II testing performed on the Southern California Edison SONGS-2 plant CPC/CEAC system software. The software revisions documented in this report are designated as Revision Number 00 to the SONGS-2 CPC/CEAC system software.

2.0 CPC/CEAC INPUT SWEEP TESTS

The Input Sweep Test is a real time exercise of the CEAC and CPC application software and executive software with steady-state CPC and CEAC input values read from a storage device. This test has the following objectives:

(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs,
(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space, and
(3) To complement Phase I module testing by identifying any abnormalities in the CPC and CEAC algorithms used in the system hardware which were not previously uncovered.

2.1 CPC Input Sweep Test Case Selection

[  ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC design qualification testing of the Revision 00 software.

2.1.1 CPC Processor Uncertainty Results

For each test case, differences in the results of the FORTRAN simulation code and CPC system were calculated. A statistical analysis of these differences produced the processing uncertainties.

The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [  ] cases remained after these cases were eliminated. The LPD statistics did not include those cases for which the LPD as calculated on either system was equal to or greater than the upper limit of core average kw/ft ([  ] kw/ft). A total of [  ] cases remained after these cases were eliminated.

Although [  ] and [  ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still included as Input Sweep test cases for the purpose of identifying potential software errors.

The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from Input Sweep for DNBR and LPD respectively are [  ] DNBR units and [  ] core average kw/ft. However, since the distribution of differences is so restrictive, the maximum error may be used (that is, the limits which encompass 100% of the differences). This is more conservative and yet still results in small processor uncertainties. Thus defined, the processor uncertainties for Revision 00 DNBR and LPD are [  ] DNBR units and [  ] core average kw/ft, respectively.
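For context, the one-sided 95/95 tolerance limit defined above can be computed from the sample of differences using a normal-theory tolerance factor. The sketch below is an illustration under an assumed normal distribution of the differences and uses scipy for the noncentral t quantile; it is not the statistical procedure documented in this report, whose actual case counts and limits are not included here.

```python
# Minimal sketch of a one-sided 95%/95% tolerance limit on a sample of
# DNBR (or LPD) differences, assuming the differences are approximately
# normally distributed. Illustration only; not the report's calculation.
import math
from scipy.stats import nct, norm

def one_sided_95_95_limit(diffs):
    """Upper limit expected to bound 95% of the population of differences
    with 95% confidence (normal one-sided tolerance bound: mean + k*s)."""
    n = len(diffs)
    mean = sum(diffs) / n
    s = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    # Tolerance factor k = t'_{0.95, n-1}(delta) / sqrt(n), where t' is the
    # noncentral t quantile and delta = z_0.95 * sqrt(n).
    delta = norm.ppf(0.95) * math.sqrt(n)
    k = nct.ppf(0.95, df=n - 1, nc=delta) / math.sqrt(n)
    return mean + k * s

# The more conservative alternative mentioned above is simply the maximum
# observed difference, e.g. max(abs(d) for d in diffs).
```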

2.1.2 Analysis of CPC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors.

The review results of the DNBR and LPD test cases outside the 95/95 tolerance limit will now be discussed. For DNBR there were [  ] cases below the lower tolerance limit of [  ] (DNBR units) and [  ] test cases above the upper tolerance limit of [  ] (DNBR units). With [  ] exceptions, the cases with the largest absolute differences in DNBR occurred with DNBR's greater than [  ] at power levels less than [  ]. Only [  ] of the [  ] exceptions had DNBR's of less than [  ], with the limiting case having a DNBR of greater than [  ] at a power level of [  ]. This is not a significant commonality since the differences are absolute (not relative) and it should be expected that the largest differences should occur at high DNBR's. The common input data to all [  ] test cases was found in other test cases with less maximum difference and less percent error. It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.

For LPD the cases examined were: [  ] cases with differences below the lower 95/95 tolerance limit of [  ] (% of core average kw/ft), [  ] cases with differences greater than the upper tolerance limit of [  ], and [  ] cases with LPD values greater than [  ] of core average kw/ft and with differences outside the above stated tolerance limits. For the LPD cases with values above [  ] of core average kw/ft, the largest percent error was [  ]. The size of this percent error term indicates that the differences between the CPC Single Channel and the CPC FORTRAN simulation are due to machine differences in accuracy when calculating large numbers.

For the [  ] test cases with LPD values less than [  ] of core average kw/ft, [  ] cases had absolute percent errors greater than [  ]. The largest percent error was [  ]. The common input to these test cases was found in other test cases with less maximum difference and less percent error. Examination of the inputs to all [  ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found. It is therefore concluded that there is no indication of software errors in the Single Channel calculation of LPD.
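The screening step described above can be pictured as filtering the per-case differences against the 95/95 limits and attaching the percent error used in the follow-up review. The sketch below is a hypothetical illustration of that filter; the engineering review of the flagged cases (commonality of inputs, behavior at high DNBR, machine accuracy effects) is not something this filter reproduces.

```python
# Hypothetical sketch of the out-of-tolerance screening described above.
# Field names and limits are illustrative assumptions.

def screen_cases(cases, lower_limit, upper_limit):
    """Flag cases whose difference falls outside the 95/95 tolerance band
    and attach the percent error examined in the follow-up review.

    cases: iterable of dicts with 'case_id', 'diff' (single channel minus
           FORTRAN simulation) and 'simulated' (FORTRAN simulation value).
    """
    flagged = []
    for c in cases:
        if c["diff"] < lower_limit or c["diff"] > upper_limit:
            percent_error = 100.0 * c["diff"] / c["simulated"]
            flagged.append({**c, "percent_error": percent_error})
    return flagged
```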


2.2 CEAC Input Sweep Test Case Selection

[  ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the Revision 00 software. These test cases covered all CEAC operating space.

2.2.1 CEAC Processor Uncertainty Results

For each test case, differences between the CEAC FORTRAN simulation code and CEAC system results were calculated. The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level.

The processor uncertainties for the DNBR and the LPD penalty factor differences are [  ] and [  ], respectively.

2.2.2 Analysis of CEAC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. [  ] test cases had differences in the big penalty factor flag.

The test cases with differences in the big penalty factor flag were examined. Results indicated that these differences were due to implementation differences between the CEAC software and the CEAC FORTRAN and not due to software or FORTRAN programming errors. These implementation differences do not impact calculation of the CEAC penalty factor output words. It was concluded that the results of the [  ] test cases did not indicate the existence of software errors.

3.0 DYNAMIC SOFTWARE VERIFICATION TEST

The Dynamic Software Verification Test (DSVT) is a real time exercise of the CPC application software and executive software with transient CPC input values read from a storage device. This test has two objectives:

(1) To verify that the dynamic response of the integrated CPC software is consistent with that predicted by design analyses, and
(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.

Further information concerning DSVT may be found in Reference 1.

3.1 DSVT Case Selection

Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified. The major modifications made by the Revision 00 changes are:

(1) CEAC Logic and Data Base changes to allow proper operation with plants containing 2-CEA subgroups.
(2) Replacement of the COSMO/W-3 based DNBR calculation with CETOP2 based on the TORC/CE-1 DNBR correlation. This change totally replaces the STATIC program and modifies the DNBR UPDATE calculation.
(3) Changes to curve-fitting routines, modelling core power distribution for more precise CEA configurations and the use of additional corrections and offsets to yield improved accuracies.
(4) Algorithm simplifications in the POWER program resulting in improved computer efficiency.
(5) Additional addressable constants to facilitate changing constants likely to vary during plant life, and to allow clearance of the CEAC snapshot buffer and rewriting of the entire CRT display.

For more detail on Revision 00 software modifications, see Reference 2.

DSVT requires that as a minimum [  ] cases be selected for testing (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of a [  ]. Because the changes to the program algorithms were significant, all of the DSVT test cases were executed using the CPC/CEAC FORTRAN simulation code and the Single Channel facility with the Rev. 00 CPC software. This ensures that all dynamic portions of the CPC software are adequately exercised.

3.2 Generation of DSVT Acceptance Criteria

Acceptance criteria for DSVT are defined (in Reference 1) as a trip time and initial values of DNBR and LPD for each test case. These trip times and initial values are generated using the certified CPC/CEAC FORTRAN simulation code. Processing uncertainties obtained during Input Sweep testing are factored into the acceptance criteria for initial values of DNBR and LPD where necessary. Trip times are affected by program execution lengths as well as the Input Sweep uncertainties. The minimum, maximum and average application program execution lengths for the Revision 00 software modifications were calculated and used in DSVT. These execution lengths (in milliseconds) are listed below.

CPC Application Program Execution Lengths

Program    Minimum (msec)   Average (msec)   Maximum (msec)
FLOW       [  ]             [  ]             [  ]
UPDATE     [  ]             [  ]             [  ]
POWER      [  ]             [  ]             [  ]
STATIC     [  ]             [  ]             [  ]

Each DSVT case was initially executed once with the average program execution lengths and nominal trip setpoints using the CPC/CEAC FORTRAN simulation code. Following execution of the same cases using the Single Channel facility, cases which did not yield trip times equivalent to those calculated by the CPC FORTRAN code were re-analyzed. Each of these DSVT cases was re-executed once with the minimum execution lengths and most conservative DNBR and LPD trip setpoints and once with the maximum execution lengths and least conservative DNBR and LPD trip setpoints. This results in a bandwidth of trip times for each test case which contains the effects of processing uncertainties and variations in application program execution lengths.

The software DSVT program also includes a [  ] millisecond interrupt cycle in order to check for DNBR and LPD trip signals. This results in a [  ] millisecond interval limit on trip time resolution which is factored into the acceptance criteria. The following tables contain the final DSVT acceptance criteria for initial values and trip times for DNBR and LPD. The values listed are those calculated and truncated to the same number of digits observed on the Single Channel facility Operator's module.
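One way to picture the trip-time acceptance band described above is as the pair of bounding trip times from the fast and slow FORTRAN-simulation runs, widened by the trip-check interrupt resolution. The sketch below is illustrative only: the parameter names are assumptions, the actual execution lengths, setpoints and interrupt interval are not given in this nonproprietary version, and exactly how the resolution is folded into the limits is an assumption as well.

```python
# Illustrative sketch of assembling a DSVT trip-time acceptance band.
# All names and the treatment of the interrupt resolution are assumptions;
# the actual values are withheld from this nonproprietary report.

def trip_time_band(trip_time_fast, trip_time_slow, interrupt_interval_ms):
    """Combine bounding FORTRAN-simulation trip times with the trip-check
    interrupt resolution to form (minimum, maximum) acceptance limits.

    trip_time_fast: trip time (s) from the run with minimum execution
        lengths and most conservative DNBR/LPD trip setpoints.
    trip_time_slow: trip time (s) from the run with maximum execution
        lengths and least conservative DNBR/LPD trip setpoints.
    interrupt_interval_ms: period (ms) of the interrupt that polls for
        DNBR/LPD trip signals, limiting trip-time resolution.
    """
    resolution_s = interrupt_interval_ms / 1000.0
    return trip_time_fast - resolution_s, trip_time_slow + resolution_s
```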


Acceptance Criteria for DNBR and LPD Initial Values
(DNBR Units and kw/ft, respectively)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[Table values are not included in this nonproprietary version.]

Acceptance Criteria for DNBR and LPD Trip Times
(seconds)

Test Case   DNBR Trip (Min.)   DNBR Trip (Max.)   LPD Trip (Min.)   LPD Trip (Max.)
[Table values are not included in this nonproprietary version.]

3.3 DSVT Results

The Dynamic Software Verification Test was executed on the Single Channel Facility using the Revision 00 CPC software. The DSVT results are contained below.

Test Case   Initial DNBR (DNBR Units)   Initial LPD (kw/ft)   DNBR Trip (sec.)   LPD Trip (sec.)
[Table values are not included in this nonproprietary version.]

All recorded trip times meet the final acceptance criteria for LISP.

Major aspects of the operator's module operation, particularly the point ID verification and addressable constant range limits, were tested. As part of the testing, the CPC and CEAC Point ID tables were checked to assure that the Point IDs displayed on the operator's module are the same as those listed in the Point ID tables. All aspects of automated reentry of addressable constants were tested and were determined to have been correctly implemented.

5.0 PHASE II TEST RESULTS SUMMARY

The Phase II software verification tests have been performed as required in Reference 1. In all cases, the test results fell within the acceptance criteria, except for one case discussed in Chapter 3. This case was analyzed and results indicated that this was not due to CPC/CEAC software or CPC FORTRAN simulation code errors. Based on these results, the Phase II test objectives have been successfully achieved. It is concluded that the CPC and CEAC software modifications described as Revision 00 have been properly integrated with the CPC and CEAC software and system hardware and that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

6.0 REFERENCES

1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-P, Revision 02, December 21, 1978.

2. CPC/CEAC Software Modifications for SONGS Units 1 and 2, CEN-135(S)-P, Revision 00.
