ML20040D020

Nonproprietary Version of Revision 2 to CPC/CEAC System Phase II Software Verification Test Report
Site: San Onofre (Southern California Edison)
Issue date: 11/30/1981
From: ABB COMBUSTION ENGINEERING NUCLEAR FUEL (FORMERLY ...)
Shared Package: ML19297F285
References: CEN-173(S)-NP, CEN-173(S)-NP-R02, CEN-173(S)-NP-R2, NUDOCS 8201290487



SONGS NUCLEAR GENERATING STATION - UNIT 2
DOCKET 50-361 AND 362
CEN-173(S)-NP, REVISION 02
CPC/CEAC SYSTEM PHASE II SOFTWARE VERIFICATION TEST REPORT
NOVEMBER 1981

Combustion Engineering, Inc.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut

8201290487 820122 PDR ADOCK 05000361 A PDR

LEGAL NOTICE

This response was prepared as an account of work sponsored by Combustion Engineering, Inc. Neither Combustion Engineering nor any person acting on its behalf:

a. Makes any warranty or representation, express or implied, including the warranties of fitness for a particular purpose or merchantability, with respect to the accuracy, completeness, or usefulness of the information contained in this response, or that the use of any information, apparatus, method, or process disclosed in this response may not infringe privately owned rights; or

b. Assumes any liabilities with respect to the use of, or for damages resulting from the use of, any information, apparatus, method or process disclosed in this response.


Page 1 of 25

ABSTRACT

Phase II Testing is performed on the CPC/CEAC System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

This report presents the Phase II test results for the Southern California Edison SONGS-2 plant CPC/CEAC Rev. 01 software.

The Phase II Testing was performed according to previously issued procedures (Reference 1). The test results indicate that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and hardware and that the operation of the integrated system as modified is consistent with the performance predicted by design analyses.

This document was prepared and reviewed in accordance with Sections 5.2 and 5.4, respectively, of QADP, Revision 15.


Page 2 of 25

TABLE OF CONTENTS

Section   Title                                           Page No.

1.0       INTRODUCTION                                    4
1.1       Objectives                                      4
1.2       Description of Phase II Testing                 5
1.3       Applicability                                   5
2.0       CPC/CEAC INPUT SWEEP TESTS                      6
2.1       CPC Input Sweep Test Case Selection             6
2.1.1     CPC Processor Uncertainty Results               6
2.1.2     Analysis of CPC Input Sweep Test Results        7
2.2       CEAC Input Sweep Test Case Selection            9
2.2.1     CEAC Processor Uncertainty Results              9
2.2.2     Analysis of CEAC Input Sweep Test Results       9
3.0       DYNAMIC SOFTWARE VERIFICATION TEST              10
3.1       DSVT Case Selection                             10
3.2       Generation of DSVT Acceptance Criteria          11
3.3       DSVT Results                                    18
3.4       Analysis of DSVT Results                        19
4.0       LIVE INPUT SINGLE PARAMETER TEST                21
4.1       LISP Test Case Selection                        21
4.2       Generation of LISP Acceptance Criteria          21
4.3       LISP Test Results                               22
5.0       PHASE II TEST RESULTS SUMMARY                   24
6.0       REFERENCES                                      25

Page 3 of 25

1.0 INTRODUCTION

The verification of software modifications of the CPC/CEAC System consists of several steps which address two major areas of the modification process:

(1) Specification of software modifications
(2) Implementation of software modifications

The specification of software modifications is documented in the CPC and CEAC Functional Descriptions and the Data Base Document and is verified by design analyses contained in recorded calculations. The implementation of software modifications is documented in Software Design Specifications and assembly listings. The verification process for the modified software implementation includes Phase I and Phase II software verification tests.

The requirements of the Phase II software verification testing are based on the fact that the Phase I testing has been previously performed. Successful completion of Phase I testing verifies the correct implementation of the modified software. Phase II testing completes the software modification process by verifying that the integrated CPC System responds as expected.

This document contains the test results and conclusions for the Phase II Software Verification Test.

1.1 Objectives

The primary objective of Phase II testing is to verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware. In addition, Phase II testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

Page 4 of 25

These objectives are achieved by comparing the response of the integrated system to the response predicted by the CPC/CEAC FORTRAN simulation code. This comparison is performed for a selected range of simulated static and dynamic input conditions.

1.2 Description of Phase II Testing

Phase II testing consists of the following tests: (1) Input Sweep Test, (2) Dynamic Software Verification Test, and (3) Live Input Single Parameter Test. These tests are performed on a single channel CPC/CEAC System with integrated software that has undergone successful Phase I testing.

1.3 Applicability

This report applies to the Phase II testing performed on the Southern California Edison SONGS-2 plant CPC/CEAC system software. The software revisions documented in this report are designated as Revision Number 01 to the SONGS-2 CPC/CEAC system software.


Page 5 of 25

2.0 CPC/CEAC INPUT SWEEP TESTS

The Input Sweep Test is a real time exercise of the CEAC and CPC application software and executive software with steady-state CPC and CEAC input values read from a storage device. This test has the following objectives:

(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs,

(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space, and

(3) To complement Phase I module testing by identifying any abnormalities in the CPC and CEAC algorithms used in the system hardware which were not previously uncovered.

2.1 CPC Input Sweep Test Case Selection

[   ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC design qualification testing of the Revision 01 software.

2.1.1 CPC Processor Uncertainty Results

For each test case, differences in the results of the FORTRAN simulation code and CPC system were calculated. A statistical analysis of these differences produced the processing uncertainties.

The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits [   ]. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [   ] cases remained after these cases were eliminated.

Page 6 of 25

The LPD statistics did not include those cases for which the LPD as calculated on either system was equal to or greater than the upper limit of [   ] core average kw/ft (= [   ] kw/ft). A total of [   ] cases remained after these cases were eliminated.

Although [   ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still included as Input Sweep test cases for the purpose of identifying potential software errors.

The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from Input Sweep for DNBR and LPD respectively are [   ] DNBR units and [   ] core average kw/ft.

However, since the distribution of differences is so restrictive, the maximum error may be used (that is, the limits which encompass 100% of the differences). This is more conservative and yet still results in small processor uncertainties. Thus defined, the processor uncertainties for Revision 01 for DNBR and LPD are [   ] DNBR units and [   ] core average kw/ft, respectively.
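As a concrete illustration of the 95/95 statistic described above, the sketch below computes a one-sided tolerance limit from a sample of differences using the standard non-central t tolerance factor. The report does not give its computation, so the method details, variable names, and scipy usage here are assumptions for illustration only; the report's numerical values are proprietary and are not reproduced.

    import numpy as np
    from scipy.stats import nct, norm

    def one_sided_tolerance_limit(diffs, coverage=0.95, confidence=0.95):
        # Upper limit expected to bound `coverage` of the difference
        # distribution with probability `confidence` (a 95/95 limit by
        # default), assuming approximately normally distributed differences.
        n = len(diffs)
        mean = np.mean(diffs)
        s = np.std(diffs, ddof=1)
        # Non-central t tolerance factor k such that mean + k*s is the limit.
        delta = norm.ppf(coverage) * np.sqrt(n)
        k = nct.ppf(confidence, df=n - 1, nc=delta) / np.sqrt(n)
        return mean + k * s

    # Example use with hypothetical per-case differences
    # (CPC system result minus FORTRAN simulation result):
    # dnbr_limit = one_sided_tolerance_limit(dnbr_diffs)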

2.1.2 Analysis of CPC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. The review results of the DNBR and LPD test cases outside the 95/95 tolerance limit will now be discussed.

For DNBR there were [   ] cases below the lower tolerance limit of [   ] (DNBR units) and [   ] test cases above the upper tolerance limit of [   ] (DNBR units).

Page 7 of 25

With [   ] exceptions, the cases with the largest absolute differences in DNBR occurred with DNBR's greater than [   ] at power levels less than [   ]. Only [   ] of the [   ] exceptions had DNBR's of less than [   ], with the limiting case having a DNBR of greater than [   ] at a power level of [   ]. This is not a significant commonality since the differences are absolute (not relative) and it should be expected that the largest differences should occur at high DNBR's. The common input data to all 7 test cases was found in other test cases with less maximum difference and less percent error.

It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.

For LPD the cases examined were: [   ] cases with differences below the lower 95/95 tolerance limit of [   ] (% of core average kw/ft), [   ] cases with differences greater than the upper tolerance limit of [   ], and [   ] cases with LPD values greater than [   ] of core average kw/ft and with differences outside the above stated tolerance limits.

For the LPD cases with values above [   ] of core average kw/ft, the largest percent error was [   ]. The size of this percent error term indicates that the differences between the CPC Single Channel and the CPC FORTRAN simulation are due to machine differences in accuracy when calculating large numbers.

For the test cases with LPD values less than [   ] of core average kw/ft, [   ] cases had absolute percent errors greater than [   ]. The largest percent error was [   ]. The common input to these test cases was found in other test cases with less maximum difference and less percent error. Examination of the inputs to all [   ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found.

It is therefore concluded that there is no indication of software errors in the Single Channel calculation of LPD.

Page 8 of 25

2.2 CEAC Input Sweep Test Case Selection

[   ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the Revision 01 software. These test cases covered all CEAC operating space.

2.2.1 CEAC Processor Uncertainty Results

For each test case, differences between the CEAC FORTRAN simulation code and CEAC system results were calculated. The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level. The processor uncertainties for the DNBR and the LPD penalty factor differences are [   ], respectively.

2.2.2 Analysis of CEAC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. [   ] test cases had differences in the big penalty factor flag. The [   ] test cases with differences in the big penalty factor flag were examined. Results indicated that these differences were due to implementation differences between the CEAC software and the CEAC FORTRAN and not due to software or FORTRAN programming errors. These implementation differences do not impact calculation of the CEAC penalty factor output words. It was concluded that the results of the [   ] test cases did not indicate the existence of software errors.

Page 9 of 25

3.0 DYNAMIC SOFTWARE VERIFICATION TEST

The Dynamic Software Verification Test (DSVT) is a real time exercise of the CPC application software and executive software with transient CPC input values read from a storage device. This test has two objectives:

(1) To verify that the dynamic response of the integrated CPC software is consistent with that predicted by design analyses, and

(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.

Further information concerning DSVT may be found in Reference 1.

3.1 DSVT Case Selection

Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified. The major modifications made by the Revision 00 changes are:

(1) CEAC Logic and Data Base changes to allow proper operation with plants containing 2-CEA subgroups.

(2) Replacement of the COSMO/W-3 based DNBR calculation with CETOP2 based on the TORC/CE-1 DNBR correlation. This change totally replaces the STATIC program and modifies the DNBR UPDATE calculation.

(3) Changes to curve-fitting routines, modelling core power distribution for more precise CEA configurations and the use of additional corrections and offsets to yield improved accuracies.

Page 10 of 25

(4) Algorithm simplifications in the POWER program resulting in improved computer efficiency.

(5) Additional addressable constants to facilitate changing constants likely to vary during plant life, and to allow clearance of the CEAC snapshot buffer and rewriting of the entire CRT display.

For more detail on Revision 00 software modifications, see Reference 2.

Reference 3 describes the software modification that was implemented for Revision 01. This modification involved changing the allowable time for A/D conversion in the CPC/CEAC Executive software.

DSVT requires that as a minimum [   ] cases be selected for testing (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of a [   ].

Because the modification to the Executive software involved a timing change, all of the DSVT test cases were executed using the CPC/CEAC FORTRAN simulation code and the Single Channel facility with the Rev. 01 CPC software. This ensures that all dynamic portions of the CPC software are adequately exercised.

3.2 Generation of DSVT Acceptance Criteria

Acceptance criteria for DSVT are defined (in Reference 1) as a trip time and initial values of DNBR and LPD for each test case. These trip times and initial values are generated using the certified CPC/CEAC FORTRAN simulation code. Processing uncertainties obtained during Input Sweep testing are factored into the acceptance criteria for initial values of DNBR and LPD where necessary. Trip times are affected by program execution lengths as well as the Input Sweep uncertainties. However, the Revision 01 modification did not impact the FORTRAN simulation code.

Page 11 of 25

Therefore, the minimum, maximum and average application program execution lengths calculated for the Revision 00 software modifications were used in this DSVT. These execution lengths (in milliseconds) are listed below.

CPC Application Program Execution Lengths

Program    Minimum (msec)    Average (msec)    Maximum (msec)
FLOW       [   ]             [   ]             [   ]
UPDATE     [   ]             [   ]             [   ]
POWER      [   ]             [   ]             [   ]
STATIC     [   ]             [   ]             [   ]

Each DSVT case was initially executed once with the average program execution lengths and nominal trip setpoints using the CPC/CEAC FORTRAN simulation code. Following execution of the same cases using the Single Channel facility, cases which did not yield trip times equivalent to those calculated by the CPC FORTRAN code were re-analyzed. Each of these DSVT cases was re-executed once with the minimum execution lengths and most conservative DNBR and LPD trip setpoints, and once with the maximum execution lengths and least conservative DNBR and LPD trip setpoints. This results in a bandwidth of trip times for each test case which contains the effects of processing uncertainties and variations in application program execution lengths.

The software DSVT program also includes a [   ] millisecond interrupt cycle in order to check for DNBR and LPD trip signals. This results in a [   ] millisecond interval limit on trip time resolution which is factored into the acceptance criteria.
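A minimal sketch of how such a trip-time acceptance band might be assembled from the quantities described above. The report does not spell out exactly how the two conservative simulation runs and the interrupt-cycle resolution are combined (and the numerical values are proprietary), so the function below, its names, and the widening-by-one-interval choice are assumptions for illustration only.

    def dsvt_trip_time_band(trip_time_fast_run, trip_time_slow_run,
                            interrupt_interval_s):
        # trip_time_fast_run:   simulation trip time with minimum execution
        #                       lengths and most conservative setpoints
        # trip_time_slow_run:   simulation trip time with maximum execution
        #                       lengths and least conservative setpoints
        # interrupt_interval_s: trip-check interrupt cycle (the trip time
        #                       resolution), in seconds
        earliest = trip_time_fast_run - interrupt_interval_s
        latest = trip_time_slow_run + interrupt_interval_s
        return earliest, latest

    # A single-channel trip time would then be accepted if
    # earliest <= measured_trip_time <= latest.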

The following tables contain the final DSVT acceptance criteria for initial values and trip times for DNBR and LPD.

Page 12 of 25

The values listed are those calculated and truncated to the same number of digits observed on the Single Channel facility Operator's module.

Page 13 of 25

Acceptance Criteria for DNBR and LPD Initial Values (DNBR Units and kw/ft, respectively)

Test Case    DNBR (Min.)    DNBR (Max.)    LPD (Min.)    LPD (Max.)
[table entries redacted]

Page 14 of 25

Acceptance Criteria for DNBR and LPD Initial Values (DNBR Units and kw/ft, respectively) (Cont.)

Test Case    DNBR (Min.)    DNBR (Max.)    LPD (Min.)    LPD (Max.)
[table entries redacted]

Page 15 of 25

Acceptance Criteria for DNBR and LPD Trip Times (seconds)

Test Case    DNBR Trip (Min.)    DNBR Trip (Max.)    LPD Trip (Min.)    LPD Trip (Max.)
[table entries redacted]

Page 16 of 25

Acceptance Criteria for DNBR and LPD Trip Times (seconds) (Cont.)

Test Case    DNBR Trip (Min.)    DNBR Trip (Max.)    LPD Trip (Min.)    LPD Trip (Max.)
[table entries redacted]

Page 17 of 25

3.3 DSVT Results

The Dynamic Software Verification Test was executed on the Single Channel Facility using the Revision 01 CPC software. The DSVT results are contained below.

Test Case    Initial DNBR (DNBR Units)    Initial LPD (kw/ft)    DNBR Trip (sec)    LPD Trip (sec)
[table entries redacted]

Page 18 of 25

DSVT Results (Cont.)

Test Case    Initial DNBR (DNBR Units)    Initial LPD (kw/ft)    DNBR Trip (sec)    LPD Trip (sec)
[table entries redacted]

3.4 Analysis of DSVT Results

For all test cases, the initial values of DNBR and LPD were within those defined by the CPC/CEAC FORTRAN simulation code generated bandwidths, which include the processing uncertainties obtained from the CPC Input Sweep Test. In addition, all test cases met the DSVT acceptance criteria for trip times.

Test cases [   ] were cases with the plant in the trip condition initially. In the CPC FORTRAN simulation code, one program execution cycle was required to generate a trip output; therefore the minimum and maximum trip

Page 19 of 25

times from the CPC FORTRAN simulation code were [   ] seconds, while the trip times from the CPC Single Channel were [   ] seconds. Proper software initialization was verified for these cases by checking the CPC FORTRAN simulation code outputs at time [   ] seconds and verifying that the CPC FORTRAN simulation code did indeed calculate a trip condition.

It was concluded that the DSVT results did not indicate the existence of software errors.

Page 20 of 25

4.0 LIVE INPUT SINGLE PARAMETER TEST

The Live Input Single Parameter Test is a real-time exercise of the CPC/CEAC application and executive software, with transient CPC/CEAC input values generated from an external source and read through the CPC/CEAC input hardware. The objectives of this test are:

(1) To verify that the dynamic response of the integrated CPC/CEAC software and hardware is consistent with that predicted by design analyses.

(2) To supplement design documentation quality assurance, Phase I module tests, Input Sweep Tests, and DSVT in assuring correct implementation of software modifications.

(3) To evaluate the integrated hardware/software system during operational modes approximating plant conditions.

4.1 LISP Test Case Selection

Reference 1 identifies the test cases to be used for LISP. These cases are the single variable dynamic transient test cases from the Phase II test series. These test cases, which are applicable to SONGS-2, consist of a [   ].

4.2 Generation of LISP Acceptance Criteria

The acceptance criteria for LISP are based on trip times for the dynamic test cases.

Page 21 of 25

These cases are simulated in the CPC FORTRAN simulation code and contain the following adjustment components:

[   ]

Application program execution lengths used for LISP testing were the same as those for DSVT, with the addition of CEAC minimum and maximum execution lengths ([   ] msec, respectively). The final acceptance criteria (generated by the CPC FORTRAN simulation code and adjusted for the above components) for LISP are contained in the following table.

Test Case    Minimum Trip Time (seconds)    Maximum Trip Time (seconds)
[table entries redacted]

4.3 LISP Test Results

The [   ] dynamic transients were executed on the CPC Single Channel facility. The recorded trip times (in seconds) for each case are listed in the following table:

Page 22 of 25

Run
[table entries redacted]

All recorded trip times meet the final acceptance criteria for LISP.

Since the change in the Revision 01 SONGS-2 CPC/CEAC software does not impact the operation of the point ID display and addressable constant range limit (Reference 3), retesting of the point ID verification and addressable constant range limit is not required. However, spot checks during the LISP testing verified that the point IDs were indeed displayed correctly.

All aspects of automated reentry of Addressable Constants were tested and were determined to have been correctly implemented.

Page 23 of 25

5.0 PHASE II TEST RESULTS SUMMARY

The Phase II software verification tests have been performed as required in Reference 1. In all cases, the test results fell within the acceptance criteria. Based on these results, the Phase II test objectives have been successfully achieved. It is concluded that the CPC/CEAC software modification described as Revision 01 has been properly integrated with the CPC and CEAC software and system hardware and that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

Page 24 of 25

6.0 REFERENCES

1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-P, Revision 02, December 21, 1978.

2. CPC/CEAC Software Modifications for SONGS Units 1 and 2, CEN-135(S)-P, Revision 00.

3. CPC/CEAC System Phase I Software Verification Test Report, CEN-176(S)-P, Revision 01, November 1981.

Page 25 of 25