ML20100J254

Rev 2 to Core Protection Calculation/Control Element Assembly Calculator Sys Phase II Software Verification Test Rept
Person / Time
Site: Palo Verde (Arizona Public Service)
Issue date: 09/30/1984
From:
ABB COMBUSTION ENGINEERING NUCLEAR FUEL (FORMERLY COMBUSTION ENGINEERING, INC.)
To:
Shared Package
ML17298B596
References
CEN-219(V)-NP, CEN-219(V)-NP-R02, CEN-219(V)-NP-R2, NUDOCS 8412100265
Download: ML20100J254 (26)


Text


PALO VERDE NUCLEAR GENERATING STATION UNIT 1


CEN-219(V)-NP REVISION 02

CPC/CEAC SYSTEM PHASE II SOFTWARE VERIFICATION TEST REPORT
SEPTEMBER, 1984


Combustion Engineering, Inc.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut


LEGAL NOTICE

THIS REPORT WAS PREPARED AS AN ACCOUNT OF WORK SPONSORED BY COMBUSTION ENGINEERING, INC. NEITHER COMBUSTION ENGINEERING NOR ANY PERSON ACTING ON ITS BEHALF:

A. MAKES ANY WARRANTY OR REPRESENTATION, EXPRESS OR IMPLIED, INCLUDING THE WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE OR MERCHANTABILITY, WITH RESPECT TO THE ACCURACY, COMPLETENESS, OR USEFULNESS OF THE INFORMATION CONTAINED IN THIS REPORT, OR THAT THE USE OF ANY INFORMATION, APPARATUS, METHOD, OR PROCESS DISCLOSED IN THIS REPORT MAY NOT INFRINGE PRIVATELY OWNED RIGHTS; OR

B. ASSUMES ANY LIABILITIES WITH RESPECT TO THE USE OF, OR FOR DAMAGES RESULTING FROM THE USE OF, ANY INFORMATION, APPARATUS, METHOD, OR PROCESS DISCLOSED IN THIS REPORT.


ABSTRACT

Phase II Testing is performed on the CPC/CEAC System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware, and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses, which provide design inputs to the CPC/CEAC Functional Design Specifications.

This report presents the Phase II test results for the Arizona Nuclear Power Project PVNGS-1 plant CPC/CEAC Revision 01, Cycle 1 software.

The Phase II Software Verification Tests have been performed as required in Reference 1. In all cases the test results either fell within the acceptance criteria or are explained. The test results indicate that the CPC and CEAC software has no indication of software error and that the operation of the integrated system is consistent with the performance predicted by design analyses.


TABLE OF CONTENTS

Section  Title                                          Page No.
1.0      INTRODUCTION                                   5
1.1      Objectives                                     5
1.2      Description of Phase II Testing                6
1.3      Applicability                                  6
2.0      CPC/CEAC INPUT SWEEP TESTS                     7
2.1      CPC Input Sweep Test Case Selection            7
2.1.1    CPC Processor Uncertainty Results              7
2.1.2    Analysis of CPC Input Sweep Test Results       8
2.2      CEAC Input Sweep Test Case Selection           10
2.2.1    CEAC Processor Uncertainty Results             10
2.2.2    Analysis of CEAC Input Sweep Test Results      10
3.0      DYNAMIC SOFTWARE VERIFICATION TEST             11
3.1      DSVT Case Selection                            11
3.2      Generation of DSVT Acceptance Criteria         12
3.3      DSVT Results                                   18
3.4      Analysis of DSVT Results                       20
4.0      LIVE INPUT SINGLE PARAMETER TEST               21
4.1      LISP Test Case Selection                       21
4.2      Generation of LISP Acceptance Criteria         22
4.3      LISP Test Results                              23
5.0      PHASE II TEST RESULTS SUMMARY                  24
6.0      REFERENCES                                     25


1.0 INTRODUCTION

The verification of software modifications of the CPC/CEAC System consists of several steps which address two major areas of the modification process:

(1) Definition of software modifications
(2) Implementation of software modifications

The definition of software modifications is documented in the CPC and CEAC Functional Design Specifications and the Data Base Listing and is verified by design analyses contained in recorded calculations. The implementation of software modifications is documented in Software Design Specifications and assembly listings. The verification process for the modified software implementation includes Phase I and Phase II Software Verification Tests.

The requirements of the Phase II Software Verification Testing are based on the fact that the Phase I Testing has been previously performed. Successful completion of Phase I Testing verifies the correct implementation of the modified software. Phase II Testing completes the software modification process by verifying that the integrated CPC System responds as expected.

This document contains the test results and conclusions for the Phase II Software Verification Test.

1.1 Objectives

The primary objective of Phase II Testing is to verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware. In addition, Phase II Testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses. These objectives are achieved by comparing the response of the integrated system to the response predicted by the CPC/CEAC FORTRAN Simulation Code. This comparison is performed for a selected range of simulated static and dynamic input conditions.
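The report does not show the comparison machinery itself. Purely as a sketch under assumed names and tolerances (neither the response structure nor the tolerance values below come from this report), the check amounts to running each selected case on both the FORTRAN Simulation Code and the integrated system and confirming that the responses agree within the applicable acceptance band:

```python
# Minimal sketch of the comparison described above. The shape of a "response"
# (an initial value plus a trip time) and the tolerance values are assumptions,
# not definitions taken from this report.

def responses_agree(predicted, measured, value_tolerance, time_tolerance):
    """Compare one case's integrated-system response against the simulation prediction."""
    value_ok = abs(measured["initial_value"] - predicted["initial_value"]) <= value_tolerance
    time_ok = abs(measured["trip_time"] - predicted["trip_time"]) <= time_tolerance
    return value_ok and time_ok

# Invented example values (DNBR units and seconds).
predicted = {"initial_value": 2.150, "trip_time": 4.20}
measured = {"initial_value": 2.151, "trip_time": 4.25}
print(responses_agree(predicted, measured, value_tolerance=0.01, time_tolerance=0.10))
```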

1.2 Description of Phase II Testing

Phase II testing consists of the following tests:

(1) Input Sweep Test,
(2) Dynamic Software Verification Test, and
(3) Live Input Single Parameter Test.

These tests are performed on a Single Channel Test Facility (SCTF) CPC/CEAC System with integrated software that has undergone successful Phase I Testing (Reference 2).

1.3 Applicability

This report applies to the Phase II testing performed on the Arizona Nuclear Power Project, PVNGS-1 CPC/CEAC system software. The software revisions documented in this report are designated as Revision Number 01 to the PVNGS-1 Cycle 1 CPC/CEAC System Software.


2.0 CPC/CEAC INPUT SWEEP TESTS

The Input Sweep Test is a real-time exercise of the CEAC and CPC application software and executive software with steady-state CPC and CEAC input values read from a storage device. This test has the following objectives:

(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs,
(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space, and
(3) To complement Phase I module testing by identifying any abnormalities in the CPC and CEAC algorithms used in the system hardware which were not previously uncovered.

A schematic sketch of such a sweep is given below.
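The report gives no detail of how the input combinations are produced; the following is only a rough sketch, with invented parameter names and ranges and stand-in run_simulation / run_single_channel functions, of what a steady-state input sweep of the kind described above could look like:

```python
# Hypothetical input sweep: steady-state input combinations spanning the
# operating space are applied to both the FORTRAN simulation and the single
# channel system, and the output differences are recorded for later statistics.
# All parameter names and ranges are invented for illustration.
import itertools

def sweep(run_simulation, run_single_channel):
    # Coarse grids over a few CPC process inputs (values are made up).
    cold_leg_temp_F = [540.0, 560.0, 580.0]
    pressurizer_pressure_psia = [1850.0, 2250.0]
    core_power_percent = [20.0, 60.0, 100.0]

    differences = []
    for temp, press, power in itertools.product(
            cold_leg_temp_F, pressurizer_pressure_psia, core_power_percent):
        case = {"tcold": temp, "pressure": press, "power": power}
        predicted = run_simulation(case)       # e.g. the DNBR from the simulation
        measured = run_single_channel(case)    # the same case run on the SCTF
        differences.append(measured - predicted)
    return differences
```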

2.1 CPC Input Sweep Test Case Selection

[  ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC design qualification testing of the Revision 01 software.

2.1.1 CPC Processor Uncertainty Results

For each test case, differences in the results of the FORTRAN Simulation Code and the Single Channel Test Facility (SCTF) were calculated. A statistical analysis of these differences produced the processing uncertainties.

The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits [  ]. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [  ] cases remained after these cases were eliminated. The LPD statistics did not include those cases for which the LPDDC as calculated on either system was equal to or greater than the upper limit of core average kw/ft (= [  ] kw/ft). A total of [  ] cases remained after these cases were eliminated.

Although [  ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still included as Input Sweep Test cases for the purpose of identifying potential software errors.

The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from Input Sweep for DNBR and LPD are [  ] DNBR units and [  ] core average kw/ft, respectively. However, since the distribution of differences is so restrictive, the maximum error may be used (that is, the limits which encompass 100% of the differences). This is more conservative and yet still results in small processor uncertainties. Thus defined, the processor uncertainties for Revision 01 for DNBR and LPD are [  ] DNBR units and [  ] core average kw/ft, respectively.

2.1.2 Analysis of CPC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors.

The review results of the DNBR and LPD test cases outside the 95/95 tolerance limit will now be discussed. For DNBR there were [  ] cases below the lower tolerance limit of [  ] (DNBR units) and [  ] test cases above the upper tolerance limit of [  ] (DNBR units). For these test cases the difference between the Single Channel Test Facility (SCTF) and the CPC FORTRAN Simulation Code is within the accuracy of the two systems. The largest percent error among the cases was [  ].

These differences do not show a significant commonality since the differences are absolute (not relative), and it should be expected that the largest differences occur at high DNBRs. It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.

For LPD the cases examined were [  ] cases with differences below the lower 95/95 tolerance limit of [  ] (% of core average kw/ft) and [  ] cases with differences greater than the upper tolerance limit of [  ]. The largest percent error among the cases was [  ]. The common input to these test cases was found in other test cases with less maximum difference and less percent error. Examination of the inputs to all [  ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found. It is therefore concluded that there is no indication from the Input Sweep test results of software errors in the Single Channel calculation of LPD.


2.2 CEAC Input Sweep Test Case Selection

[  ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the Revision 01 software. These test cases covered the entire CEAC operating space.

2.2.1 CEAC Processor Uncertainty Results

For each test case, differences between the CEAC FORTRAN Simulation Code and CEAC single channel system results were calculated. The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level.

The processor uncertainties for the DNBR and the LPD penalty factor differences are [  ] and [  ], respectively.


2.2.2 Analysis of CEAC Input Sweep Test Results

The results were reviewed for representativeness and for any evidence of computational differences between the CEAC FORTRAN Simulation Code and the Single Channel Test Facility (SCTF). The test data produced penalty factors which swept the respective DNBR and LPD penalty factor ranges with emphasis on the midrange values. The differences between the penalty factors from the SCTF and the CEAC FORTRAN Simulation Code were within a range which is justified by the differences in word length. There were [  ] cases in which the packed penalty factor words from the SCTF and from the CEAC FORTRAN Simulation Code differed. This difference was in the least significant bit of the DNBR penalty factor and was found to be due to the limited precision of the conversion constants. Therefore, it was concluded that the results of the test cases did not indicate the existence of software errors.
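As a hypothetical illustration of the least-significant-bit effect described above (the actual CEAC packed-word format and conversion constants are not given in this report; the 12-bit scaling and the constants below are assumptions):

```python
# Two implementations carrying the same conversion constant at slightly
# different precision can pack the same penalty factor into fixed-point
# words that differ by one least significant bit.

def pack_penalty_factor(pf, conversion_constant, bits=12):
    """Convert a penalty factor to a fixed-point integer word (clamped)."""
    word = int(round(pf * conversion_constant))
    return max(0, min(word, (1 << bits) - 1))

pf = 1.23456                 # made-up DNBR penalty factor
full_precision = 1000.0      # constant as carried by the FORTRAN simulation (assumed)
reduced_precision = 999.5    # same constant at reduced precision (assumed)

w_sim = pack_penalty_factor(pf, full_precision)      # -> 1235
w_sctf = pack_penalty_factor(pf, reduced_precision)  # -> 1234
print("LSB difference:", abs(w_sim - w_sctf))        # -> 1
```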


3.0 DYNAMIC SOFTWARE VERIFICATION TEST

The Dynamic Software Verification Test (DSVT) is a real-time exercise of the CPC application software and executive software with transient CPC input values read from a storage device. This test has two objectives:

(1) To verify that the dynamic response of the integrated CPC software is consistent with that predicted by design analyses, and
(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.

Further information concerning DSVT may be found in Reference 1.

3.1 DSVT Case Selection

Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified.

DSVT requires that, as a minimum, [  ] cases be selected for testing (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of a [  ]. The entire series of DSVT test cases was executed using the CPC/CEAC FORTRAN Simulation Code and the Single Channel Test Facility (SCTF) with the Revision 01 CPC software. Because PVNGS-1 has one fewer regulating CEA bank than previous CPC-protected plants, only [  ] of the usual [  ] subcases needed to be executed in the shutdown sequence represented by case [  ]. Subcase [  ] was retained in a dummy format to preserve a test case numbering sequence which was consistent with previous tests. In addition, cases [  ], each consisting of [  ] subcases, were executed to test the CPC/CEAC response to reactor power cutback.

3.2 Generation of DSVT Acceptance Criteria

The acceptance criteria for DSVT are defined in Reference 1 as the trip times and initial values of DNBR and LPD for each test case. These acceptance criteria are generated using the certified CPC/CEAC FORTRAN Simulation Code and Data Base for PVNGS-1 Cycle 1. Processing uncertainties obtained during Input Sweep testing are factored into the acceptance criteria for initial values of DNBR and LPD where necessary. Trip times are affected by program execution lengths as well as by the Input Sweep uncertainties. The minimum, average, and maximum execution lengths (in milliseconds) calculated for the Revision 01 software are listed below.

CPC Application Program Execution Lengths

Program    Minimum (msec)   Nominal (msec)   Maximum (msec)
FLOW       [  ]             [  ]             [  ]
UPDATE     [  ]             [  ]             [  ]
POWER      [  ]             [  ]             [  ]
STATIC     [  ]             [  ]             [  ]

Each DSVT case was initially executed once with nominal program execution lengths (values between the minimum and maximum) and data base values of trip setpoints using the CPC/CEAC FORTRAN Simulation Code. Following execution of the same cases using the Single Channel Test Facility (SCTF), the single case which did not yield a DNBR trip time equivalent to that calculated by the CPC/CEAC FORTRAN Simulation Code was re-analyzed.


This DSVT case was re-executed once with minimum execution lengths and the most conservative trip setpoints, and once with maximum execution lengths and the least conservative trip setpoints. This process produced a bandwidth of trip times for the test case which contained the effects of processing uncertainties.

The software DSVT program includes a [  ]-millisecond interrupt cycle in order to check for DNBR and LPD trip signals. This results in a [  ]-millisecond interval limit on trip time resolution which is factored into the acceptance criteria.
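The execution lengths, setpoints, and interrupt period are left blank in this non-proprietary report, so the sketch below only illustrates, with invented numbers and a stand-in simulate_trip_time function, how a min/max trip-time acceptance band might be assembled from the two bounding runs and then widened to the trip-check interrupt resolution:

```python
# Hypothetical assembly of a DSVT trip-time acceptance band.
# simulate_trip_time() stands in for a run of the CPC/CEAC FORTRAN Simulation
# Code; all numeric values are invented.
import math

def acceptance_band(simulate_trip_time, interrupt_period_ms):
    """Return (min_trip_time, max_trip_time) in seconds for one DSVT case."""
    # Bounding run 1: minimum execution lengths, most conservative setpoints.
    earliest = simulate_trip_time("minimum", "most_conservative")
    # Bounding run 2: maximum execution lengths, least conservative setpoints.
    latest = simulate_trip_time("maximum", "least_conservative")
    # Widen the band to whole trip-check interrupt periods, since a trip is
    # only detected at the interrupt-cycle resolution.
    dt = interrupt_period_ms / 1000.0
    return math.floor(earliest / dt) * dt, math.ceil(latest / dt) * dt

def fake_sim(exec_lengths, setpoints):        # stand-in simulation runs
    return 4.12 if exec_lengths == "minimum" else 4.31

lo, hi = acceptance_band(fake_sim, interrupt_period_ms=50)
print(f"acceptance band: {lo:.2f} s to {hi:.2f} s")   # acceptance band: 4.10 s to 4.35 s
```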

The following tables contain the final DSVT acceptance criteria for initial values and trip times for DNBR and LPD.


Acceptance Criteria for DNBR and LPD Initial Values
(DNBR Units and kw/ft, respectively)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[Values omitted from this non-proprietary version.]

Acceptance Criteria for DNBR and LPD Trip Times
(seconds)

Test Case   DNBR Trip (Min.)   DNBR Trip (Max.)   LPD Trip (Min.)   LPD Trip (Max.)
[Values omitted from this non-proprietary version.]

3.3 DSVT Results

DSVT Test Results

Test Case   Initial DNBR (DNBR Units)   Initial LPD (kw/ft)   DNBR Trip (sec)   LPD Trip (sec)
[Values omitted from this non-proprietary version.]


3.4 Analysis of DSVT Results

The trip times for all of the test cases executed on the single channel facility met the acceptance criteria determined by the CPC/CEAC FORTRAN Simulation Code.

For all test cases with the exception of [  ], the initial values of DNBR and LPD were within the bandwidth defined by the FORTRAN Simulation Code, which includes the processing uncertainties obtained from the CPC Input Sweep Test. For cases [  ] the initial values of LPD were outside the upper limits determined by applying the uncertainties derived from Input Sweep Testing to the CPC FORTRAN results. The differences were [  ] from the nominal LPD values of [  ] kw/ft for cases [  ], respectively, corresponding to [  ] percent of design core average kw/ft. These errors are less than the magnitude of the largest lower limit difference determined from Input Sweep Testing ([  ] percent of design core average kw/ft) and can be attributed to the differences in precision between the machines and to the input interpolation of the DSVT program. No software error is indicated.

Test cases [  ] are cases with the inputs initially defining a tripped condition. In the CPC FORTRAN Simulation Code, one program execution cycle is needed to generate a trip output. This implies an acceptance criterion of [  ] sec for minimum and maximum time-to-trip, while the actual trip times for the CPC Single Channel were [  ] sec. These CPC FORTRAN cases were examined to verify that a trip condition existed at time [  ], justifying the indicated acceptance criteria for time-to-trip of [  ] sec, consistent with the expected CPC Single Channel response.


4.0 LIVE INPUT SINGLE PARAMETER TEST

The Live Input Single Parameter (LISP) Test is a real-time exercise of the CPC/CEAC application and executive software, with transient CPC/CEAC input values generated from an external source and read through the CPC/CEAC input hardware. The objectives of this test are:

(1) To verify that the dynamic response of the integrated CPC/CEAC software and hardware is consistent with that predicted by design analyses,
(2) To supplement design documentation quality assurance, Phase I module tests, Input Sweep Tests, and DSVT in assuring correct implementation of software modifications, and
(3) To evaluate the integrated hardware/software system during operational modes approximating plant conditions.

4.1 LISP Test Case Selection

Reference 1 identifies the test cases to be used for LISP. These cases are the single variable dynamic transient test cases from the Phase II test series. In addition, a test case is included to test the Reactor Power Cutback (RPC) feature. These test cases, which are applicable to PVNGS-1, consist of a [  ].

4.2 Generation of LISP Acceptance Criteria

The acceptance criteria for LISP are based on trip times for the dynamic test cases. For the RPC test case, there should be no trip during RPC.

These cases are simulated in the CPC FORTRAN Simulation Code and contain the following adjustment components. Application program execution lengths used for LISP testing were the same as those for DSVT, with the addition of CEAC minimum and maximum execution lengths of [  ] and [  ] msec, respectively. The final acceptance criteria (generated by the CPC FORTRAN Simulation Code and adjusted for the above components) for LISP also include [  ] and are contained in the following table.

Test Case   Minimum Trip Time (seconds)   Maximum Trip Time (seconds)
[Values omitted from this non-proprietary version.]

4.3 LISP Test Results

The [  ] dynamic transients were executed on the CPC Single Channel Test Facility (SCTF). The recorded trip times (in seconds) for each case are listed in the following table:

[Values omitted from this non-proprietary version.]

All recorded trip times met the final acceptance criteria for LISP.

The result of test case [  ] showed that the RPC feature, as expected, caused no CPC trip when the single bank RPC was inserted.
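Since the recorded trip times and the final criteria are not reproduced in this non-proprietary version, the following is only an invented sketch of what checking the LISP results amounts to: a trip-time window test per dynamic case, plus confirming that the RPC case produced no trip at all.

```python
# Illustrative check of LISP results against acceptance criteria.
# Case identifiers, times, and windows are invented placeholders.

# Acceptance windows (seconds): case id -> (min trip time, max trip time).
criteria = {"transient_A": (3.90, 4.40), "transient_B": (7.10, 7.80)}

# Recorded SCTF trip times; None means no trip occurred during the case.
recorded = {"transient_A": 4.12, "transient_B": 7.35, "rpc_case": None}

for case, trip_time in recorded.items():
    if case == "rpc_case":
        # The RPC case passes only if no trip was generated.
        print(case, "PASS" if trip_time is None else "FAIL (unexpected trip)")
    else:
        lo, hi = criteria[case]
        ok = trip_time is not None and lo <= trip_time <= hi
        print(case, "PASS" if ok else "FAIL")
```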

Major aspects of the system diagnostic features were verified. These include the trips buffer and failed sensor reports, CPC and CEAC Point IDs, and correct operation of the CEAC displays and operator's module lamp indications. The addressable constant range limit check and all aspects of automated reentry of Addressable Constants were also tested. Therefore, all testing was determined to be acceptable and the system diagnostic features were correctly implemented.


5.0 PHASE II TEST RESULTS SUMMARY

The Phase II Software Verification Tests have been performed as required in Reference 1. The test results indicate that the CPC and CEAC software has no indication of software errors and that the operation of the integrated system is consistent with the performance predicted by design analyses, which provide design inputs to the CPC/CEAC Functional Design Specifications.


6.0 REFERENCES

1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-P, Revision 02, December 21, 1978.

2. Palo Verde Nuclear Generating Station Unit 1, CPC/CEAC System Phase I Software Verification Test Report, CEN-217(V)-P, Revision 02, September 1984.
