ML20100J222

| Field | Value |
|---|---|
| Site: | Arkansas Nuclear |
| Issue date: | 03/31/1985 |
| From: | ABB COMBUSTION ENGINEERING NUCLEAR FUEL (FORMERLY |
| To: | |
| Shared Package | ML19269B331 |
| References | CEN-162(A)-NP, CEN-162(A)-NP-R01, CEN-162(A)-NP-R1, NUDOCS 8504100400 |
| Download: | ML20100J222 (25) |
ARKANSAS POWER & LIGHT COMPANY
ARKANSAS NUCLEAR ONE - UNIT 2 (ANO-2)
DOCKET 50-368
CEN-162(A)-NP
REVISION 01-NP

CPC/CEAC SYSTEM PHASE II SOFTWARE VERIFICATION TEST REPORT

MARCH 1985

COMBUSTION ENGINEERING, INC.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut
LEGAL NOTICE

THIS RESPONSE WAS PREPARED AS AN ACCOUNT OF WORK SPONSORED BY COMBUSTION ENGINEERING, INC. NEITHER COMBUSTION ENGINEERING NOR ANY PERSON ACTING ON ITS BEHALF:

a. MAKES ANY WARRANTY OR REPRESENTATION, EXPRESS OR IMPLIED, INCLUDING THE WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE OR MERCHANTABILITY, WITH RESPECT TO THE ACCURACY, COMPLETENESS, OR USEFULNESS OF THE INFORMATION CONTAINED IN THIS RESPONSE, OR THAT THE USE OF ANY INFORMATION, APPARATUS, METHOD, OR PROCESS DISCLOSED IN THIS RESPONSE MAY NOT INFRINGE PRIVATELY OWNED RIGHTS; OR

b. ASSUMES ANY LIABILITIES WITH RESPECT TO THE USE OF, OR FOR DAMAGES RESULTING FROM THE USE OF, ANY INFORMATION, APPARATUS, METHOD OR PROCESS DISCLOSED IN THIS RESPONSE.
Page 2 of 25
ABSTRACT

Phase II Testing is performed on the Single Channel Facility CPC/CEAC System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses, which provide design inputs to the CPC/CEAC Functional Design Specifications.

This report presents the Phase II test results for the Arkansas Nuclear One - Unit 2 (ANO-2) Plant CPC/CEAC Revision 05 software. This revision is applicable to ANO-2 Cycle 5.

The Phase II Software Verification Tests have been performed as required in Reference 1. In all cases, the test results fell within the acceptance criteria or are explained. The test results show that both the CPC and CEAC software have no indication of software error and that the operation of the integrated system is consistent with the performance predicted by design analyses.
TABLE OF CONTENTS

Section  Title                                            Page No.

1.0      INTRODUCTION                                     5
1.1      OBJECTIVES                                       5
1.2      DESCRIPTION OF PHASE II TESTING                  6
1.3      APPLICABILITY                                    6
2.0      CPC/CEAC INPUT SWEEP TESTS                       7
2.1      CPC INPUT SWEEP                                  7
2.1.1    CPC Input Sweep Test Case Selection              7
2.1.2    CPC Processor Uncertainty Results                7
2.1.3    Analysis of CPC Input Sweep Test Results         9
2.2      CEAC INPUT SWEEP TEST                            10
2.2.1    CEAC Input Sweep Test Case Selection             10
2.2.2    CEAC Processor Uncertainty Results               10
2.2.3    Analysis of CEAC Input Sweep Test Results        10
3.0      DYNAMIC SOFTWARE VERIFICATION TEST               11
3.1      DSVT TEST CASE SELECTION                         11
3.2      GENERATION OF DSVT ACCEPTANCE CRITERIA           13
3.3      ANALYSIS OF DSVT TEST RESULTS                    18
4.0      LIVE INPUT SINGLE PARAMETER TEST                 21
4.1      LISP TEST CASE SELECTION                         21
4.2      GENERATION OF LISP ACCEPTANCE CRITERIA           21
4.3      LISP TEST RESULTS                                22
5.0      PHASE II TEST RESULTS SUMMARY                    24
6.0      REFERENCES                                       25
1.0 INTRODUCTION

The verification of software modifications to the CPC/CEAC System consists of several steps which address two major areas of the modification process:

(1) Definition of software modifications
(2) Implementation of software modifications

The definition of software modifications is documented in the Software Change Procedures (References 1 and 2), the CPC and CEAC Functional Design Specifications (References 3 and 4), which are modified by References 5 and 6, and the Data Base Listing (Reference 8). The applicability of References 3, 4, 5 and 6 to ANO-2 Cycle 5 is documented in Reference 7.

All specifications are verified by design analyses contained in recorded calculations. The implementation of software modifications is documented in Software Design Specifications and program listings.

The verification process for the modified software implementation is two-phase: Phase I testing (Reference 9) must be performed before Phase II. Successful completion of Phase I testing verifies the correct implementation of the modified software. Phase II testing completes the software modification process by validating that the integrated CPC System responds as expected.

This document contains the test results and conclusions for the Phase II software verification test.
1.1 OBJECTIVES

The primary objective of Phase II Testing is to validate that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware. In addition, Phase II testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses. These objectives are achieved by comparing the response of the integrated system to the response predicted by the CPC/CEAC FORTRAN Simulation Code. This comparison is performed for a selected set of simulated static and dynamic input conditions.
1.2 DESCRIPTION OF PHASE II TESTING

Phase II testing consists of the following tests:

(1) Input Sweep Tests for the CPC and the CEAC,
(2) Dynamic Software Verification Test, and
(3) Live Input Single Parameter Test.

These tests are performed on a Single Channel Facility (SCF) of the CPC/CEAC System with integrated software that has undergone successful Phase I testing.

1.3 APPLICABILITY

This report applies to the Phase II Testing performed on the Arkansas Power and Light (ANO-2) CPC/CEAC System Software. The software revisions documented in this report are designated as Revision 05 to the ANO-2 CPC/CEAC System Software. This revision is applicable to ANO-2 Cycle 5.
2.0 CPC/CEAC INPUT SWEEP TESTS

The Input Sweep Test is a real-time exercise of the CEAC and CPC application software and executive software with steady-state CPC and CEAC input values read from a storage device. These tests have the following objectives:

(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs.

(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space.

(3) To complement Phase I module testing by identifying previously unnoticed abnormalities in the CPC and CEAC algorithms used in the system hardware.
2.1 CPC INPUT SWEEP

2.1.1 CPC Input Sweep Test Case Selection

[   ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC design qualification testing of the Revision 05 software.

2.1.2 CPC Processor Uncertainty Results

For each test case, differences in the results of the CPC FORTRAN Simulation Code and the Single Channel Facility (SCF) were calculated. A statistical analysis of these differences produced the processing uncertainties.
The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits [   ]. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [   ] cases remained after these cases were eliminated.

The LPD statistics did not include those cases for which the LPD as calculated on either system was equal to or greater than the upper limit of [   ] core average kW/ft. A total of [   ] cases remained after these cases were eliminated.

Although [   ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still included as Input Sweep Test cases for the purpose of identifying potential software errors.

The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from the Input Sweep for DNBR and LPD respectively are [   ] DNBR units and [   ] core average linear heat rate. Since the largest differences among the [   ] test cases are of [   ], this choice minimizes the false indication of software errors while ensuring that real software errors are trapped. Thus defined, the processor uncertainties from the Input Sweep Testing of the Revision 05 CPC software for DNBR and LPD are [   ] DNBR units and [   ] core average linear heat rate.
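A 95/95 one-sided tolerance limit of the kind defined above can be sketched numerically. This is a generic illustration, not the report's actual statistical procedure: it uses the closed-form Natrella/Howe approximation to the one-sided normal tolerance factor rather than the exact noncentral-t value, and it assumes the differences are approximately normal.

```python
from statistics import NormalDist, fmean, stdev

def k_factor(n: int, p: float = 0.95, conf: float = 0.95) -> float:
    """Approximate one-sided normal tolerance factor k such that
    mean + k*s bounds proportion p of the population with confidence
    `conf` (Natrella/Howe closed-form approximation)."""
    zp = NormalDist().inv_cdf(p)      # quantile of the covered proportion
    zc = NormalDist().inv_cdf(conf)   # quantile of the confidence level
    a = 1.0 - zc**2 / (2.0 * (n - 1))
    b = zp**2 - zc**2 / n
    return (zp + (zp**2 - a * b) ** 0.5) / a

def upper_95_95_limit(diffs):
    """Upper 95/95 tolerance limit of a sample of SCF-vs-simulation
    differences; an analogous lower limit uses mean - k*s."""
    return fmean(diffs) + k_factor(len(diffs)) * stdev(diffs)
```

For 95/95 the factor k is always larger than the plain normal quantile 1.645 and shrinks toward it as the number of test cases grows, which is why a large sweep gives tight processor-uncertainty bounds.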
2.1.3 Analysis of CPC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. For DNBR there were [   ] cases below the lower tolerance limit of [   ] DNBR units and [   ] test cases above the upper tolerance limit of [   ] DNBR units. For these [   ] test cases the difference between the SCF and the CPC FORTRAN Simulation Code is within the accuracy of the two systems. These differences do not show a significant commonality, since the differences are absolute (not relative) and it should be expected that the largest differences occur at high DNBRs. The largest percent (relative) error among the [   ] cases was [   ]. It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.

For the LPD cases examined, there were [   ] cases with differences below the lower 95/95 tolerance limit of [   ] (% of core average Linear Heat Rate) and [   ] cases with differences greater than the upper tolerance limit of [   ] (% of core average Linear Heat Rate). The common input to these test cases was found in other test cases with less maximum difference and less percent error. Examination of the inputs to all [   ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found. The largest percent (relative) error among the cases was [   ].

Therefore, it is concluded that the Input Sweep test results do not indicate software errors either in the DNBR or in the LPD calculations.
2.2 CEAC INPUT SWEEP TEST

2.2.1 CEAC Input Sweep Test Case Selection

[   ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the ANO-2 Revision 05 software. These test cases covered all CEAC operating space.

2.2.2 CEAC Processor Uncertainty Results

For each test case, differences between the CEAC FORTRAN Simulation Code and CEAC Single Channel system results were calculated. The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level.

The processor uncertainties for the DNBR and the LPD penalty factor differences are less than [   ], respectively.

2.2.3 Analysis of CEAC Input Sweep Test Results

The results were reviewed for representativeness and for any evidence of computational differences between the CEAC FORTRAN Simulation and the Single Channel Facility (SCF). [   ]

For all test cases no differences were found between the SCF and the CEAC FORTRAN Simulation Code. It is therefore concluded that the CEAC Input Sweep Test results do not indicate software errors either in the DNBR or in the LPD penalty factor calculations.
3.0 DYNAMIC SOFTWARE VERIFICATION TEST

The Dynamic Software Verification Test (DSVT) is a real-time exercise of the CPC application and executive software with transient CPC input values read from a storage device. This test has two objectives:

(1) To verify that the dynamic response of the integrated CPC software is consistent with that predicted by design analyses.

(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.

Further information concerning DSVT may be found in Reference 1.

3.1 DSVT TEST CASE SELECTION

Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified. DSVT requires that, as a minimum, cases [   ] be selected for testing (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of [   ]. However, because of extensive software modifications, the entire series of applicable DSVT test cases was executed using the CPC/CEAC FORTRAN Simulation Code and the Single Channel Facility (SCF) with the Rev. 05 CPC Software.
3.2 GENERATION OF DSVT ACCEPTANCE CRITERIA

Acceptance criteria for DSVT are defined in Reference 1 as the trip times and initial values of DNBR and LPD for each test case. These acceptance criteria are generated using the certified CPC/CEAC FORTRAN Simulation Code and the Data Base Listing for ANO-2, Cycle 5.

Processing uncertainties obtained during Input Sweep testing are factored into the acceptance criteria for initial values of DNBR and LPD where necessary. Trip times are affected by program execution lengths as well as by the processing uncertainties. The minimum, average, and maximum execution lengths (in milliseconds) calculated for the Revision 05 software are listed below.

CPC Application Program Execution Lengths

Program   Minimum (msec)   Average (msec)   Maximum (msec)
FLOW      [   ]            [   ]            [   ]
UPDATE    [   ]            [   ]            [   ]
POWER     [   ]            [   ]            [   ]
STATIC    [   ]            [   ]            [   ]

Each DSVT case is initially executed once with the nominal program execution lengths (values between the minimum and maximum) and the data base values of trip setpoints. During this phase, it is verified that the test data executed for each test case produces the intended initialization and transient output. Once the test cases have been adjusted appropriately for the given plant and CPC/CEAC configuration, they are executed on the Single Channel Facility (SCF).
The test case is executed with the CPC/CEAC FORTRAN Simulation Code once with minimum execution lengths and the most conservative trip setpoints, and once with maximum execution lengths and/or least conservative trip setpoints. This process produces a band of trip times for the test cases which contains the effects of processing uncertainties. The largest band of acceptable trip times is obtained when the modified execution lengths and adjusted trip setpoints are used simultaneously.
The DSVT software includes a [   ] millisecond interrupt cycle to check for DNBR and LPD trip signals. This results in a [   ] millisecond interval limit in trip time resolution, which is factored into the acceptance criteria. The following tables contain the final DSVT acceptance criteria for initial values and trip times for DNBR and LPD and the corresponding test results.
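The effect of the interrupt-cycle resolution on the trip-time acceptance band can be sketched as follows. The helper and the numbers are hypothetical: the report states only that the interval limit is factored into the criteria, and the actual cycle length is redacted above; the rounding direction shown is one plausible way to account for it.

```python
import math

def acceptance_band(t_min: float, t_max: float, cycle_s: float):
    """Widen a simulated trip-time band to whole interrupt cycles.

    A trip asserted between interrupt ticks is only observed at the
    next tick, so the earliest acceptable time rounds down and the
    latest rounds up to a cycle boundary. Hypothetical helper; not
    the report's actual procedure.
    """
    lo = math.floor(t_min / cycle_s) * cycle_s
    hi = math.ceil(t_max / cycle_s) * cycle_s
    return lo, hi
```

For example, with an assumed 50 ms cycle, a simulated band of 1.23 to 1.31 seconds would widen to 1.20 to 1.35 seconds.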
Acceptance Criteria for DNBR and LPD Initial Values
(DNBR Units and kW/ft, respectively)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[   ]
Acceptance Criteria for DNBR and LPD Initial Values (Continued)
(DNBR Units and kW/ft, respectively)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[   ]
Acceptance Criteria for DNBR and LPD Trip Times (seconds)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[   ]
Acceptance Criteria for DNBR and LPD Trip Times (seconds) (Continued)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[   ]
3.3 ANALYSIS OF DSVT RESULTS

Results of DSVT are listed in the table on the following pages. Therefore, it is concluded that the DSVT does not indicate any errors in the CPC software.
DSVT Results

Test Case   Initial DNBR (DNBR Units)   Initial LPD (kW/ft)   DNBR Trip (sec)   LPD Trip (sec)
[   ]
DSVT Results (Continued)

Test Case   Initial DNBR (DNBR Units)   Initial LPD (kW/ft)   DNBR Trip (sec)   LPD Trip (sec)
[   ]
4.0 LIVE INPUT SINGLE PARAMETER TEST

The Live Input Single Parameter (LISP) test is a real-time exercise of the CPC/CEAC application and executive software, with transient CPC/CEAC input signals generated from an external source and read through the CPC/CEAC input hardware. The objectives of this test are:

(1) To verify that the dynamic response of the integrated CPC/CEAC software and hardware is consistent with that predicted by design analyses.

(2) To supplement design documentation quality assurance, Phase I module tests, Input Sweep Tests, and DSVT in assuring correct implementation of software modifications.

(3) To evaluate the integrated hardware/software system during operational modes approximating plant conditions.

4.1 LISP TEST CASE SELECTION

Reference 1 identifies the test cases to be used for LISP. These cases are the single variable dynamic transient test cases from the Phase II test series. These test cases, which are applicable to ANO-2, consist of [   ]
4.2 GENERATION OF LISP ACCEPTANCE CRITERIA

The acceptance criteria for LISP are based on trip times for the dynamic test cases. These cases are simulated within an updated CPC/CEAC FORTRAN Simulation Code and contain the following adjustment components. Application program execution lengths used for LISP testing were the same as those for DSVT, with the addition of CEAC minimum and maximum execution lengths of [   ] msec, respectively.

The final acceptance criteria for LISP (generated by the updated CPC/CEAC FORTRAN Simulation Code and adjusted for the above components and the variation in trip time due to data recording uncertainty) are contained in the following table.

Test Case   Minimum Trip Time (seconds)   Maximum Trip Time (seconds)
[   ]

4.3 LISP TEST RESULTS

The dynamic transients were executed on the Single Channel Facility (SCF). The recorded trip times (in seconds) for each case are listed in the following table:
All recorded trip times met the final acceptance criteria for LISP.

Major aspects of the system diagnostic features were verified. These include [   ]. The addressable constant range limit check and all aspects of automated reentry of addressable constants were also tested. Therefore, all testing was determined to be acceptable and the system diagnostic features were correctly implemented.
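The pass criterion above reduces to a per-case band check of recorded trip time against its acceptance band. The helper and the numbers below are made up for illustration; the actual case identifiers, trip times, and bands are redacted in this report.

```python
def failing_lisp_cases(recorded, criteria):
    """Return the cases whose recorded trip time falls outside its
    acceptance band. `recorded` maps case id -> trip time (s);
    `criteria` maps case id -> (min_s, max_s). Hypothetical helper
    with made-up numbers, not the report's actual data.
    """
    return {case: t for case, t in recorded.items()
            if not (criteria[case][0] <= t <= criteria[case][1])}

# Every recorded time inside its band -> no failures reported.
recorded = {"T-01": 1.24, "T-02": 2.05}
criteria = {"T-01": (1.20, 1.35), "T-02": (1.95, 2.10)}
print(failing_lisp_cases(recorded, criteria))  # {}
```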
5.0 PHASE II TEST RESULTS SUMMARY

The Phase II software verification tests have been performed as required in Reference 1. The test results show that both the CPC and CEAC software have no indication of errors and that the operation of the integrated system is consistent with the performance predicted by design analyses, which provide design inputs to the CPC/CEAC Functional Design Specifications.
6.0 REFERENCES

1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-NP, Revision 02, December 1978.

2. CPC Protection Algorithm Software Change Procedure Supplement, CEN-39(A)-NP, Supplement 1-NP, Revision 02, April 1984.

3. Functional Design Specification for a Core Protection Calculator, CEN-147(S)-NP, February 1981.

4. Functional Design Specification for a Control Element Assembly Calculator, CEN-148(S)-NP, January 1981.

5. CPC/CEAC Software Modification for San Onofre Nuclear Generating Station Units 2 and 3, CEN-281(S)-NP, Revision 01, November 1984.

6. Dockets STN-50-470F, Enclosure 1-NP to LD-82-039, CPC/CEAC Software Modification for System 80, March 1982.

7. CPC Methodology Changes for Arkansas Nuclear One Unit 2 Cycle 5, Docket No. 50-368, CEN-288(A)-NP, Revision 00, October 1984.

8. ANO-2 CPC and CEAC Data Base Listing, CEN-296(A)-NP, Revision 00, 1985.

9. CPC/CEAC System Phase I Software Verification Test Report, CEN-298(A)-NP, Revision 00, March 1985.
COMBUSTION ENGINEERING, INC.