ML20063D205

| ML20063D205 | |
|---|---|
| Site: | Waterford |
| Issue date: | 06/30/1982 |
| From: | ABB COMBUSTION ENGINEERING NUCLEAR FUEL (FORMERLY |
| To: | |
| Shared Package: | ML19268B078 |
| References: | CEN-208(C)-NP, CEN-208(C)-NP-R, CEN-208(C)-NP-R00, NUDOCS 8207010369 |
| Download: | ML20063D205 (28) |
WATERFORD 3, CYCLE 1
CPC/CEAC PHASE II SOFTWARE VERIFICATION TEST REPORT

WATERFORD STEAM ELECTRIC STATION UNIT NO. 3

JUNE, 1982

POWER SYSTEMS, COMBUSTION ENGINEERING, INC.
LEGAL NOTICE

THIS REPORT WAS PREPARED AS AN ACCOUNT OF WORK SPONSORED BY COMBUSTION ENGINEERING, INC. NEITHER COMBUSTION ENGINEERING NOR ANY PERSON ACTING ON ITS BEHALF:

A. MAKES ANY WARRANTY OR REPRESENTATION, EXPRESS OR IMPLIED INCLUDING THE WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE OR MERCHANTABILITY, WITH RESPECT TO THE ACCURACY, COMPLETENESS, OR USEFULNESS OF THE INFORMATION CONTAINED IN THIS REPORT, OR THAT THE USE OF ANY INFORMATION, APPARATUS, METHOD, OR PROCESS DISCLOSED IN THIS REPORT MAY NOT INFRINGE PRIVATELY OWNED RIGHTS; OR

B. ASSUMES ANY LIABILITIES WITH RESPECT TO THE USE OF, OR FOR DAMAGES RESULTING FROM THE USE OF, ANY INFORMATION, APPARATUS, METHOD OR PROCESS DISCLOSED IN THIS REPORT.
WATERFORD-3
CEN-208(C)-NP
REVISION 00

CPC/CEAC SYSTEM PHASE II SOFTWARE VERIFICATION TEST REPORT

JUNE 1982

Combustion Engineering, Inc.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut
ABSTRACT

Phase II Testing is performed on the CPC/CEAC System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.
This report presents the Phase II test results for the Louisiana Power &
Light Co., Waterford-3 CPC/CEAC Rev. 00 software.
The Phase II Testing was performed according to previously issued procedures (Reference 1).
The test results indicate that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and hardware and that the operation of the integrated system as modified is consistent with the performance predicted by design analyses.
Discussions of the investigation of test results falling outside acceptance criteria are provided.
TABLE OF CONTENTS

Section  Title                                             Page No.

1.0      INTRODUCTION                                          4
1.1      Objectives                                            4
1.2      Description of Phase II Testing                       5
1.3      Applicability                                         5
2.0      CPC/CEAC INPUT SWEEP TESTS                            6
2.1      CPC Input Sweep Test Case Selection                   6
2.1.1    CPC Processor Uncertainty Results                     6
2.1.2    Analysis of CPC Input Sweep Test Results              8
2.2      CEAC Input Sweep Test Case Selection                 10
2.2.1    CEAC Processor Uncertainty Results                   10
2.2.2    Analysis of CEAC Input Sweep Test Results            10
3.0      DYNAMIC SOFTWARE VERIFICATION TEST                   11
3.1      DSVT Case Selection                                  11
3.2      Generation of DSVT Acceptance Criteria               12
3.3      DSVT Results                                         18
3.4      Analysis of DSVT Results                             20
4.0      LIVE INPUT SINGLE PARAMETER TEST                     22
4.1      LISP Test Case Selection                             22
4.2      Generation of LISP Acceptance Criteria               22
4.3      LISP Test Results                                    23
4.4      Analysis of LISP Results                             24
5.0      PHASE II TEST RESULTS SUMMARY                        26
6.0      REFERENCES                                           27
1.0 INTRODUCTION
The verification of software modifications of the CPC/CEAC System consists of several steps which address two major areas of the modification process:
(1) Specification of software modifications
(2) Implementation of software modifications

The specification of software modifications is documented in the CPC and CEAC Functional Descriptions and the Data Base Document and is verified by design analyses contained in recorded calculations. The implementation of software modifications is documented in Software Design Specifications and assembly listings. The verification process for the modified software implementation includes Phase I and Phase II software verification tests.
The requirements of the Phase II software verification testing are based on the fact that the Phase I testing has been previously performed.
Successful completion of Phase I testing verifies the correct implementation of the modified software.
Phase II testing completes the software modification process by verifying that the integrated CPC System responds as expected.
This document contains the test results and conclusions for the Phase II software verification test.
1.1 Objectives
The primary objective of Phase II testing is to verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and system hardware.
In addition Phase II testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.
These objectives are achieved
by comparing the response of the integrated system to the response predicted by the CPC/CEAC FORTRAN simulation code.
This comparison is performed for a selected range of simulated static and dynamic input conditions.
1.2 Description of Phase II Testing
Phase II testing consists of the following tests: (1) Input Sweep Test, (2) Dynamic Software Verification Test, and (3) Live Input Single Parameter Test.
These tests are performed on a single channel CPC/CEAC System with integrated software that has undergone successful Phase I testing.
1.3 Applicability
This report applies to the Phase II testing performed on the Louisiana Power & Light Co., Waterford-3 CPC/CEAC system software.
The software revisions documented in this report are designated as Revision Number 00 to the Waterford-3 CPC/CEAC system software.
2.0 CPC/CEAC INPUT SWEEP TESTS
The Input Sweep Test is a real time exercise of the CEAC and CPC
application software and executive software with steady-state CPC and CEAC input values read from a storage device.
This test has the following objectives:
(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs.
(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space, and
(3) To complement Phase I module testing by identifying any abnormalities in the CPC and CEAC algorithms used in the system hardware which were not previously uncovered.
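To illustrate how these objectives translate into a test loop, the following is a minimal sketch in Python, with entirely hypothetical function and field names, since the actual CPC/CEAC test harness interfaces are not described in this report: each stored steady-state input case is paired with a reference-simulation result and a single-channel result, and the DNBR and LPD differences that feed the uncertainty statistics are recorded.

    # Hedged sketch of the Input Sweep comparison; all interfaces are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class SweepResult:
        case_id: int
        dnbr_diff: float    # simulation DNBR minus single-channel DNBR
        lpd_diff: float     # simulation LPD minus single-channel LPD (core average kw/ft)
        steady_state: bool  # did the channel settle to steady state after auto-restart?

    def run_input_sweep(cases, simulate_case, run_single_channel):
        """Compare reference-simulation and single-channel outputs for each
        steady-state input case; the caller supplies the two evaluators."""
        results = []
        for case in cases:
            sim = simulate_case(case)        # e.g. {"dnbr": ..., "lpd": ...}
            chan = run_single_channel(case)  # e.g. {"dnbr": ..., "lpd": ..., "steady": ...}
            results.append(SweepResult(
                case_id=case["id"],
                dnbr_diff=sim["dnbr"] - chan["dnbr"],
                lpd_diff=sim["lpd"] - chan["lpd"],
                steady_state=chan["steady"],
            ))
        return results

Cases whose recorded differences or steady-state behavior look anomalous would then be examined individually, which is the screening for software errors described in the sections that follow.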
2.1 CPC Input Sweep Test Case Selection
[  ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC
design qualification testing of the Revision 00 software.
2.1.1 CPC Processor Uncertainty Results
For each test case, differences in the results of the FORTRAN simulation code and CPC system were calculated.
A statistical analysis of these differences produced the processing uncertainties.
The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [  ] cases remained after these cases were eliminated.
The LPD statistics did not include those cases for which the LPD as calculated on either system was equal to or greater than the upper limit of [  ] core average kw/ft (= [  ] kw/ft). A total of [  ] cases remained after these cases were eliminated.
Although [  ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still included as Input Sweep test cases for the purpose of identifying potential software errors.
The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from Input Sweep for DNBR and LPD, respectively, are [  ] DNBR units and [  ] core average kw/ft. However, since the distribution of differences is so restrictive, the maximum error may be used (that is, the limits which encompass 100% of the differences).
This is more conservative and yet still results in small processor uncertainties.
Thus defined, the processor uncertainties for Revision 00 on DNBR and LPD are [  ] DNBR units and [  ] core average kw/ft, respectively.
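As an illustration of the two uncertainty definitions used above, the sketch below computes a one-sided normal-theory 95%/95% tolerance limit (via the standard noncentral-t k-factor) and the maximum observed error for a sample of differences. This is a generic formulation offered only as an aid to the reader; the report does not state which statistical procedure was actually applied, and the sample in the example is random filler, not test data.

    # Hedged sketch: 95/95 one-sided tolerance limit and maximum error for a
    # sample of DNBR (or LPD) differences, assuming the differences are
    # approximately normally distributed.
    import numpy as np
    from scipy.stats import nct, norm

    def one_sided_tolerance_limit(diffs, coverage=0.95, confidence=0.95):
        """Upper limit expected to bound `coverage` of the population of
        differences with the stated confidence (normal-theory k-factor)."""
        x = np.asarray(diffs, dtype=float)
        n = x.size
        k = nct.ppf(confidence, df=n - 1, nc=norm.ppf(coverage) * np.sqrt(n)) / np.sqrt(n)
        return x.mean() + k * x.std(ddof=1)

    def max_error(diffs):
        """The more conservative alternative: the limit that encompasses
        100% of the observed differences."""
        return float(np.max(np.abs(diffs)))

    # Example with random filler values (the actual differences are proprietary):
    rng = np.random.default_rng(0)
    sample = rng.normal(0.0, 1.0e-3, size=500)
    print(one_sided_tolerance_limit(sample), max_error(sample))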
2.1.2 Analysis of CPC Input Sweep Test Results
The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. The review results of the DNBR and LPD test cases outside the 95/95 tolerance limit will now be discussed.
For DNBR, there were [  ] cases below the lower tolerance limit of [  ] (DNBR units) and [  ] test cases above the upper tolerance limit of [  ] (DNBR units). With [  ] exceptions, the cases with the largest absolute differences in DNBR occurred with DNBR's greater than [  ]. Only [  ] of the exceptions had DNBR's of less than [  ] at a power level of [  ]. This is not a significant commonality since the differences are absolute (not relative) and it should be expected that the largest differences should occur at high DNBR's. It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.
For LPD the cases examined were: [  ] cases with differences below the lower 95/95 tolerance limit of [  ] (% of core average kw/ft), and [  ] cases with differences greater than the upper tolerance limit of [  ]. The largest percent error among the cases was [  ]. The common input to these test cases was found in other test cases with less maximum difference and less percent error. Examination of the inputs to all [  ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found.
It is therefore concluded that there is no indication of software errors in the Single Channel calculation of LPD.
2.2 CEAC Input Sweep Test Case Selection
[  ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the Revision 00 software. These test cases covered the entire CEAC operating space.
2.2.1 CEAC Processor Uncertainty Results
For each test case, differences between the CEAC FORTRAN simulation code and CEAC single channel system results were calculated.
The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level.
The processor uncertainties for the DNBR and the LPD penalty factor differences are [  ] and [  ], respectively.
2.2.2 Analysis of CEAC Input Sweep Test Results
The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. [  ] test cases had values of the DNBR or LPD penalty factor which exceeded the limit. However, none showed evidence of software errors and none had differences exceeding the limit of [  ]. In addition, none of the cases produced differences in the big penalty factor flag or the penalty factor word outputs. It was concluded that the results of the test cases did not indicate the existence of software errors.
3.0 DYNAMIC SOFTWARE VERIFICATION TEST
The Dynamic Software Verification Test (DSVT) is a real time exercise of the CPC application software and executive software with transient CPC input values read from a storage device.
This test has two objectives:
(1) To verify that the dynamic response of the integrated CPC software is consistent with that predicted by design analyses, and
(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.
Further information concerning DSVT may be found in Reference 1.
3.1 DSVT Case Selection
Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified. DSVT requires that, as a minimum, [  ] cases be selected for testing (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of a [  ]. All of the DSVT test cases were executed using the CPC/CEAC FORTRAN simulation code and the Single Channel facility with the Rev. 00 CPC software. In addition, [  ] cases, each consisting of [  ] subcases, were executed to test the CPC/CEAC response to reactor power cutback.
3.2 Generation of DSVT Acceptance Criteria
Acceptance criteria for DSVT are defined (in Reference 1) as a trip time and initial values of DNBR and LPD for each test case. These trip times and initial values are generated using the certified CPC/CEAC FORTRAN simulation code. Processing uncertainties obtained during Input Sweep testing are factored into the acceptance criteria for initial values of DNBR and LPD where necessary.
necessary.
Trip times are affected by program execution lengths as well as the Input Sweep uncertainties.
The minimum, maximum and average application program execution lengths calculated for i
the Revision 00 software modifications were used in this DSVT.
These execution lengths (in milliseconds) are listed below.
CPC Application Program Execution Lengths

    Program    Minimum (msec)    Average (msec)    Maximum (msec)
    FLOW           [  ]              [  ]              [  ]
    UPDATE         [  ]              [  ]              [  ]
    POWER          [  ]              [  ]              [  ]
    STATIC         [  ]              [  ]              [  ]
Each DSVT case was initially executed once with the average program execution lengths and nominal trip setpoints using the CPC/CEAC FORTRAN simulation code.
Following execution of the same cases using the Single Channel facility, cases which did not yield trip times equivalent to those calculated by the CPC FORTRAN code were re-analyzed.
Each of these DSVT cases was re-executed once with the minimum execution lengths and most conservative DNBR and LPD trip setpoints and once with the maximum execution lengths and least conservative DNBR and LPD trip setpoints. This results in a bandwidth of trip times for each test case which contains the effects of processing uncertainties and variations in application program execution lengths.
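A minimal sketch of how such an acceptance band could be assembled is given below; simulate_trip_time is a hypothetical stand-in for the CPC/CEAC FORTRAN simulation code, and the dictionary keys are illustrative only, not names taken from the actual test procedures.

    # Sketch of the trip-time acceptance band: one bounding run with minimum
    # execution lengths and most conservative setpoints, one with maximum
    # lengths and least conservative setpoints.
    def trip_time_band(case, exec_lengths, setpoints, simulate_trip_time):
        """Return (earliest, latest) acceptable trip time, in seconds, for one case."""
        earliest = simulate_trip_time(case,
                                      lengths=exec_lengths["minimum"],
                                      setpoints=setpoints["most_conservative"])
        latest = simulate_trip_time(case,
                                    lengths=exec_lengths["maximum"],
                                    setpoints=setpoints["least_conservative"])
        return earliest, latest

    def within_band(measured_trip_time, band):
        """Acceptance check for the trip time recorded on the single channel."""
        earliest, latest = band
        return earliest <= measured_trip_time <= latest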
The software DSVT program also includes a [  ] millisecond interrupt cycle in order to check for DNBR and LPD trip signals. This results in a [  ] millisecond interval limit on trip time resolution which is factored into the acceptance criteria. The following tables contain the final DSVT acceptance criteria for initial values and trip times for DNBR and LPD.
The values listed are those calculated and truncated to the same number of digits observed on the Single Channel facility Operator's module.
Acceptance Criteria for DNBR and LPD Initial Values
(DNBR Units and kw/ft., respectively)

    Test Case    DNBR (Min.)    DNBR (Max.)    LPD (Min.)    LPD (Max.)
    [                                                                  ]

Acceptance Criteria for DNBR and LPD Initial Values
(DNBR Units and kw/ft., respectively) (Cont.)

    Test Case    DNBR (Min.)    DNBR (Max.)    LPD (Min.)    LPD (Max.)
    [                                                                  ]
Acceptance Criteria for DNBR and LPD Trip Times (seconds)

    Test Case    DNBR Trip (Min.)    DNBR Trip (Max.)    LPD Trip (Min.)    LPD Trip (Max.)
    [                                                                                      ]

Acceptance Criteria for DNBR and LPD Trip Times (seconds) (Cont.)

    Test Case    DNBR Trip (Min.)    DNBR Trip (Max.)    LPD Trip (Min.)    LPD Trip (Max.)
    [                                                                                      ]
3.3 DSVT Results
The Dynamic Software Verification Test was executed on the Single Channel Facility using the Revision 00 CPC software.
The DSVT results are contained below.
DSVT Results

    Test Case    Initial DNBR (DNBR Units)    Initial LPD (kw/ft.)    DNBR Trip (sec.)    LPD Trip (sec.)
    [                                                                                                    ]

DSVT Results (Cont.)

    Test Case    Initial DNBR (DNBR Units)    Initial LPD (kw/ft.)    DNBR Trip (sec.)    LPD Trip (sec.)
    [                                                                                                    ]
3.4 Analysis of DSVT Results
For all test cases, with the exception of [  ], the initial values of DNBR and LPD were within those defined by the CPC/CEAC FORTRAN simulation code generated bandwidths which include the processing uncertainties obtained from the CPC Input Sweep Test.
Test cases [  ] are cases with the plant initially in a trip condition. In the CPC FORTRAN simulation code, one program execution cycle is needed to generate a trip output. This implies an acceptance criterion of [  ] sec. for minimum and maximum time-to-trip, while the actual trip times for the CPC single channel were [  ] sec. These FORTRAN cases were examined to verify that a trip condition existed at time [  ], justifying the indicated acceptance criteria for time-to-trip of 0.0 sec. consistent with the expected CPC single channel response.
4.0 LIVE INPUT SINGLE PARAMETER TEST
The Live Input Single Parameter test is a real-time exercise of the CPC/CEAC application and executive software, with transient CPC/CEAC input values generated from an external source and read through the CPC/CEAC input hardware.
The objectives of this test are:
(1) To verify that the dynamic response of the integrated CPC/CEAC software and hardware is consistent with that predicted by design analyses.
(2) To supplement design documentation quality assurance, Phase I module tests, Input Sweep Tests, and DSVT in assuring correct implementation of software modifications.
(3) To evaluate the integrated hardware/software system during operational modes approximating plant conditions.
4.1 LISP Test Case Selection
Reference 1 identifies the test cases to be used for LISP.
These cases are the single variable dynamic transient test cases from the Phase II test series.
In addition, a new test case is included to test the Reactor Power Cutback (RPC) feature.
These test cases, which are applicable to Waterford-3, consist of [  ].

4.2 Generation of LISP Acceptance Criteria
The acceptance criteria for LISP are based on trip times for the dynamic test cases.
For the new RPC test case, there should be no tripping during RPC.
These cases are simulated in the CPC FORTRAN simulation code and contain the following adjustment components: [  ]. Application program execution lengths used for LISP testing were the same as those for DSVT, with the addition of CEAC minimum and maximum execution lengths of [  ] and [  ] msec, respectively.
The final acceptance criteria (generated by the CPC FORTRAN simulation code and adjusted for the above components) for LISP are contained in the following table.
    Test Case    Minimum Trip Time (seconds)    Maximum Trip Time (seconds)
    [                                                                      ]

4.3 LISP Test Results
The [  ] dynamic transients were executed on the CPC Single Channel facility.
The recorded trip times (in seconds) for each case are listed in the following table:
    Run    Recorded Trip Times (seconds)
    [                                   ]
All recorded trip times meet the final acceptance criteria for LISP. The result of test case [  ] shows that the RPC feature meets the design requirements.
Major aspects of the operator's module operation, particularly the Point ID verification and addressable constant range limits, were tested within the full capability of the CPC/CEAC Single Channel test facilities. As part of the testing, the CPC Point ID table was checked to assure that the CPC Point IDs displayed on the operator's module are the same as those listed in the Point ID table. The addressable constants and the addressable constant range limits were verified to be correct. CEAC Point ID verification includes checking of all the inputs and outputs of the CEAC.
All aspects of automated reentry of Addressable Constants were tested and were determined to have been correctly implemented.
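As a simple illustration of the Point ID and range-limit checks described above (using hypothetical identifiers and limits, since the actual Point ID table and addressable constant ranges are not reproduced here):

    # Sketch of the two operator's-module checks: displayed Point IDs must match
    # the Point ID table, and addressable constant entries must respect their
    # range limits.  All values below are placeholders.
    def verify_point_ids(displayed_ids, point_id_table):
        """Return any Point IDs shown on the operator's module that are not in
        the reference Point ID table (an empty list means the check passes)."""
        reference = set(point_id_table)
        return [pid for pid in displayed_ids if pid not in reference]

    def check_addressable_constant(value, range_limits):
        """Accept the entry only if it lies within the documented range limits."""
        low, high = range_limits
        return low <= value <= high

    # Placeholder usage:
    assert verify_point_ids(["T-101", "P-205"], ["T-101", "P-205", "F-310"]) == []
    assert check_addressable_constant(1.25, (0.0, 2.0)) is True
    assert check_addressable_constant(2.5, (0.0, 2.0)) is False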
4.4 Analysis of LISP Results

[  ]
5.0 PHASE II TEST RESULTS SUMMARY
The Phase II software verification tests have been performed as required in Reference 1. In all cases, the test results fell within the acceptance criteria or the results were analyzed to identify the reason for the discrepancy.
Based on these results, the Phase II test objectives have been successfully achieved.
It is concluded that the CPC/CEAC software modification described as Revision 00 has been properly integrated with the CPC and CEAC software and system hardware and that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.
6.0 REFERENCES
1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-P, Revision 02, December 21, 1978.
2. CPC/CEAC Software Modifications for Waterford-3, CEN-197(C)-P, March 1982.
3. Waterford-3 Cycle 1 CPC/CEAC Phase I Software Verification Test Report, CEN-209(C)-P, Revision 00, June 1982.