ML19277A068


Nonproprietary Version of CPC/CEAC System Phase II Software Verification Test Report
ML19277A068
Person / Time
Site: Arkansas Nuclear
Issue date: 06/16/1981
From:
ABB COMBUSTION ENGINEERING NUCLEAR FUEL (FORMERLY
To:
Shared Package
ML19277A067 List:
References
CEN-162(A)-NP, NUDOCS 8107300146
Download: ML19277A068 (27)


Text

ARKANSAS NUCLEAR ONE - UNIT 2
DOCKET 50-368
CEN-162(A)-NP, REVISION 00

CPC/CEAC SYSTEM PHASE II SOFTWARE VERIFICATION TEST REPORT

JUNE 16, 1981

Combustion Engineering, Inc.
Nuclear Power Systems
Power Systems Group
Windsor, Connecticut

LEGAL NOTICE

This response was prepared as an account of work sponsored by Combustion Engineering, Inc. Neither Combustion Engineering nor any person acting on its behalf:

a. Makes any warranty or representation, express or implied, including the warranties of fitness for a particular purpose or merchantability, with respect to the accuracy, completeness, or usefulness of the information contained in this response, or that the use of any information, apparatus, method, or process disclosed in this response may not infringe privately owned rights; or

b. Assumes any liabilities with respect to the use of, or for damages resulting from the use of, any information, apparatus, method or process disclosed in this response.


ABSTRACT

Phase II Testing is performed on the DNBR/LPD Calculator System to (1) verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and the system hardware and (2) provide confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

This report presents the Phase II test results for the Arkansas Power and Light ANO-2 plant CPC/CEAC Rev. 04 software.

The Phase II Testing was performed according to previously issued procedures (Reference 1). The test results indicate that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and hardware and that the operation of the integrated system as modified is consistent with the performance predicted by design analyses.

This document was prepared and reviewed in accordance with sections 5.2 and 5.4, respectively, of QtsDP, Revision 14.


TABLE OF CONTENTS

Section   Title                                        Page No.
1.0       INTRODUCTION                                     4
1.1       Objectives                                       4
1.2       Description of Phase II Testing                  5
1.3       Applicability                                    5
2.0       CPC/CEAC INPUT SWEEP TESTS                       6
2.1       CPC Input Sweep Test Case Selection              6
2.1.1     CPC Processor Uncertainty Results                6
2.1.2     Analysis of CPC Input Sweep Test Results         7
2.2       CEAC Input Sweep Test Case Selection            10
2.2.1     CEAC Processor Uncertainty Results              11
2.2.2     Analysis of CEAC Input Sweep Test Results       11
3.0       DYNAMIC SOFTWARE VERIFICATION TEST              12
3.1       DSVT Test Case Selection                        12
3.2       Generation of DSVT Acceptance Criteria          13
3.3       DSVT Test Results                               19
4.0       LIVE INPUT SINGLE PARAMETER TESTING             23
4.1       LISP Test Case Selection                        23
4.2       Generation of LISP Acceptance Criteria          23
4.3       LISP Test Results                               24
5.0       PHASE II TEST RESULTS SUMMARY                   26
6.0       REFERENCES                                      27

1.0 INTRODUCTION

The verification of software modifications of the DNBR/LPD Calculation System consists of several steps which address two major areas of the modification process:

(1) Specification of software modifications
(2) Implementation of software modifications

The specification of the software modifications is documented in the CPC/CEAC Functional Descriptions and Data Base Document and is verified by design analysis contained in recorded calculations.

The implementation of software modifications is documented in Software Design Specifications and assembly listings. The verification process for the modified software implementation includes Phase I and Phase II software verification tests.

The requirements of the Phase II Software Verification Testing are based on the fact that the Phase I Testing will be performed. Successful completion of Phase I Testing verifies the correct implementation of the modified software. Phase II Testing completes the software modification process by verifying that the integrated CPC System responds as expected.

This document contains the test results and conclusions for the Phase II Testing.

1.1 Objectives

The primary objective of Phase II testing is to verify that the CPC and CEAC software modifications have been properly integrated with the CPC and CEAC software and the system hardware. In addition, Phase II testing provides confirmation that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses. These objectives are achieved by comparing the response of the integrated system to the response predicted by the CPC FORTRAN Simulation Code. This comparison is performed for a selected range of simulated static and dynamic input conditions.

1.2 Description of Phase II Testing

Phase II testing consists of the following tests:

(1) Input Sweep Test,
(2) Dynamic Software Verification Test, and
(3) Live Input Single Parameter Test.

These tests are performed on a single channel CPC/CEAC System with integrated software that has undergone successful Phase I testing.

1.3 Applicability

This report applies to the Phase II testing performed on the Arkansas Power and Light ANO-2 plant CPC/CEAC system software. The software revisions documented in this report are designated as Modification Number 4 to the ANO-2 CPC/CEAC system software.

2.0 CPC/CEAC INPUT SWEEP TESTS

The Input Sweep Test is a real time exercise of the CEAC and CPC application software and executive software with steady-state CPC and CEAC input values read from a storage device. This test has the following objectives:

(1) To determine the processing uncertainties that are inherent in the CPC and CEAC designs,
(2) To verify the ability of the CPC and CEAC algorithms used in the system hardware to initialize to a steady state after an auto-restart for each of a large number of input combinations within the CPC/CEAC operating space, and
(3) To complement Phase I module testing by identifying any abnormalities in the CPC and CEAC algorithms used in the system hardware which were not uncovered previously.
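The case-by-case comparison behind these objectives can be sketched as follows. This is only an illustration, not the actual test software; the data layout, function name, and example numbers are assumptions, and the report's real values are proprietary.

 # Minimal sketch of the input-sweep comparison, assuming the FORTRAN-predicted
 # and single-channel steady-state values have already been collected into
 # parallel lists of (DNBR, LPD) pairs; the actual data handling used for the
 # test is not described in this report.
 def sweep_differences(fortran_results, channel_results):
     """Pair predicted and measured (DNBR, LPD) values case by case and return
     the per-case differences later used for the uncertainty statistics."""
     diffs = []
     for case_id, (pred, meas) in enumerate(zip(fortran_results, channel_results), start=1):
         diffs.append({
             "case": case_id,
             "dnbr_diff": meas[0] - pred[0],  # DNBR units
             "lpd_diff": meas[1] - pred[1],   # core average kw/ft
         })
     return diffs

 # Example with made-up numbers only:
 example = sweep_differences([(1.30, 12.0)], [(1.31, 12.1)])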

2.1 CPC Input Sweep Test Case Selection

[ ] test cases, each involving different combinations of process inputs and addressable constants, were used for CPC design qualification testing of the Revision 04 software.

2.1.1 CPC Processor Uncertainty Results

For each test case, differences between FORTRAN simulation and CPC system results were calculated. A statistical analysis of these differences produced the processing uncertainties.

The DNBR statistics did not include those cases for which the DNBR as calculated on either system was at the limits. This is because a difference of zero (or close to zero) would be computed and would incorrectly weight the distribution of differences. A total of [ ] cases remained after these cases were eliminated. The LPD statistics did not include those cases for which the LPD as calculated on either system was equal to or greater than the upper limit of [ ] core average kw/ft ([ ] kw/ft). A total of [ ] cases remained after these cases were eliminated.

Although [ ] cases were not included in the computation of DNBR and LPD statistics, respectively, they were still present for the purpose of identifying software errors.

The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD differences for all test cases with a 95% confidence level. The processor uncertainties determined from Input Sweep for DNBR and LPD are [ ] DNBR units and [ ] core average kw/ft, respectively. However, since the distribution of differences is so tight, the maximum error may be used (that is, the limits which encompass 100% of the differences). This is more conservative and yet still results in low processor uncertainties. Thus defined, the processor uncertainties for Revision 04 for DNBR and LPD are [ ] DNBR units and [ ] core average kw/ft, respectively.
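For context, one common way to compute such a one-sided 95/95 tolerance limit, assuming the differences are approximately normally distributed (an assumption this report does not state), is sketched below, together with the maximum-error alternative adopted for Revision 04. None of the numerical inputs here come from the report.

 # Sketch of a one-sided 95%/95% tolerance limit on a set of DNBR (or LPD)
 # differences, assuming normality; cases pinned at the DNBR/LPD limits would
 # be removed from diffs beforehand, as described above.
 import math
 from scipy.stats import nct, norm

 def one_sided_95_95(diffs):
     n = len(diffs)
     mean = sum(diffs) / n
     sdev = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
     # Exact one-sided normal tolerance factor for 95% coverage with 95%
     # confidence, obtained from the noncentral t distribution.
     k = nct.ppf(0.95, df=n - 1, nc=norm.ppf(0.95) * math.sqrt(n)) / math.sqrt(n)
     return mean + k * sdev

 def maximum_error(diffs):
     # The more conservative alternative used for Revision 04: bound 100% of
     # the observed differences.
     return max(abs(d) for d in diffs)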

2.1.2 Analysis of CPC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. A review of the statistical analysis of the [ ] test cases indicated [ ].


The review results of the DNBR and LPD test cases outside the 95/95 tolerance limit will now be discussed. For DNBR there were [ ] cases below the lower tolerance limit of [ ] (DNBR units) and [ ] test cases above the upper tolerance limit of [ ] (DNBR units).


For the [ ] DNBR cases above the 95/95 tolerance level, the greatest percent error was [ ]. The remaining [ ] test cases had percent errors less than [ ] (absolute). The common input data to these test cases was found in other test cases with less maximum difference and less percent error. It is therefore concluded that no errors are indicated in the CPC Single Channel DNBR program.

For LPD the cases examined were: [ ] cases with differences below the lower 95/95 tolerance limit of [ ] (% of core average kw/ft), [ ] cases with differences greater than the upper tolerance limit of [ ], and [ ] cases with LPD values greater than [ ] of core average kw/ft and with differences outside the above stated tolerance limits. For the LPD cases with values above [ ] of core average kw/ft, the largest percent error was [ ]. The size of this percent error term indicates that the differences between the CPC Single Channel and the FORTRAN simulation are due to machine differences in accuracy when calculating large numbers.

For the [ ] test cases with LPD values less than [ ] of core average kw/ft, [ ] cases had percent errors greater than [ ]. The largest percent error was [ ]. Examination of the inputs to these cases showed no common input. Examination of the inputs to all [ ] LPD cases outside the tolerance limits showed that the inputs covered a wide spectrum. No common area was found. It is therefore concluded that there is no indication of software errors in the Single Channel calculation of LPD.
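The percent-error screening described above can be pictured as below. The definition of percent error (relative to the FORTRAN value) and the threshold shown are assumptions for illustration; the report's actual thresholds and values are proprietary.

 # Sketch of the percent-error screening applied to cases falling outside the
 # 95/95 tolerance limits; the 0.1% threshold is a placeholder, not a value
 # taken from this report.
 def percent_error(channel_value, fortran_value):
     """Signed percent error of the single-channel result relative to the
     FORTRAN simulation result (assumed definition)."""
     return 100.0 * (channel_value - fortran_value) / fortran_value

 def flag_for_review(cases, threshold_pct=0.1):
     """Return the cases whose absolute percent error exceeds the threshold;
     these are the ones whose inputs are examined for a common cause."""
     return [c for c in cases
             if abs(percent_error(c["channel"], c["fortran"])) > threshold_pct]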

2.2 CEAC Input Sweep Test Case Selection

[ ] test cases, each involving different combinations of CEAC process inputs, were used for CEAC design qualification testing of the Revision 04 software. These test cases covered all CEAC operating space.

2.2.1 CEAC Processor Uncertainty Results

For each test case, differences between FORTRAN simulation and CEAC system results were calculated. The processor uncertainties for DNBR and LPD are defined as the one-sided tolerance limits which encompass 95% of the distribution of DNBR and LPD penalty factor differences for all test cases with a 95% confidence level. The processor uncertainties for the DNBR and the LPD penalty factor differences are [ ] and [ ], respectively.

2.2.2 Analysis of CEAC Input Sweep Test Results

The results of the test cases exceeding the 95/95 tolerance limit were analyzed for evidence of software errors. [ ] test cases had differences in the big penalty factor flag.

The [ ] test cases with differences in the big penalty factor flag were examined. Results indicated that these differences were due to implementation differences between the CEAC software and the CEAC FORTRAN and not due to software or FORTRAN programming errors. These implementation differences do not impact calculation of the CEAC penalty factor output words. It was concluded that the results of the [ ] test cases did not indicate the existence of software errors.

3.0 DYNAMIC SOFTWARE VERIFICATION TEST

The Dynamic Software Verification Test (DSVT) is a real time exercise of the CPC application software and executive software with transient CPC input values read from a storage device. This test has two objectives:

(1) To verify that the dynamic response of the integrated CPC software is consistent with that predicted by design analyses, and
(2) To supplement design documentation quality assurance, Phase I module tests, and Input Sweep Tests in assuring correct implementation of software modifications.

Further information concerning DSVT may be found in Reference 1.

3.1 DSVT Test Case Selection

Test cases for DSVT are selected to exercise dynamic portions of the CPC software with emphasis on those portions of the software that have been modified. The major modifications made by the Revision 04 changes are:

(1) CEAC Logic and Data Base changes to allow proper operation with plants containing 2-CEA subgroups.
(2) Replacement of the COSMO/W-3 based DNBR calculation with CETOP2 based on the TORC/CE-1 DNBR correlation. This change totally replaces the STATIC program and modifies the DNBR UPDATE calculation.
(3) Changes to curve-fitting routines, modelling core power distribution for more precise CEA configurations and the use of additional corrections and offsets to yield improved accuracies.
(4) Algorithm simplifications in the POWER program yielding improved computer efficiency.
(5) Additional addressable constants to facilitate changing constants likely to vary during plant life, and to allow clearance of the CEAC snapshot buffer and rewriting of the entire CRT display.

For more detail on Revision 04 software modifications, see Reference 2.

DSVT requires that as a minimum [ ] cases be selected for test (Reference 1). These cases are from the Phase II test series (identified in Reference 1) and consist of [ ]. Because the changes to the program algorithms were significant, all of the DSVT test cases were executed on the CPC FORTRAN simulation code and on the Single Channel facility. This ensures that all dynamic portions of the CPC software are adequately exercised.

3.2 Generation of DSVT Acceptance Criteria

Acceptance criteria for DSVT are defined (in Reference 1) as a trip time and initial values of DNBR and LPD for each test case. These trip times and initial values are generated using the certified CPC FORTRAN simulation code. Processing uncertainty obtained during Input Sweep Testing is factored into the acceptance criteria. Trip times are also affected by program execution lengths. The minimum and maximum program execution lengths for the Revision 04 software modifications were calculated and were used in DSVT. These execution lengths (in milliseconds) are listed below.

Program   Minimum   Maximum
FLOW      [ ]       [ ]
UPDATE    [ ]       [ ]
POWER     [ ]       [ ]
STATIC    [ ]       [ ]

Each DSVT case was executed once with the minimum execution lengths and most conservative DNBR and LPD trip setpoints and once with the maximum execution lengths and least conservative DNBR and LPD trip setpoints. This results in a bandwidth of DNBR and LPD trip times.

The final DSVT acceptance criteria bandwidths contain the effects of processing uncertainties and program execution lengths. The software DSVT program also includes a [ ] millisecond interrupt cycle in order to check for DNBR and LPD trip signals. This results in a [ ] millisecond interval limit on trip time resolution which is factored into the acceptance criteria. The tables following the sketch below contain the final DSVT acceptance criteria for initial values and trip times of DNBR and LPD.
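As a rough illustration of how the pieces just listed could combine into a trip-time acceptance band (the combination rules and every numeric value here are assumptions, since the report's criteria are proprietary):

 # Sketch of assembling a DSVT trip-time acceptance band from a FORTRAN run
 # with minimum execution lengths / most conservative setpoints and a run with
 # maximum execution lengths / least conservative setpoints, widened by the
 # trip-signal polling resolution. One plausible arrangement only.
 def trip_time_band(t_fast_run_s, t_slow_run_s, resolution_ms):
     """t_fast_run_s: FORTRAN trip time with minimum execution lengths and the
     most conservative setpoints; t_slow_run_s: trip time with maximum
     execution lengths and the least conservative setpoints."""
     resolution_s = resolution_ms / 1000.0
     # The single channel reports a trip only at its polling interval, so the
     # upper edge is extended by one interrupt cycle.
     return (t_fast_run_s, t_slow_run_s + resolution_s)

 # Hypothetical numbers only:
 low, high = trip_time_band(t_fast_run_s=2.40, t_slow_run_s=2.85, resolution_ms=100)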


DNBR and LPD Initial Values (DNBR units and kw/ft, respectively)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[Table values redacted in the non-proprietary version.]

DNBR and LPD Initial Values (DNBR units and kw/ft, respectively) (Cont.)

Test Case   DNBR (Min.)   DNBR (Max.)   LPD (Min.)   LPD (Max.)
[Table values redacted in the non-proprietary version.]

DNBR and LPD Trip Times (seconds)

Test Case   DNBR Trip (Min.)   DNBR Trip (Max.)   LPD Trip (Min.)   LPD Trip (Max.)
[Table values redacted in the non-proprietary version.]

DNBR and LPD Trip Times (seconds) (Cont.)

Test Case   DNBR Trip (Min.)   DNBR Trip (Max.)   LPD Trip (Min.)   LPD Trip (Max.)
[Table values redacted in the non-proprietary version.]

3.3 DSVT Test Results

The Dynamic Software Verification Test was executed on the Single Channel facility using the Revision 04 CPC software. The DSVT test results are contained below.

Test Case   DNBR (DNBR Units)   LPD (kw/ft)   DNBR Trip (sec)   LPD Trip (sec)
[Table values redacted in the non-proprietary version.]

DSVT Test Results (Cont.)

Test Case   DNBR (DNBR Units)   LPD (kw/ft)   DNBR Trip (sec)   LPD Trip (sec)
[Table values redacted in the non-proprietary version.]

For some test cases, the initial values of DNBR or LPD, or the DNBR or LPD trip times, were outside those defined by the FORTRAN generated bandwidths. These cases were evaluated individually and there is no evidence of software errors. Thus, the objectives of DSVT have been met.

All the initial values of DNBR and LPD were within those defined by the FORTRAN generated bandwidths with the exception of [ ] cases. These were test cases [ ] with initial LPD values at [ ] kw/ft, respectively, corresponding to [ ] of rated average power density. The processing uncertainties used to generate the bandwidths are valid only for normal power levels up to [ ] of rated average power. The magnitude of the differences between the FORTRAN value and the Single Channel value was extremely small when compared to their initial values. These differences are due to the differences in machine precision between the CPC hardware (i.e., the Interdata 7/16) and the CDC 7500 (on which the CPC FORTRAN simulator is executed), in conjunction with the processing uncertainty range mentioned above. In each of these cases, both the single channel and the FORTRAN initialized with a DNBR and LPD trip because of the magnitudes of the inputs.
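The machine-precision effect described above can be illustrated with modern IEEE floating-point word lengths (the Interdata and CDC machines used different representations, so this is an analogy rather than a reproduction):

 # The same constant stored in 32-bit and 64-bit floating point differs by a
 # tiny relative amount; differences of this order, while nonzero, are
 # negligible against the initial DNBR/LPD values, which is the point made
 # above about the CPC hardware versus the FORTRAN host machine.
 import numpy as np

 x32 = float(np.float32(0.1))   # 0.1 rounded to 32-bit precision
 x64 = float(np.float64(0.1))   # 0.1 rounded to 64-bit precision
 relative_difference = abs(x32 - x64) / x64
 print(relative_difference)     # on the order of 1e-8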

Test cases [ ] were cases with the plant in the trip condition initially. In the FORTRAN simulation, one program execution cycle was required to generate a trip output; therefore the minimum and maximum trip times from the FORTRAN were [ ] seconds while the trip times from the Single Channel were [ ] seconds. Proper software initialization was verified for these cases by checking the FORTRAN outputs at time [ ] seconds and verifying that the FORTRAN did indeed calculate a trip condition.

The LPD trip times in test cases [ ] and the DNBR trip times in test cases [ ] were [ ] outside of the FORTRAN generated bandwidth. Investigation of this anomaly revealed the following: [ ]

It was concluded that the DSVT results did not indicate the existence of software error.

4.0 LIVE INPUT SINGLE PARAMETER TESTING

The Live Input Single Parameter test program is a real-time exercise of the CPC/CEAC application and executive software, with transient CPC/CEAC input values generated from an external source and read through the CPC/CEAC input hardware. The objectives of this test are:

(1) To verify that the dynamic response of the integrated CPC/CEAC software and hardware is consistent with that predicted by design analyses.
(2) To supplement design documentation quality assurance, Phase I module tests, input sweep tests, and DSVT testing in assuring correct implementation of software modifications.
(3) To evaluate the integrated hardware/software system during operational modes approximating plant conditions.

4.1 LISP Test Case Selection

Reference 1 identifies the test cases to be used for LISP. These cases are the single variable dynamic transient test cases from the Phase II test series. These test cases, which are applicable to ANO-2 cycle 2, consist of [ ].

4.2 Generation of LISP Acceptance Criteria

The acceptance criteria for LISP are based on trip times for the dynamic test cases.

These cases are simulated in FORTRAN and contain the following adjustment components. Program execution lengths used for LISP testing were the same as those for DSVT, with the addition of CEAC minimum and maximum execution lengths ([ ] msec, respectively). The final acceptance criteria (generated by the FORTRAN code and adjusted for the above components) for LISP are contained in the following table.

Test Case   Minimum Trip Time (seconds)   Maximum Trip Time (seconds)
[Table values redacted in the non-proprietary version.]

4.3 LISP Test Results

The [ ] dynamic transients were executed on the Single Channel facility. The recorded trip times (in seconds) for each case are listed in the following table:

[Table values redacted in the non-proprietary version.]
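The acceptance check applied to each recorded trip time amounts to a simple band test, sketched below with placeholder numbers (the real criteria are proprietary):

 # Minimal sketch of the LISP acceptance check: a recorded trip time passes if
 # it lies within the FORTRAN-generated minimum/maximum band for its test case.
 def meets_criteria(recorded_trip_s, min_trip_s, max_trip_s):
     return min_trip_s <= recorded_trip_s <= max_trip_s

 # Hypothetical example:
 assert meets_criteria(recorded_trip_s=2.6, min_trip_s=2.4, max_trip_s=2.9)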


All recorded trip times meet the final acceptance criteria for LISP.

Major aspects of the operator's module operation, particularly the point ID verification and addressable constant range limits, were tested. As part of the testing, the CPC and CEAC Point ID tables were checked to assure that the Point IDs displayed on the operator's module are the same as those listed in the Point ID tables. During the check of the CPC Point ID table, [ ]. The aspects of automated reentry of addressable constants were tested and found to be acceptable.


5.0 PHASE II TEST RESULTS SUMMARY

The Phase II software verification tests have been performed as required in Reference 1. In all cases the test results fell within the acceptance criteria, except those cases discussed in Chapter 3. These cases were analyzed and results indicated that they were not due to software or FORTRAN errors. Based on these results, the Phase II test objectives have been successfully achieved. It is concluded that the CPC and CEAC software modifications described as Revision 04 have been properly integrated with the CPC and CEAC software and system hardware and that the static and dynamic operation of the integrated system as modified is consistent with that predicted by design analyses.

6.0 REFERENCES

1. CPC Protection Algorithm Software Change Procedure, CEN-39(A)-P, Revision 02, December 21, 1978.

2. CPC/CEAC Software Modifications for Arkansas Nuclear One - Unit 2, CEN-143(A)-P, Revision 00, December 1980.