ML20042G145

Eagle 21 Process Protection Sys Replacement Hardware Verification & Validation Final Rept
Person / Time
Site: Sequoyah  Tennessee Valley Authority icon.png
Issue date: 04/30/1990
From: Erin L
WESTINGHOUSE ELECTRIC COMPANY, DIV OF CBS CORP.
To:
Shared Package
ML20042G144 List:
References
WCAP-12588, NUDOCS 9005110142


Text

[Scanned cover page: image residue with no recoverable text beyond the "Westinghouse Energy" logo.]

Westinghouse Proprietary Class 3

WCAP-12588

SEQUOYAH EAGLE 21 PROCESS PROTECTION SYSTEM REPLACEMENT HARDWARE VERIFICATION AND VALIDATION FINAL REPORT

by L. E. Erin

Approved by D. G. Theriault, Manager, Software Reliability

April 1990

ABSTRACT

This report documents the implementation of the Eagle 21 Replacement Hardware Design, Verification and Validation Plan for Sequoyah.

The report summarizes the Verification and Validation program results, which demonstrate that the Sequoyah Eagle 21 process protection system functional upgrade has been satisfactorily completed in accordance with all of its functional and design requirements.


TABLE OF CONTENTS

SECTION  TITLE                                           PAGE
1.0      SUMMARY                                         1-1
2.0      EAGLE 21 SYSTEM FUNCTION OVERVIEW               2-1
3.0      VERIFICATION AND VALIDATION PROCESS PHILOSOPHY  3-1
4.0      SUMMARY OF VERIFICATION ACTIVITIES              4-1
5.0      SUMMARY OF VALIDATION ACTIVITIES                5-1

LIST OF FIGURES

FIGURE  TITLE                                         PAGE
3-1     Design, Verification and Validation Process   3-4
4-1     Verification Problem Reports                  4-3
5-1     Validation Problem Reports                    5-2

1.0 SUMMARY

The Tennessee Valley Authority has purchased and will install a microprocessor-based system to replace all 13 racks of the analog process protection system at Sequoyah Unit 1.

The microprocessor-based equipment is the Eagle 21 Process Protection System Replacement Hardware. This equipment performs the following major functions:

1. Reactor Trip Protection (Channel Trip to Voting Logic)
2. Engineered Safeguard Features (ESF) Actuations
3. Isolated Outputs to Control Systems, Control Panels, and Plant Computer
4. Isolated Outputs to Information Displays for Post Accident Monitoring (PAM) Indication
5. Automatic Surveillance Testing to verify channel performance

A brief description of the Eagle 21 System hardware architecture and related functions is given in Section 2.0.

A comprehensive Verification and Validation (V&V) program was conducted in accordance with Regulatory Guide 1.152 and ANSI/IEEE/ANS 7-4.3.2 to ensure the functionality of the system to a level commensurate with that described in the system requirements.

The Eagle 21 Replacement Hardware Design, Verification and Validation Plan is documented by Design Specification 408A47, Revision 3, dated May 12, 1989. A brief discussion of the V&V program is provided in Section 3.0 of this report.

This final report presents the results of the V&V program conducted on the Eagle 21 System.

1-1

The software verification for the Eagle 21 System was completed in April 1990, with the total number of software units involved being 1100. For these units, a total of 658 verification problem reports were generated.

All verification problem reports generated were resolved. All changes to the software documentation were reviewed and/or tested to demonstrate successful resolution of the problems found.

The system validation program for the Eagle 21 System was also completed in April 1990, including 21 comprehensive tests and 47 hardware/software reviews. The hardware/software reviews and validation tests have been satisfactorily completed. All validation problem reports generated were successfully resolved.

It should be noted that none of the errors identified in the validation problem reports were errors that would be expected to be identified during the verification process. All problem reports generated during the validation process are in areas specific to validation.

The Eagle 21 functional upgrade implemented for Sequoyah Unit 1 is demonstrated to meet its functional and design requirements.

1-2

2.0 EAGLE 21 SYSTEM FUNCTION OVERVIEW

The Westinghouse Eagle 21 microprocessor-based process protection upgrade system is applicable to those instrument systems which are "safety-related" as defined by IEEE Std. 279-1971, "Criteria for Protection Systems for Nuclear Power Generating Stations". The Eagle 21 portion of process instrumentation includes all necessary devices with the exception of transmitters, indicators, and recorders.

The Westinghouse Eagle 21 microprocessor-based process protection system is a functional replacement for existing analog process protection equipment used to monitor process parameters at nuclear generating stations and initiate actuation of the reactor trip and engineered safeguards systems.

Features of the Eagle 21 equipment include the following:

A. Automatic surveillance testing to significantly reduce the time required to perform surveillance tests.

B. Self calibration to eliminate rack drift and time consuming calibration procedures.

C. Self diagnostics to reduce the time required for troubleshooting.

D. Significant expansion capability to easily accommodate functional upgrades and plant improvements.

E. Modular design to allow for a phased installation into existing process racks and use of existing field terminations.

The Eagle 21 System hardware consists of three basic subsystems per rack: Loop Processor Subsystem, Tester Subsystem, and Input/Output Subsystem.

2-1

1. Loop Processor Subsystem

The Loop Processor Subsystem receives a subset of the process signals, performs one or more protection algorithms, and drives the required isolated outputs.

2. Tester Subsystem

The Tester Subsystem serves as the focal point of human interaction with the protection rack. It provides a user-friendly interface that permits test personnel to configure (adjust setpoints and tuning constants), test, and maintain the system.

3. Input/Output (I/O) Subsystem

The microprocessor-based system interfaces with the field signals through various input/output (I/O) modules. These modules accommodate the plant signals and test inputs from the Tester Subsystem, which regularly monitors the integrity of the Loop Processor Subsystem.

In an Eagle 21 Process Protection Instrument Channel, field sensors are connected to cabinet mounted terminal blocks. The process electronics power the sensors and perform signal conditioning, calculation, and isolation operations on the input signals. However, each element of the process is not an individual electronic module or printed circuit board assembly. A multiple channel Analog Input module is used to power the field sensor(s) and perform signal conditioning. All calculations for the process channel functions are performed by a centralized Loop Calculation Processor (LCP). Typical functions performed by the Loop Calculation Processor are as follows: summation, lead/lag, multiplication, comparator, averaging, and square root conversion. Trip logic is provided through multiple channel Partial Trip Output modules. Multiple channel isolated analog outputs are provided by Analog Output modules.
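The loop calculation functions named above are conventional signal-processing primitives. As an illustrative sketch only (not the Westinghouse implementation; the time constants and sample period below are hypothetical, not Eagle 21 tuning values), a first-order lead/lag compensator can be discretized with the bilinear transform:

```python
def make_lead_lag(tau_lead, tau_lag, dt):
    """Discrete first-order lead/lag filter for H(s) = (1 + tau_lead*s)/(1 + tau_lag*s),
    obtained via the bilinear (Tustin) transform. All parameter values used
    with this sketch are hypothetical illustrations, not Eagle 21 constants."""
    a = 2.0 * tau_lag / dt    # denominator coefficient
    b = 2.0 * tau_lead / dt   # numerator coefficient
    state = {"x": None, "y": 0.0}

    def step(x):
        if state["x"] is None:            # first sample: initialize at steady state
            state["x"], state["y"] = x, x
            return x
        # Difference equation from the bilinear transform
        y = ((1.0 + b) * x + (1.0 - b) * state["x"]
             - (1.0 - a) * state["y"]) / (1.0 + a)
        state["x"], state["y"] = x, y
        return y

    return step

# Unity DC gain: a steady input passes through unchanged
f = make_lead_lag(tau_lead=2.0, tau_lag=8.0, dt=0.1)
outputs = [f(5.0) for _ in range(50)]
```

A comparator or square-root conversion would be sketched the same way; the point is only that each channel function reduces to a small, individually verifiable calculation.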

In addition, all Eagle 21 process protection channels are configured to perform automatic surveillance testing via a centralized Test Sequence Processor (TSP).

2-2

Protection channels processed with the Eagle 21 process protection system are as follows:

A. Average Temperature and Delta Temperature
B. Pressurizer Pressure
C. Pressurizer Water Level
D. Steam Flow and Feedwater Flow
E. Reactor Coolant Flow
F. Turbine Impulse Chamber Pressure
G. Steam Pressure
H. Containment Pressure
I. Reactor Coolant Wide Range Temperatures
J. Reactor Coolant Wide Range Pressure
K. Refueling Water Storage Tank Level
L. Containment Sump Level
M. Steam Generator Narrow Range and Wide Range Water Level

The Eagle 21 equipment has been designed to fit into existing process racks and to interface with other plant systems in a manner identical to the existing analog equipment.

This design maintains the existing field terminals to avoid new cable pulls or splices within the rack. The components for each rack are built into subassemblies which are easily installed into the existing racks. All internal rack cabling is prefabricated. The subassemblies are tested in a factory mock-up to verify proper fit and operation. Detailed installation procedures and drawings are provided with each system.

2-3

3.0 VERIFICATION AND VALIDATION PROCESS PHILOSOPHY

3.1 Verification Philosophy

With the application of programmable digital computer systems in the safety systems of nuclear power generating stations, designers are obligated to conduct independent reviews of the software associated with the computer system in order to ensure the functionality of the software to a level commensurate with that described in the system requirements.

Figure 3-1 illustrates the integration of the system verification and validation with the system design process. The verification process was divided into two distinct phases: verification of design documentation and verification of software.

Figure 3-1 illustrates where an independent review and sign-off of design documentation was performed.

After completed software was turned over to the verifier by the design team, an independent review and/or test of each software unit was performed to verify the software unit met the applicable Software Design Specification.

As part of the software unit review, the unit was linked with other interfacing software units where appropriate.

Structural testing was performed on the software units.

This structural testing comprehensively exercised the software program code and its component logic structures.

This process required the verifier to inspect the code against its associated documentation and understand how it functions before selecting the test inputs and predicting the test outputs consistent with the code documentation. The test inputs were chosen to exercise all executable lines of code within the software entity.
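The statement-coverage discipline described above can be illustrated with a small tracing harness. This is purely a sketch: `executed_lines` and the `partial_trip` unit are hypothetical, not the actual Eagle 21 verification tooling.

```python
import sys

def executed_lines(func, *args):
    """Run func(*args) and return the set of executed line offsets within func."""
    hit = set()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            hit.add(frame.f_lineno - func.__code__.co_firstlineno)
        return tracer
    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return hit

def partial_trip(pressure, setpoint):
    # Hypothetical unit under verification: one comparator channel
    if pressure > setpoint:
        return "TRIP"
    return "NORMAL"

# A single test input exercises only one branch; the verifier keeps adding
# inputs until the union of executed lines covers every executable line.
trip_path = executed_lines(partial_trip, 2400.0, 2385.0)
normal_path = executed_lines(partial_trip, 2200.0, 2385.0)
covered = trip_path | normal_path
```

Either input alone leaves one return path unexecuted; only the union of both runs reaches every executable line, which is exactly the completion criterion the text describes.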

3.2 Validation Philosophy

Whereas the system verification process was performed to verify the software entities, the system validation process was performed to demonstrate the system functionality. The system validation testing results demonstrated that the system design completely satisfied the system functional requirements. Hence, any inconsistencies that may have occurred during the system development in this area that were not discovered during the software verification activities were identified through the validation process.

3-1

During the verification process each software entity within the system was thoroughly and individually reviewed and/or tested. Validation complements the verification process by ensuring that the system met its functional requirements by conducting testing from a total systems perspective.

The major phases of the validation process included the following:

A. Functional Requirements/Abnormal-Mode Testing Phase

B. Prudency Review and/or Testing of the Design and Implementation Phase

C. Specific Man-Machine Interface (MMI) Testing Phase

The functional requirements/abnormal-mode testing process treated the system as a black box, while prudency review and/or testing required that the internal structure of the integrated software/hardware system be understood and analyzed in detail. This dual approach to the validation process provided a level of thoroughness and testing accuracy which ensured the functionality of the system commensurate with that described in the system requirements.

The Validation Plan defines the methodology utilized to perform a series of reviews and tests which complement the verification process.

Four independent types of reviews and/or tests were conducted to ensure overall system integrity:

1. Functional requirements testing -- ensured that the final system satisfied the functional requirements. A comprehensive functional decomposition was prepared for the system functional requirements and used as a basis for the validation test procedures.

3-2

2. Abnormal-mode testing -- ensured that the design operated properly under abnormal-mode conditions.

3. System Prudency Review/Testing -- ensured that good design practice was utilized in the design and implementation of critical design areas of the system. These tests required that the internals of the system design and implementation be analyzed in detail.

4. Specific Man-Machine Interface testing -- ensured that the operator interface utilized to modify the system data base performed properly under normal-mode and abnormal-mode data entry sequences. This is an important area due to the impact on that portion of the system level information which can be modified via this interface.
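As a sketch of the abnormal-mode idea only (the channel logic and limits below are hypothetical illustrations, not Eagle 21 behavior), such a test drives a channel with out-of-range inputs and confirms it declares a fault rather than returning a plausible-looking value:

```python
def channel_status(raw_ma):
    """Hypothetical 4-20 mA input channel check: out-of-range signals
    (e.g. open circuit or overrange drive) are flagged, not clipped."""
    if raw_ma < 3.6 or raw_ma > 20.4:   # illustrative fault limits
        return "CHANNEL_FAULT"
    return "OK"

# Normal-mode case
assert channel_status(12.0) == "OK"
# Abnormal-mode cases: open circuit (~0 mA) and overrange drive
assert channel_status(0.0) == "CHANNEL_FAULT"
assert channel_status(21.5) == "CHANNEL_FAULT"
```

The functional-requirements cases exercise the in-range behavior; the abnormal-mode cases deliberately step outside it, which is the distinction items 1 and 2 above draw.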

3-3

Figure 3-1. Design, Verification, and Validation Process

[Flowchart; the scanned figure is largely illegible. Legible fragments include requirements and design specification blocks, software design, implementation, configuration control, integration, and validation testing steps; circled symbols denote independent verification reviews.]

3-4

4.0 SUMMARY OF VERIFICATION ACTIVITIES

The verification process was performed in accordance with the Eagle 21 Replacement Hardware Design, Verification and Validation Plan. All Eagle 21 system software was verified using a Level 1 (safety related) type of testing and review. The overall scope of the verification effort on the Eagle 21 System consisted of 1100 units of software.

Related software units were grouped together into software modules. Each software module consisted of a single source code file. When any anomaly was discovered during the source code review or during testing, a verification problem report was issued from the verification team to the design team for resolution. These problem reports consisted of three types, depending upon the scope of the discovered anomaly: 1) unit level problem reports, 2) module level problem reports, and 3) generic problem reports. The unit level problem reports addressed anomalies specific to a single unit of code. The module level problem reports addressed anomalies covering entire modules (typically due to formatting standards concerns). Generic problem reports covered issues which spanned multiple modules (again typically due to formatting standards concerns). A total of 658 problem reports were generated, consisting of 454 unit level problem reports, 136 module level problem reports, and 68 generic problem reports.

All verification problem reports were satisfactorily resolved.

The verification problem reports were assigned error codes as the reports were generated. Working from a list of 10 possible error codes, error types were assigned to problem reports. A problem report may contain more than one error type.

A significant portion of the total unit problem reports, 83%, was made up of 4 error types:

A. Design Requirements Implemented Incorrectly (Type B) -- 6.8%

B. Logic Anomaly (Type E) -- 18.5%

C. Data Handling Anomaly (Type G) -- 8.1%

D. Header/Comment Anomaly (Type J) -- 49.3%

4-1

A categorized breakdown of all software verification problem reports is provided in Figure 4-1.

Based upon Westinghouse and industry experience, these were to be expected as the dominant error types.
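The quoted tallies are internally consistent; a quick arithmetic check (all values taken directly from the text above):

```python
# Report-type totals from Section 4.0
unit, module, generic = 454, 136, 68
total = unit + module + generic          # 658, as stated in the text

# Dominant error types as percentages of unit-level reports
dominant = {"B": 6.8, "E": 18.5, "G": 8.1, "J": 49.3}
dominant_share = sum(dominant.values())  # 82.7, the "83%" quoted above
```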

4-2

Figure 4-1. Sequoyah E21 Verification Problem Reports (total of 658 reports)

[Pie chart; legible segment labels: Generic (10.3%), Other (11.9%), Module (20.7%), Type J (34.0%), Type B (4.7%), Type G (5.6%).]

4-3

5.0 SUMMARY OF VALIDATION ACTIVITIES

The validation process was performed in accordance with the Eagle 21 Replacement Hardware Design, Verification and Validation Plan, by a team of individuals independent from the design team. The overall scope of the validation effort on the Eagle 21 System consisted of performing 21 comprehensive tests and 47 hardware/software reviews.

When any validation test result failed the applicable acceptance criteria, a problem report was issued from the validation team to the design group for resolution. A total of 13 validation problem reports were generated. All validation problem reports were satisfactorily resolved.

It should be noted that none of the errors precipitating a validation problem report would have been found during the verification process. All problem reports were in areas specific to validation.

The number of problem reports generated by phase were:

Functional Requirements/Abnormal-Mode    12
Prudency Phase                            1
MMI Phase                                 0

The validation and design teams identified four avenues for resolving the problem reports: software changes, test setup changes, external influence correction, and validation test procedure changes. The number of validation problem reports resolved by each avenue were (Figure 5-1):

Software Changes                 1
Test Setup Changes               5
Test Procedure Correction        6
External Influence Correction    1
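The resolution counts above reproduce the percentage labels shown in Figure 5-1 (13 reports in all); as a check:

```python
# Resolution counts from the list above
resolutions = {
    "Software Changes": 1,
    "Test Setup Changes": 5,
    "Test Procedure Correction": 6,
    "External Influence Correction": 1,
}
total = sum(resolutions.values())   # 13 validation problem reports
shares = {k: round(100.0 * v / total, 1) for k, v in resolutions.items()}
# Test Setup Changes -> 38.5, Test Procedure Correction -> 46.2,
# Software Changes and External Influence Correction -> 7.7 each
```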

5-1

Figure 5-1. Sequoyah E21 Validation Problem Reports (total of 13 reports)

[Pie chart; legible segment labels: Test Setup (38.5%), Procedure (46.2%), Software (7.7%), External Influence (7.7%).]

5-2
