ML20214S628

Verification & Validation: Final Report for Plant Safety Monitoring System
Person / Time
Site: Vogtle (Southern Nuclear)
Issue date: 06/30/1987
From: SOUTHERN COMPANY SERVICES, INC.
To:
Shared Package: ML20214S590
References: NUDOCS 8706090348
Download: ML20214S628 (23)


Text


VERIFICATION AND VALIDATION
FINAL REPORT
FOR THE
PLANT VOGTLE PLANT SAFETY MONITORING SYSTEM

JUNE 1987

PLANT VOGTLE PSMS VERIFICATION AND VALIDATION PROCESS
FINAL REPORT

TABLE OF CONTENTS

I. SUMMARY
II. PSMS FUNCTION OVERVIEW
III. VERIFICATION AND VALIDATION PROCESS PHILOSOPHY
IV. SUMMARY OF VERIFICATION ACTIVITIES
V. SUMMARY OF VALIDATION ACTIVITIES
VI. REFERENCES

I. SUMMARY

The Georgia Power Company has installed a microprocessor-based system, the Plant Safety Monitoring System (PSMS), which performs post-accident monitoring display at the Plant Vogtle project.

A comprehensive verification and validation (V&V) program was conducted on the PSMS to ensure the functionality of the system to a level commensurate with that described in the system requirements. A brief description of the V&V program is given in Section III.

The software verification program on the PSMS was completed in May 1987 with the total number of units involved being 1018. A comparison was made between the software utilized in the Plant Vogtle PSMS and the South Texas Project QDPS to determine the commonality of software units. The study revealed that 653 units (64 percent) of the total software in the PSMS were already tested in the QDPS V&V program, which is described in detail in Reference A.

The 365 units that were specific to the Plant Vogtle PSMS were due to the following plant features: plasma display formats; the Reactor Vessel Water Level Indication System (RVLIS); and the Neutron Flux Monitoring System (NFMS).

A total of 64 verification trouble reports were issued on the Plant Vogtle-specific software. All verification trouble reports have been resolved at this time. The verification team identified 42 possible error types that could result in the generation of a trouble report. A total of 80 errors of various types were reported in the 64 trouble reports issued. Of the 42 possible error types, only four error types contain a significant portion of the total errors.

The scope of the validation effort on the Plant Vogtle PSMS consisted of conducting approximately 735 tests. When any validation test failed the applicable acceptance criteria, a validation trouble report was issued from the validation team to the design group for resolution. A total of 14 trouble reports were issued. A total of 30 errors of various types were reported in the 14 trouble reports. It should be noted that none of the errors precipitating a validation trouble report should have been found during the verification process. All trouble reports were in areas specific to validation.

Eight (8) of the errors were generated during the prudency testing phase and 22 were generated during the functional requirement testing phase. The validation and design team identified five mechanisms for resolving the trouble reports: software changes; hardware changes; functional requirement changes; validation test procedure/decomposition changes; and no problem identified. The majority of the trouble reports were resolved via software changes (43%) and functional requirement changes (40%).

One of the validation trouble reports remains open. The issue addressed in the open trouble report concerned Human Engineering Deficiencies (HEDs) on the plasma display pages. The severity of the identified HEDs was judged to be minor. As such, no safety concern is introduced via system operation with the current plasma display pages.



II. PSMS Function Overview

The Plant Safety Monitoring System (PSMS) is a microprocessor-based system which performs post-accident monitoring display at the Plant Vogtle project.


The PSMS performs the following functions:

1. Data acquisition, processing and qualified (Class 1E) display:

1.1 Implements qualified monitoring channels to comply with post-accident monitoring Category 1 equipment design and qualification criteria.

1.2 Provides safety-grade signal processing for inadequate core cooling (ICC) instrumentation as defined in NUREG-0737, Item II.F.2. This includes signal processing for reactor vessel water level, core exit thermocouple temperature and RCS subcooling.

1.3 Isolates Class 1E and associated signals for input to non-Class 1E equipment including the plant Safety Parameter Display System (SPDS).

1.4 Provides consolidated, unambiguous, human-factored displays of appropriate variables.


The PSMS consists of the following hardware: four Class 1E remote processing cabinets; two Class 1E database processing units; two Class 1E plasma display units; and one non-Class 1E remote processing unit.

1. Remote Processing Cabinets (RPUs)

Each redundant, channelized RPU contains signal conditioning/buffering equipment and associated DC power supplies for field inputs. Data is output to the Database Processing Unit and to non-Class 1E equipment via datalinks and individual analog signals as required.

J"E - 2. Database Processing Units (DPUs)

Each DPU contains signal processing equipment, signal isolation/buffering equipment and a DC power supply. The DPUs receive data inputs from each of the RPUs and transmit data outputs to the Class 1E plasma display units, analog outputs to analog indicators, and contact outputs to provide qualified status information.
3. Plasma Display Units (PDUs)

Each plasma display unit contains microprocessor equipment and a DC power supply necessary to receive data from each DPU and generate graphic and alphanumeric display pages. A function keyboard attached to each display unit allows operator selection of specific display pages.


4. Non-Class 1E Remote Processing Unit (RPU N)

The single non-Class 1E RPU N, providing data acquisition for certain non-Class 1E signals, is used for logical completion of graphic displays.

III. Verification and Validation Process Philosophy

III.1 Verification Philosophy

With the application of programmable digital computer systems in safety systems of nuclear power generating stations, designers are obligated to conduct independent reviews of the software associated with the computer system to ensure the functionality of the software to a level commensurate with that described in the system requirements.

Figure 1 illustrates the integration of the system verification and validation process with the system design process. During the implementation stage, when the writing, testing, assembly and documenting associated with each software entity is completed by the design team, the software entity is officially turned over to the verification team. At this point, the verification team performs an independent review and/or test of the software entities to verify that the functionality of the software entities meets the applicable Software Design Specifications.


After the verification team is satisfied that all requirements are met, the software is configured for use in the final system and subsequent system validation process.

Figure 2 illustrates the philosophy utilized in conducting the verification process. The verification process begins at the unit software level, i.e., the simplest building block in the software. After all software units that are utilized in a software module are verified, the verification team proceeds to verify that module. Not only is the software module verified to meet the module Software Design Specification, but the verification team ensures that the appropriate units are utilized in generating the software module.

After all software modules necessary to accomplish a software subprogram are verified to meet the applicable Software Design Specifications, the verification team proceeds to verify that subprogram. As in the case of the software module, the verification team not only verifies that the subprogram meets the applicable Software Design Specifications, but the team verifies that the appropriate software modules were utilized in generating the subprogram entity. The verification philosophy ensures that the verification team tests and/or reviews the interface between the software unit, module and subprogram entities.
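A minimal sketch of this bottom-up ordering is given below. The entity names are hypothetical (the actual PSMS software structure is not reproduced here); the point is only that each module is verified after all of its units, and each subprogram after all of its modules.

```python
# Illustrative sketch only: hypothetical software entities showing the
# bottom-up verification order described above. Each higher-level entity
# is verified only after every lower-level entity it uses has been verified.

# Hypothetical structure: subprograms are built from modules, modules from units.
subprograms = {"display_subprogram": ["format_module", "scaling_module"]}
modules = {
    "format_module":  ["build_page_unit", "encode_char_unit"],
    "scaling_module": ["convert_units_unit", "range_check_unit"],
}

verified = set()

def verify(entity):
    """Stand-in for the independent review and/or test of one entity."""
    print(f"verifying {entity}")
    verified.add(entity)

# 1. Verify every unit against its Software Design Specification.
for units in modules.values():
    for unit in units:
        if unit not in verified:
            verify(unit)

# 2. Verify each module, confirming all of its units were already verified.
for module, units in modules.items():
    assert all(u in verified for u in units), f"unverified units in {module}"
    verify(module)

# 3. Verify each subprogram, confirming all of its modules were already verified.
for subprogram, mods in subprograms.items():
    assert all(m in verified for m in mods), f"unverified modules in {subprogram}"
    verify(subprogram)
```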

Two levels of verification software testing were utilized as defined in the PSMS verification and validation plan: structural testing and functional testing. Structural testing, which attempts to comprehensively exercise the software program code and its component logic structures, is usually applied at the unit level. The functionality of the program is verified along with the internal structure utilized within the program to implement the required function. The expectation is that most of the errors will be discovered and corrected at this level, where the cost of doing so is minimal.

Structural testing requires that the verifier inspect the code and understand how it functions before selecting the test inputs. The test inputs are chosen to exercise all the possible paths within the software entity.

In the functional approach to program testing, the internal structure of the program is ignored during the test data selection. Tests are constructed from the functional properties of the program which are specified in the Design Specification.
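As an illustration only (the PSMS software and its Design Specifications are not reproduced here; the routine and tests below are hypothetical), the distinction between the two testing levels can be sketched as follows: structural tests are chosen after inspecting the code so that every path is exercised, while functional tests are derived solely from the specified behavior.

```python
# Illustrative sketch only: a hypothetical software unit with a simple branch,
# plus structural and functional test cases.

def clamp_to_range(value, low, high):
    """Hypothetical unit: limit a sensor reading to a specified range."""
    if value < low:          # path 1: below range
        return low
    if value > high:         # path 2: above range
        return high
    return value             # path 3: within range


# Structural testing: the verifier inspects the code first and picks inputs
# that exercise every path through the unit (all three return statements).
def test_structural_paths():
    assert clamp_to_range(-5, 0, 100) == 0     # exercises the "below range" path
    assert clamp_to_range(150, 0, 100) == 100  # exercises the "above range" path
    assert clamp_to_range(42, 0, 100) == 42    # exercises the "within range" path


# Functional testing: the internal structure is ignored; inputs are derived
# only from the functional property stated in the (hypothetical) specification:
# "the output shall always lie between low and high, inclusive."
def test_functional_property():
    for value in (-1000, -5, 0, 42, 100, 1000):
        result = clamp_to_range(value, 0, 100)
        assert 0 <= result <= 100


if __name__ == "__main__":
    test_structural_paths()
    test_functional_property()
    print("all illustrative tests passed")
```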

III.2 Validation Philosophy

Whereas the system verification process verifies the functionality of the software entities beginning from the smallest software entity and progressing to the program level, the system validation process is performed to demonstrate the system functionality. By conducting the system validation test, the testing results demonstrate that the system design meets the system functional requirements. Hence, any inconsistencies that occurred during the system development in this area that were not discovered during the software verification activities would be identified through the validation process.

During the software verification process, a bottom-up microscopic approach is utilized to thoroughly and individually review and/or test each software entity within the system. This requires a significant effort and verifies that each software element performs properly as a stand-alone entity.

Validation complements the verification process by ensuring that the system meets its functional requirements by conducting top-down testing, first from the subsystem level and then from a total system perspective. This is illustrated in Figure 2.

The major phases of the validation process include the following:

a. Top-down functional requirements testing
b. Prudency review of the design and implementation
c. Specific Man-Machine Interface (MMI) testing

The macroscopic top-down functional requirements phase of validation testing treats the system as a black box, while the prudency review phase requires that the internal structure of the integrated software/hardware system be analyzed in great detail.

Due to the dual approach, validation testing provides a level of thoroughness and testing accuracy which ensures detection of any deficiencies that occurred during the design process but were not discovered during verification. Validation is performed on verified software residing within the final target hardware, as shown in Figure 1.

The Validation Plan defines a methodology that must be followed to perform a series of top-down functional requirement reviews and tests which complement the bottom-up approach utilized during the verification testing phase.

Four independent types of reviews and/or tests are to be conducted to ensure over-all system integrity:

1. Functional requirements testing - ensures that the final system meets the functional requirements. A comprehensive functional requirement decomposition was conducted on all system functional requirements from which the validation test requirements originated.
2. Abnormal-mode testing - ensures that the design operates properly under abnormal-mode conditions.
3. System Prudency Review / Testing - ensures that good design practice was utilized in the design and implementation of critical design areas of the system. These tests require that the internals of the system design and implementation be analyzed in detail.



4. Specific Man-Machine Interface testing - ensures that the operator interface utilized to modify the system's database performs properly under normal-mode and abnormal-mode data entry sequences. This is a critical area requiring special attention due to the impact on the software of the system-level information which can be modified via this interface.


IV. Summary of Verification Activities

The overall scope of the verification effort on the Plant Vogtle PSMS consisted of investigating 1018 units of software. Of the 1018 units, 653 units were equivalent to those tested as part of the verification program conducted on the South Texas Project QDPS. The remaining 365 units were specific to the Plant Vogtle PSMS due to the following plant features: plasma display formats; the Reactor Vessel Water Level Indication System (RVLIS); and the Neutron Flux Monitoring System (NFMS).

When any software unit failed the verification activity, a trouble report was issued from the verification team to the design group for resolution. A total of 64 trouble reports were issued on the Plant Vogtle-specific software for resolution. Of the 64 trouble reports, all have been resolved at this time.


In addition to the issuance of trouble reports, clarification reports were issued when a verifier found typographical or other documentation errors of a minor nature, or when something noteworthy had occurred during testing that the verifier felt beneficial to bring to the attention of the design engineer but was not significant enough to fail the unit. A total of 92 clarification reports were generated. All clarification reports have been resolved at this time.

As the verification team documented trouble reports, the type of error responsible for the generation of the trouble report was identified. Such coding permitted error types to be analyzed with respect to frequency of occurrence. This permitted the verification team to anticipate areas that resulted in frequent trouble and attempt to implement actions to resolve the problem areas.

The verification team identified 42 possible error types that could result in the generation of a trouble report. A total of 80 errors of various types were reported in the 64 trouble reports issued, i.e., a trouble report may have contained more than one error type.

Of the 42 possible error types, only four error types contain a significant portion of the total errors. Figure 3 illustrates the error types expressed as a percentage of the total number of errors identified. Analysis of Figure 3 reveals that the four predominant error types account for 81 percent of the total number of errors reported.
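A minimal sketch of this kind of frequency analysis is shown below. The trouble-report records and error-type codes are hypothetical; the sketch only illustrates how a report may carry more than one coded error type and how the dominant types are identified.

```python
# Illustrative sketch only: hypothetical trouble-report data showing the
# error-type frequency analysis described above.
from collections import Counter

# Each trouble report lists the error-type codes identified in it
# (hypothetical codes; the actual 42-type coding scheme is not reproduced here).
trouble_reports = [
    {"id": "VTR-001", "error_types": [8, 40]},
    {"id": "VTR-002", "error_types": [40]},
    {"id": "VTR-003", "error_types": [8, 33, 41]},
    {"id": "VTR-004", "error_types": [13]},
]

# Tally how many times each error type appears across all reports.
counts = Counter(code for report in trouble_reports
                      for code in report["error_types"])
total_errors = sum(counts.values())

# Express each error type as a percentage of all reported errors,
# most frequent first, mirroring the presentation used in Figure 3.
for code, n in counts.most_common():
    print(f"Error type {code}: {n} ({100.0 * n / total_errors:.1f}%)")
```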


V. Summary of Validation Activities

The scope of the validation effort on the Plant Vogtle PSMS consisted of conducting approximately 735 tests. When any validation test result failed the applicable acceptance criteria, a trouble report was issued from the validation team to the design group for resolution. A total of 14 trouble reports were issued during the validation process. It should be noted that none of the errors precipitating a validation trouble report would have been found during the verification process. All trouble reports were in areas specific to validation.


As the validation team documented trouble reports, the type of error responsible for the generation of the trouble report was identified. The validation team utilized the same 42 possible error types that were identified in the verification process. A total of 30 errors of various types were reported in the 14 validation trouble reports issued. One predominant error type accounts for 50 percent of the total number of errors. Figure 4 illustrates the error types expressed as a percentage of the total number of errors identified.

No trouble reports were generated in the Man-Machine Interface phase of validation testing. Eight (8) errors were generated in the prudency testing phase. Finally, 22 errors were generated in the functional requirement testing phase. A total of 16 validation clarification reports were issued.

As the design team evaluated the trouble reports, the method of resolution utilized in resolving each of the trouble reports was documented. The validation and design team identified five mechanisms for resolving the reports: software changes; hardware changes; functional requirement changes; validation test procedure/decomposition changes; and no problem identified.

Figure 5 illustrates the percentage of the trouble reports that were categorized into each of the above five areas. As seen from the figure, the majority of the trouble reports were resolved via software changes (43%) and functional requirement changes (40%).

All validation clarification reports have been closed. One of the validation trouble reports remains open at this time. The issue addressed in the open trouble report concerned Human Engineering Deficiencies (HEDs) on the plasma display pages. The specific deficiencies identified include the following:

a) The word HEAD is spelled out instead of using the acronym HD on DETAIL DATA page 1.

b) The words UPPER and LOWER are spelled out instead of using the acronyms UPPR and LOWR on the NUCLEAR POWER display and on page 4 of PRIMARY TRENDS.

c) The label for NUCLEAR POWER UPPR RNG is % PWR on the NUCLEAR POWER display and only % on other display pages.

d) Wide range containment water level is denoted as CNMT XIND RNG WTR LVL instead of CNMT WIDE RNG WTR LVL.


The severity of the identified HEDs was judged to be minor and as such, no safety concern is introduced via system operation with the current plasma displays.


VI. REFERENCES

A. M. R. Wisenburg to U.S. Nuclear Regulatory Commission, "Submittal of Supplement 1 to the QDPS Verification and Validation Program Final Report," ST-HL-AE-1988, March 19, 1987.


[Figure 1 - QDPS Design Verification and Validation Process: flowchart showing the integration of the verification and validation process with the system design process (graphic not reproduced in this transcription)]

[Figure 2 - QDPS Verification & Validation Philosophy: bottom-up verification and top-down validation levels (graphic not reproduced)]

[Figure 3 - Verification Error Type Summary: error types as a percentage of total errors; 64 trouble reports, 80 total error types (graphic not reproduced)]

[Figure 4 - Validation Error Type Summary: error types as a percentage of total errors; 14 trouble reports, 30 total error types (graphic not reproduced)]

[Figure 5 - Validation Resolution Summary: resolution categories including software changes (43%), functional requirement changes (40%), no problem identified, and open report; 14 trouble reports, 30 total error types (graphic not reproduced)]