ML20207H965

SPDS Software Verification & Validation Summary Package, Catawba Nuclear Station
Person / Time
Site: Catawba (Duke Energy)
Issue date: 05/13/1986
From: Fairweather D
DUKE POWER CO.
To:
Shared Package: ML20207H868
List:
References
NUDOCS 8607250148
Download: ML20207H965 (11)


Text


Attachment 7

SPDS SOFTWARE VERIFICATION AND VALIDATION SUMMARY PACKAGE
CATAWBA NUCLEAR STATION

Section I: SPDS Software Verification and Validation Test Program
Section II: Verification of SPDS Software
Section III: Validation of SPDS Software
Section IV: Listing of Applicable Correspondence

Douglas E. Fairweather
Process Computer Systems Engineering
Design Engineering Department

May 13, 1986

I. SPDS Software Verification and Validation Test Program

Introduction

Purpose

One of the requirements for the successful implementation of a Safety Parameter Display System (SPDS) is an independent verification and validation (V&V) of the system software.

The purpose of this summary document is to describe the Software V&V Test Program implemented by the Design Engineering Department to provide appropriate design review and performance testing to assure that the SPDS performs its intended function.

Scope

The scope of the independent Software V&V Test Program includes software design review and performance testing on the Safety Parameter Display Systems being implemented at the Catawba Nuclear Station. Section I of this report describes the elements of the generic test program implemented to perform the software verification and validation tasks. Subsequent sections of this report detail the V&V testing, test results, and corrective actions which occurred for each of the software V&V activities.

Definitions

The following definitions apply in this report:

Verification - The detailed review of software documentation generated during the design process to ensure that the SPDS design is a correct implementation of the system requirements.

Validation - The test and evaluation of the completed software to ensure the SPDS meets all the system requirements and functions as designed.


Verification and Validation Process Overview

The approach taken by Duke Power Company in designing and implementing the Safety Parameter Display System is consistent with the long-standing practice of utilizing in-house capabilities. The SPDS was designed, reviewed, implemented, and tested on the Operator Aid Computer (OAC) by the Duke Power Nuclear Production and Production Support Departments. A Human Factors Review and a Task Analysis were performed on the SPDS by the Control Room Review Team to validate the SPDS as part of the total operating system.

The Design Engineering Department was selected to perform an independent review of the SPDS software.

A review program was developed considering the guidance contained in NSAC/39 and was designed to provide an independent review of the SPDS software design and implementation on the Operator Aid Computer. Verification and validation activities included in the program are described as follows:

Software Design Review Activity

The objective of this activity was to verify that the SPDS software design is a correct implementation of the system functional requirements. The Software Design Review activity consisted of a detailed review of software documentation generated during the design process. This included:

1. Logic diagrams, segment definition tables, and parameter specification tables derived from the Westinghouse Critical Safety Function Status Trees.
2. Safety Parameter Display (SPD) Program Description.
3. Assembly Language (PAL) Code Listing of SPDS Program implemented on the OAC.




This V&V activity examined the design of the SPDS software in terms of its logical integrity, ability to satisfy the performance requirements, data manipulations and timing requirements. The review addressed the following items:

o Software architecture
o Input/output interfaces - correct data base point ID's
o System and execution control
o Operating sequences - initialization, error detection, etc.
o Testability - use of data tapes, simulations, static test cases, etc.
o Timing analysis - scan rates, run frequency, response time, etc.
o Algorithm - design and data verification
o Information flow - communication between the data acquisition subsystem and the SPD for required inputs/outputs (analog, digital, and pseudo-digital points)
o Special calculations

The major method used in performing the software design review was direct evaluation and analysis of the design documentation (desk checking). One of the most important objectives of the review was the independent assessment of the ability of the design to meet performance requirements. Such capabilities as time response, availability, man/machine interface, data validation, operating environment, dynamic range, testability, etc., were analyzed as part of the design documentation review. The V&V team reviewed this information for correctness, feasibility, and consistency.

Another result of the software design review was the identification of design-specific test information. This information was required for the design of the test bed, test environment, and test procedures used during the Software Validation Test activity.




The results of the Software Design Review activity were documented, with any deficiencies identified for resolution.

Upon resolution of deficiencies, additional followup evaluation and analysis of revised design documentation was conducted to ensure full resolution of all design deficiencies.

Software Validation Test Activity

The objective of this activity was to validate that the completed SPDS software met all the system requirements and functioned as designed. There were two distinct subactivities:

1) Software Validation Test Plan Development, and
2) Software Validation Test Execution and Results Analysis.

The foundation for these subactivities lay in the information derived from the Software Design Review activity.

Software Validation Test Plan

The most important item in performing system software validation was the development of the Software Validation Test Plan. The plan provided an organized test procedure to demonstrate that the completed system met all the system requirements.

Items covered in the Software Test Plan included:

o Test Requirements
o Test Philosophy
o Test Environment
o Test Specifications
o Test Procedure
o Test Evaluation Approach



The V&V team prepared a formal Software Validation Test Plan prior to validation testing.

Software Validation Test Execution and Results Analysis

This activity included software validation testing, recording of test results, and the analysis of results for acceptability.

Records were kept during each validation test to ensure that test inputs were correctly set up and that the resulting SPDS response was observed and recorded. Particular attention was paid to the analysis of acceptability of results during testing. The results of the Software Validation Test activity were documented with any deficiencies identified for resolution. Upon resolution of deficiencies, additional followup testing and analysis of acceptability of results was conducted to ensure full resolution of all software deficiencies. Complete test results, analysis, and nonconformances to acceptability criteria were documented.

II. Verification of SPDS Software (Design Review)

The objective of the SPDS software verification phase, as previously outlined in Section I, SPDS Software Verification and Validation Test Program, was to verify that the SPDS software design is a correct implementation of the system's functional requirements on the Operator Aid Computer (OAC).


The first step in verifying the SPDS software was to gather the documentation requiring review. The Process Computer Unit of the Production Support Department supplied the current SPD software Program Description and the SPD Program Listing.

The Program Description contained information on the color coding of the six (6) Critical Safety Functions (CSF's), a listing of the OAC point ID's used as inputs to develop the CSF output color blocks, a listing of the SPD output point ID's, and a description of the special calculations performed by the SPD. Also included in the Program Description were special timing requirements, the Alarm Video Display layout, and the SPD logic diagrams with their logic segment definition tables.

The SPD Program Listing contained the Assembly Language (PAL) coding of the SPD program as resident on the OAC.


Other support documentation required for the software verification was the Input/Output Summary of all OAC points and the current SPD System functional requirements derived from the Westinghouse Critical Safety Function Status Trees.

Once these documents were gathered and verified to be current, the task of comparing the SPD logic diagrams to the SPD functional requirements began.

The logic diagrams were checked to ensure that the logic selected was a faithful representation of the SPDS functional requirements, that all input and output descriptions were correct, and that all timing constraints of the functional requirements were incorporated. The verified software logic diagrams and the Input Parameter Specification Tables for each CSF were then reviewed in conjunction with the OAC I/O Summary to verify that the correct OAC input point descriptions and point ID numbers were used. The Input Parameter Specification Tables were companion documents to the CSF logic diagrams.
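Purely as an illustration of the kind of cross-check described above (the review itself was a manual desk check of design documents, not a program), reconciling the Input Parameter Specification Tables against the OAC I/O Summary amounts to matching point ID's and descriptions between two tables. The point ID's, descriptions, and field names in the following Python sketch are hypothetical:

    # Hypothetical data: the review reconciled the Input Parameter
    # Specification Tables against the OAC Input/Output Summary.
    io_summary = {
        "C2P0417": "PZR LEVEL LOOP 1",     # point ID -> description (invented)
        "C2P0552": "CNMT PRESSURE WR",
    }

    spec_table = [
        {"point_id": "C2P0417", "description": "PZR LEVEL LOOP 1"},
        {"point_id": "C2P0553", "description": "CNMT PRESSURE WR"},  # mistyped ID
    ]

    def cross_check(spec_rows, summary):
        """Flag spec-table entries whose point ID or description
        does not match the I/O summary."""
        problems = []
        for row in spec_rows:
            expected = summary.get(row["point_id"])
            if expected is None:
                problems.append(row["point_id"] + ": not in I/O summary")
            elif expected != row["description"]:
                problems.append(row["point_id"] + ": description mismatch")
        return problems

    for issue in cross_check(spec_table, io_summary):
        print(issue)    # prints "C2P0553: not in I/O summary"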

The verification of correct SPD software logic established the SPD logic diagram as the base document for the balance of the software review. This was an important step of the verification activity.

The Segment Definition Tables for each CSF logic diagram were next reviewed for correctness. These tables were English language translations of the logic diagram's input, intermediate, and output segments. Key aspects reviewed in these tables were proper segment numbering, input setpoints, conditional statements, and the logical combinations of individual segments. Special logic segments were also covered in the Segment Definition Table review. An example of a special logic segment is the introduction of a specific timing cycle requirement.
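As a purely hypothetical illustration (the actual Segment Definition Tables are not reproduced here), a segment definition entry can be viewed as a record tying a segment number to its inputs, setpoint, and logical combination, and the numbering check amounts to confirming that every referenced segment exists. All field names and values in this sketch are invented:

    # Hypothetical segment definition entries; the actual tables were
    # English-language translations of the CSF logic diagram segments.
    segments = {
        10: {"kind": "input",  "point": "C2P0417", "test": ">=", "setpoint": 17.0},
        11: {"kind": "input",  "point": "C2P0552", "test": "<",  "setpoint": 3.0},
        20: {"kind": "logic",  "gate": "AND", "inputs": [10, 11]},
        30: {"kind": "output", "inputs": [20], "color": "YELLOW"},
    }

    def check_segment_numbering(segs):
        """Confirm that every segment referenced as an input is itself defined,
        mirroring the manual check of segment numbering and logical combinations."""
        errors = []
        for num, seg in segs.items():
            for ref in seg.get("inputs", []):
                if ref not in segs:
                    errors.append("segment %d references undefined segment %d" % (num, ref))
        return errors

    print(check_segment_numbering(segments))    # [] when the numbering is consistent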

The next step in the software verification process was to compare the SPD Program Listing to the logic diagram base documents. The program listing was reviewed line by line to ensure the SPD logic was correctly implemented in OAC software. Overall software architecture was reviewed. Each input and output interface was checked for proper data base point ID's and status as previously determined from the logic diagram review. The data base bit matrix within the SPD software program was verified to ensure the data base reflected the established I/O summary and SPD system requirements.


Each initial logic packet entry was checked for internal point ID numbers, logic test types, status indicators, point ID types, and last initial packet indicators. The final logic packet entries were reviewed for correct number of packet words, number of test bits, bits to set if the test is true, the test type, and last final packet indicators. These packet reviews verified that the actual inputs, logic gates, interconnects, and outputs of each CSF logic diagram were correctly represented in OAC software.

The special logic tests portion of the SPD Program Listing was reviewed next to ensure all timing, algorithms, and status flag indicators were correctly incorporated into the SPD logic. All special calculation software was also checked. Finally, the software system operating sequences and execution control were analyzed in detail to ensure the completed SPD OAC software would fully perform its intended task as defined by SPDS performance requirements.

All documentation reviewed during the software verification phase was checked for correctness, feasibility and consistency.

This detailed software design review encompassed documents that were based on the McGuire Nuclear Station Safety Parameter Display System, which had been verified previously.

There were no errors or discrepancies noted during this review.

III. Validation of SPDS Software

The objective of the SPDS software validation phase was to certify that the completed SPDS software functioned as designed and met all the system requirements. This software validation process encompassed the SPDS software installed on the Operator Aid Computer (OAC) and consisted of two distinct parts - the development of a software validation test plan and the actual execution and results analysis of the validation test cases.

Software Validation Test Plan

In order to thoroughly validate the SPDS software, an organized and comprehensive test plan was developed. This test plan outlined a systematic approach to certify the six (6) Critical Safety Functions (CSF's) listed below:

CSF(02)...SUBCRITICALITY
CSF(03)...CORE COOLING
CSF(04)...HEAT SINK
CSF(05)...REACTOR COOLANT SYSTEM INTEGRITY
CSF(06)...CONTAINMENT INTEGRITY
CSF(07)...REACTOR COOLANT INVENTORY

The test plan included details of each test case to be executed. These details included the test case conditions under which the test would be executed, the test case execution sequence, the test case input point ID's and their status, and the expected CSF output condition.

As an aid to developing these test cases, an off-line simulation program was developed. This off-line simulation program provided a means to analyze the SPDS program output results. The SPDS program inputs and subsequent logic were emulated to produce output statuses for all possible combinations of SPDS parameter inputs for each of the six (6) CSF's. The off-line simulation program created a table of input conditions and subsequent CSF outputs which was used to develop a set of actual test cases. The off-line simulation program also helped rule out those test conditions which could not physically occur, such as a valve simultaneously indicating both "OPEN" and "CLOSED", or a tank level simultaneously indicating both "HI" and "LOW".

This off-line program was used to develop test cases where the number of independent input variables numbered eleven (11) or less. Test cases which involved twelve (12) or more independent input parameters generated an off-line simulation input/output table too cumbersome to allow test case development. For these instances, the individual CSF was reviewed on a case-by-case basis to determine logic paths which would verify the proper operation of the SPDS.

Where this approach was deemed necessary, the following three (3) guidelines applied:



1) Redundant inputs were identified and grouped together to yield a single T/F toggle.

2) A list of test cases was prepared which confirmed each branch of the SPDS software for that particular CSF.

3) Input parameters which feed multiple branches were reviewed to assess their effects on the CSF's output development.
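The following Python sketch is a minimal, purely illustrative rendering of the off-line enumeration approach described above; it is not the site's off-line simulation program. It generates every combination of a hypothetical CSF's independent true/false inputs, filters out physically impossible states, and records the expected output status for each remaining case. The input names, the example logic, and the impossibility rules are all assumptions made for the sketch:

    from itertools import product

    # Hypothetical independent true/false inputs for one CSF (the actual
    # program handled up to eleven independent variables per CSF).
    INPUTS = ["valve_open", "valve_closed", "level_hi", "level_low", "pump_running"]

    def physically_possible(case):
        """Rule out states that cannot occur in the plant, e.g. a valve
        indicating both OPEN and CLOSED, or a level both HI and LOW."""
        if case["valve_open"] and case["valve_closed"]:
            return False
        if case["level_hi"] and case["level_low"]:
            return False
        return True

    def emulated_csf_logic(case):
        """Stand-in for the emulated CSF logic; returns the expected
        output color block status for the given input combination."""
        if case["level_low"] and not case["pump_running"]:
            return "RED"
        if case["level_low"]:
            return "YELLOW"
        return "GREEN"

    # Build the input/output table from which actual test cases were selected.
    table = []
    for values in product([False, True], repeat=len(INPUTS)):
        case = dict(zip(INPUTS, values))
        if physically_possible(case):
            table.append((case, emulated_csf_logic(case)))

    print(len(table), "credible test conditions out of", 2 ** len(INPUTS), "raw combinations")

With twelve (12) or more independent inputs the raw table exceeds 4,096 rows, which is consistent with the report's fallback to case-by-case review of logic paths for those CSF's.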

Also included in the test plan were details on any special test setups which were required and procedures to follow if the CSF output results were not as expected when analyzed.

The following items were validated for each of the CSF's:

o SPDS inputs
o True status of each input point
o SPDS software logic
o Through-put of SPDS inputs to CSF outputs
o Correct alarm conditions
o Correct video display of each CSF

Software Validation Test Execution and Results Analysis

Once the test cases were fully developed, the actual input point ID's were inserted into the SPDS software residing on the OAC. This was accomplished via the OAC Point Lockout Program. This process yielded CSF outputs at the bottom of the Alarm Video screen in the form of six (6) rectangular blocks, one for each of the CSF's. The colors of these blocks were either red, orange, yellow, or green, depending on the condition of the safety function. The alarm conditions displayed on the video were then compared to the "expected" alarm conditions, confirming correct logic implementations, correct true status of selected input parameters, and finally, that the SPDS generated a true indication of the current status of the six (6) CSF's. A hardcopy output was also generated on the station line printer which served as an archival document.

Where CSF output results were not as expected, the Production Support Process Computer Unit was notified of the problem and validation testing was suspended until the area of concern had been resolved. Upon problem resolution, software validation then proceeded by retesting the problem area to ensure the software had been corrected. Upon completion of the validation test, a hardcopy of the CSF parameter inputs and outputs was obtained from the printer for documentation and validation history. The complexity of the SPDS required that a software testing procedure be developed and conducted which validated the correct software implementation while minimizing the number of test cases performed. SPDS software validation testing included the following:

o Execution of the SPDS program modules
o Display and documentation of the SPDS transitions
o Operations in accordance with design specifications
o Human factors principles

As noted in Part II of this summary, the SPDS software was based on the McGuire Nuclear Station Safety Parameter Display System, which had been validated previously. There were no software discrepancies noted during this validation phase. Validation test results for the Catawba Nuclear Station were documented in the Duke Power Company filing system under file CN-1345.01. The validation tests for both Catawba units were conducted on the Catawba Operator Aid Computer by C.B. McFadden and R.D. Krenzer on April 12th and 13th, 1984. The Catawba Nuclear Station's version of the Safety Parameter Display System program software was certified for use on April 13th, 1984.

IV. Listing of Applicable Correspondence

Date        To               From           Subject
03-06-84    T C McMeekin     R C Collins    Independent Verification
            H L Davenport
04-18-84    R C Collins      C B McFadden   Static Validation Results

REFERENCES

1. " Verification and Validation for Safety Farameter Display System", Nuclear Safety Analysis Center, NSAC/39, December 1981.
2. " Duke Power Company Response to Supplement 1 To NUREG-0737, Emergency Response Capability For Catawba Nuclear Station", Duke Power Company, April, 1983.

