ML20087A964


Procedure 503-8500000-51, Verification & Validation Plan
ML20087A964
Person / Time
Site: Cooper 
Issue date: 03/01/1984
From: Thomas N
SCIENCE APPLICATIONS INTERNATIONAL CORP. (FORMERLY SCIENCE APPLICATIONS, INC.)
To:
Shared Package
ML20087A940 List:
References
503-8500000-51, SAI-84-1525-264, TAC-51232, NUDOCS 8403080273
Download: ML20087A964 (67)



AGREEMENT NO. 83A-C5 1-323-05-766-XX
NEBRASKA PUBLIC POWER DISTRICT
Plant Management Information System
Cooper Nuclear Station
VERIFICATION AND VALIDATION PLAN
DOCUMENT NO. 503-8500000-51
SAI-84/1525-264


SCIENCE APPLICATIONS, INC.
2109 W. Clinton Avenue, Suite 800, Huntsville, AL 35805 * (205) 533-5900

8403080273 840301 PDR ADOCK 05000298


AGREEMENT NO. 83A-C5 1-323-05-766-XX
NEBRASKA PUBLIC POWER DISTRICT
Plant Management Information System
Cooper Nuclear Station
VERIFICATION AND VALIDATION PLAN
DOCUMENT NO. 503-8500000-51
SAI-84/1525-264
MARCH 1, 1984

Technical Review:
Author: Nina Thomas / Date
Technical Reviewer / Date
Documentation Manager / Date
Configuration Manager / Date
Q.A. Manager / Date
Principal Investigator / Date
Division Manager / Date

SCIENCE APPLICATIONS, INC.
2109 W. Clinton Avenue, Suite 800, Huntsville, AL 35805 * (205) 533-5900


VERIFICATION AND VALIDATION PLAN
FOR
NEBRASKA PUBLIC POWER DISTRICT
COOPER NUCLEAR STATION
PLANT MANAGEMENT INFORMATION SYSTEM

Prepared by Science Applications, Inc.
Lynchburg, Virginia

March 1, 1984


RECORD OF REVISIONS

DATE        SECTION INVOLVED     PAGE NUMBER    REVISION NUMBER
03/01/84    COMPLETE REVISION                   0

PAGE COUNT SUMMARY

Section     Title                                                      Page Count
1           INTRODUCTION - NPPD PLANT MANAGEMENT INFORMATION
            SYSTEM V&V PLAN                                            1-1 THRU 1-6
2           V&V APPROACH                                               2-1 THRU 2-2
3           SYSTEMS AND SUBSYSTEMS COVERED BY V&V                      3-1 THRU 3-2
4           V&V ACTIVITIES                                             4-1 THRU 4-18
5           PMIS V&V SCOPE                                             5-1 THRU 5-6
6           RELATIONSHIP OF V&V TO PROJECT DEVELOPMENT                 6-1 THRU 6-7
7           INTEGRATION OF V&V WITH PROJECT DEVELOPMENT ACTIVITIES     7-1
8           REFERENCES                                                 8-1 THRU 8-2
APPENDIX A  VVID DIAGRAM                                               A-1 THRU A-2
APPENDIX B  NPPD VERIFICATION AND VALIDATION SCHEDULE                  B-1 THRU B-2
APPENDIX C  LIST OF FIGURES                                            C-1 THRU C-11

TABLE OF CONTENTS

                                                                    Page
1.  INTRODUCTION - NPPD PLANT MANAGEMENT INFORMATION SYSTEM
    VERIFICATION & VALIDATION PLAN
    1.1  V&V Objective                                              1-1
    1.2  V&V Background                                             1-1
    1.3  V&V Plan Purpose                                           1-2
    1.4  V&V Plan References                                        1-3
    1.5  V&V Plan Organization and Content                          1-4
2.  V&V APPROACH                                                    2-1
3.  SYSTEMS AND SUBSYSTEMS COVERED BY V&V
    3.1  V&V Emphasis                                               3-1
    3.2  ERF Upgrade Activities Outside of V&V Scope                3-2
4.  V&V ACTIVITIES
    4.1  System Requirements Verification                           4-2
    4.2  System Design Verification                                 4-4
    4.3  System Validation                                          4-7
         4.3.1  Validation Test Plan Development                    4-7
         4.3.2  Validation Testing                                  4-9
    4.4  Field Installation Verification                            4-10
    4.5  Final V&V Report                                           4-11
    4.6  V&V Documentation                                          4-12
5.  PMIS V&V SCOPE
    5.1  System Requirements Verification                           5-1
    5.2  System Design Verification                                 5-3
    5.3  System Validation                                          5-4
    5.4  Field Installation Verification                            5-5
    5.5  Final V&V Report                                           5-6
6.  RELATIONSHIP OF V&V TO PROJECT DEVELOPMENT
    6.1  V&V Team Independence                                      6-1
    6.2  V&V Requirements                                           6-1
    6.3  Development Documentation Required/Supplied for V&V        6-2
    6.4  Development Documentation Approval and Control             6-4
    6.5  Discrepancy Handling                                       6-4
    6.6  Configuration Management and Quality Assurance
         Documentation                                              6-5
7.  INTEGRATION OF V&V WITH PROJECT DEVELOPMENT ACTIVITIES
    7.1  V&V Milestones and Schedule                                7-1
    7.2  V&V Schedule Constraints                                   7-1
8.  REFERENCES                                                      8-1

APPENDICES

Appendix A  VVID Diagram                                            A-1
Appendix B  NPPD Verification and Validation Schedule               B-1
Appendix C  List of Figures                                         C-1

LIST OF FIGURES

Figure  Title                                                       Page
1-1     Relationship of V&V Activities                              1-6
4-1     Overview of V&V Activity Performance                        4-13
4-2     Perform System Requirements Verification                    4-14
4-3     Perform Design Verification                                 4-15
4-4     Perform System Validation                                   4-16
4-5     Perform Validation Testing                                  4-17
4-6     Verification and Validation Documentation                   4-18
6-1     V&V and Development Team Organizational Independence        6-6
6-2     Relationship of V&V to Project Development                  6-7
A-1     Conventions Used in V&V Interactive Description Diagrams    A-2

1. NPPD PLANT MANAGEMENT INFORMATION SYSTEM VERIFICATION AND VALIDATION PLAN

1.1 V&V Objective

The objective of the Verification and Validation (V&V) Program for the Nebraska Public Power District (NPPD) Cooper Nuclear Station Plant Management Information System (CNS PMIS) is to provide a quality system through independent technical review and evaluation.

The V&V effort described in this Plan meets the basic objective of providing necessary and sufficient documentation to demonstrate to the Nuclear Regulatory Commission (NRC) that an adequate independent technical evaluation has been made of the PMIS computer system.

1.2 V&V Background

Because software quality is an evolving technical discipline, and because of continuing concern with software quality, the Nuclear Regulatory Commission (NRC) has focused on the more technical approach of V&V for improving the quality of software systems, rather than on the procedural aspects of the nuclear quality assurance program.

In addition, the NRC has indicated in several internal memoranda that its review of the licensee's compliance with the NUREG-0696 (Reference 5) criteria will be based primarily on the V&V documentation. By taking this approach, the NRC is relying on the licensee to provide technical review documentation acquired through an independent V&V program, eliminating the need for a detailed technical review and independent testing by the NRC staff. Thus, the V&V approach and level of effort described in this V&V Plan are oriented toward providing documentation for NRC review of the V&V process used to ensure that quality has been built into the PMIS.

The level of V&V activity prescribed is a balance between standard industry procedures for nonsafety software and the industry approach used for safety system software. The V&V team staff is independent of the development team and quality assurance program.

This independence ensures that a separate technical evaluation of the system will be performed without programmatic bias.

1.3 V&V Plan Purpose

Figure 1-1 shows the relationship of this V&V Plan to the V&V activities.*

This V&V Plan defines the basic framework for V&V of the PMIS and shows how it is integrated with PMIS development.

The contents of this V&V Plan include an elaboration of the required V&V activities, an outline of the V&V documentation, and a summary of implementation requirements.

This V&V Plan is to be included in the SPDS Implementation Plan submitted to the NRC for review and approval (Figure 1-1, Activity 2).

This plan will be modified, if necessary, to reflect feedback from the evaluations performed by NPPD and the NRC. V&V procedures will be developed (Figure 1-1, Activity 3) to define the detailed activities required to implement the V&V evaluations and tests outlined in this plan. Tools and techniques to be used for conducting those activities will be presented in the V&V Procedures document.

Development and use of a functional capabilities matrix, evaluation checklists, a discrepancy reporting system, structured graphic techniques for analysis and presentation, and design walk-throughs are examples of techniques employed to implement a systematic, balanced V&V program.
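To illustrate the kind of technique named above, the short sketch below shows one way a functional capabilities matrix could be represented and checked in software. It is a minimal, hypothetical Python example: the capability names, identifiers, and coverage rule are assumptions made for illustration and are not taken from this Plan.

# Hypothetical functional capabilities matrix: each capability is mapped
# to the V&V evaluation items (checklist entries, tests) that cover it.
capabilities_matrix = {
    "SPDS parameter display":          ["requirements checklist R-3", "validation test VT-1"],
    "TSC data retrieval":              ["requirements checklist R-7"],
    "Meteorological data processing":  [],
}

# A systematic, balanced program requires every capability to be covered;
# capabilities with no recorded coverage would be raised for V&V attention.
for capability, coverage in capabilities_matrix.items():
    if not coverage:
        print(f"No V&V coverage recorded for: {capability}")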

* Note: V&V Interactive Description (VVID) diagrams are used throughout this Plan to depict the V&V activities and activity interactions. Appendix A describes the conventions used in these diagrams.

In the V&V Procedures document, procedures and appropriate tools and techniques will be defined for each of the following V&V activities:

• System Requirements Verification
• System Design Verification
• System Validation
  - Validation Test Plan Preparation
  - Validation Testing and Evaluation
• Field Verification

The V&V Procedures document may also be useful to NPPD staff on future efforts which incorporate V&V as a technical evaluation method for assuring system quality.

The V&V Plan and V&V Procedures will be followed throughout the V&V Activities (Figure 1-1, Activity 4).

Each of these V&V activities will be discussed in Section 4.

1.4 V&V Plan References

The following PMIS documents were referenced in preparing this Plan:

• NPPD Plant Management Information System Statement of Work (document number 1-323-05-766-00)
• Nebraska Public Power District Plant Management Information System, Cooper Nuclear Station Project Schedules (document 540-8500000-00)
• NPPD CNS PMIS Software Development Plan - Draft
• Verification and Validation for Safety Parameter Display Systems, NSAC-39, December 1981.

1.5 V&V Plan Organization and Content

Section 2 of this Plan defines the approach taken for V&V of the CNS PMIS and its subsystems.

Section 3 discusses the systems and subsystems to be covered by V&V and identifies the areas of primary emphasis.

Section 4 describes each of the V&V activities to be applied in evaluating the PMIS: System Requirements Verification, Design Verification, System Validation, Field Installation Verification and V&V Documentation.

Section 5 discusses the scope of V&V for the PMIS. Each of the V&V activities is discussed, and the V&V reports to be provided at the end of each activity are identified.

The V&V Team's relationship to Project Development, Quality Assurance, and Configuration Management is discussed in Section 6. This section describes the roles and responsibilities of NPPD, the SAI Project Management Team, the Development Team, and the V&V Team in the product development, evaluation, and approval process. The development documentation required for each phase of V&V is identified in this section.

The V&V milestones and schedules are presented in Section 7. Coordination of V&V activities with development activities is discussed.

References used throughout this document are listed in Section 8.


Appendix A describes the conventions used in the VVID diagrams, which are used to define V&V activities and their interrelationships. The V&V milestone and schedule chart is included as Appendix B.

[Figure 1-1: Relationship of V&V Activities. NRC V&V requirements and guidance, the NPPD Statement of Work, NSAC-39, and the PMIS Software Development Plan feed development of this V&V Plan; NPPD and the NRC evaluate and approve the Plan, with feedback incorporated; the V&V Team then develops the V&V Procedures and performs V&V against the system development documentation (system requirements, design documentation, and developer's test documentation) under NRC standards and regulations, producing the V&V Final Report.]

2. V&V APPROACH

The V&V program for the CNS PMIS is based on the NSAC-39 report prepared by SAI for the Nuclear Safety Analysis Center, with a few specific refinements. The refinements are a result of the V&V Team's experience, which has shown that the V&V effort: (1) must be tailored to the specific hardware and software development approach; (2) should not be on the critical path; (3) should be integrated with the project quality assurance program; and (4) must not interfere with the approval procedures of the customer.

To meet V&V objectives, the V&V effort is fully integrated with the development process. Some key issues which influence the selection of SAI's approach are the documentation requirements defined in the Statement of Work, the DISTRICT approval process, the quality assurance program, and the configuration management system. The integration of V&V with the development process is discussed in Sections 6 and 7.

A key aspect of the V&V approach is the use of both formal and informal V&V activities. An informal review is performed prior to the formal V&V activity. The informal review permits a brief verification effort prior to submittal of documentation to the DISTRICT for approval. The purpose of this informal activity is to identify obvious discrepancies prior to DISTRICT review. The detailed formal verification effort is performed after customer approval, i.e., after all DISTRICT-identified deficiencies are resolved. As a result, there will be fewer discrepancies identified in the formal V&V effort that must be resolved by the design team and that will require DISTRICT approval.


The balance between informal and formal review is chosen to give the design team the benefit of an external technical evaluation while reducing the effort required to resolve formally identified discrepancies.

Also, it will be advantageous to the DISTRICT to have the results of the informal V&V activity before starting its approval process. In addition to identifying discrepancies as early as possible in the development effort, this two-step V&V approach provides assurance to the DISTRICT that V&V will not be on the critical path and thus will not interfere with the project schedule. With this approach, the participation of DISTRICT staff in the V&V effort may also be instrumental in reducing the approval effort, since the same staff may participate in both V&V activities and DISTRICT approval. This approach will provide a significant benefit to the DISTRICT in both manpower utilization and schedule conformance.

Experience in implementing this two-step approach indicates that an improved product is achieved at minimum cost for the V&V effort. Documentation produced during the V&V effort provides assurance that an adequate technical evaluation has been performed. This approach depends on utilizing the documentation produced by the development team without unnecessary duplication of effort.

The specific documentation to be prepared as a result of the V&V effort is identified in Figure 4-6 and discussed in Section 4. For those activities which have both an informal and a formal V&V step, only the documentation prepared as part of the formal step is listed (the informal reports are for internal project use, although they will be made available to the DISTRICT as background material for use in its approval activity). The contents of the documentation will be consistent with the typical report contents described in NSAC-39.


3. SYSTEMS AND SUBSYSTEMS COVERED BY V&V

3.1 V&V Emphasis

V&V evaluation emphasis will be placed on the ability of the PMIS to meet the intent of the NUREG-0737, Supplement 1 requirements (Reference 4).

These NRC requirements cover Emergency Response Facility upgrade activities, which include the Safety Parameter Display System (SPDS), Technical Support Center (TSC), and Emergency Operations Facility (EOF) computer system functions provided by PMIS. Other PMIS functions, such as the Nuclear Steam Supply System (NSSS) functions, are not addressed in these NRC requirements and are not considered Emergency Response Facility upgrade activities.

Three approaches will be used in the V&V evaluation of PMIS. Approach I will be used in evaluating the SPDS functions. Approach II will be used in evaluating the TSC and EOF functions performed by PMIS. Approach III will be used in evaluating other plant computer functions performed by PMIS, which include the NSSS functions.

The three approaches are based on the relative importance of the PMIS functions with regard to the Emergency Response Capability upgrade requirements. Approach I is required for the SPDS because of the NRC's emphasis on these functions and because the SPDS requirements are the most stringent. TSC and EOF functions implemented by PMIS are evaluated using Approach II because they respond to Emergency Response Capability upgrade requirements; however, less stringent requirements have been placed on these functions. PMIS functions not related to the Emergency Response Capability upgrade activities are evaluated using Approach III, which will emphasize validation of the system's functions. NSSS functions are covered by Approach III V&V.


3.2 ERF Upgrade Activities Outside of V&V Scope

The scope of this V&V Plan does not include application of verification and validation to the field sensors, emergency operating procedures, operator training, or the Safety Analysis. Information from each of these components of the total Emergency Response Facility (ERF) upgrade activity will be used as background and supporting information in performing V&V evaluation of the PMIS subsystems covered by V&V.

Control Room Design Review and Human Factors Engineering Evaluation information may be used in the evaluation of the SPDS.

Validation of the functionality of the displays is addressed in the human factors engineering evaluation and is not part of this V&V program. The V&V Team will use human factors engineering evaluation results during the design verification.

Even though this V&V effort does not cover the above-mentioned areas, the information derived from each of those activities does affect some of the decisions made in V&V evaluation activities. If inconsistencies or problem areas are found in reviewing this important background information, the V&V Team will report these problems to the appropriate individuals. If the problems are significant and affect the performance of V&V, the V&V Team will report them as discrepancies.


4. V&V ACTIVITIES

Different levels of V&V programs are needed to meet the requirements of different systems, depending on the nature and impact of the system's intended use. For many military software development programs, there are as many as 10 V&V activities that must be performed to meet V&V program requirements. For safety-grade software, nuclear industry standards such as ANSI/IEEE-ANS-7.4.3.2-1982 indicate 8 V&V activity phases.

The V&V program activities described in this V&V Plan are based on the NSAC-39 report and are a practical balance with the size and complexity of the PMIS. The four V&V steps described in NSAC-39 and being applied for PMIS are System Requirements Verification (System Requirements Review), Design Verification (Design Review), Field Installation Verification (Field Verification Test), and System Validation (Validation Test and Report).* A Final Verification and Validation Report will be prepared to aid in the licensing process. This final report will summarize the results of the four activities listed above and the discrepancies found during the V&V evaluation. This balanced approach provides assurance that the system has been constructed in accordance with the system requirement specifications and is a cost-effective implementation of V&V.

Figure 4-1 shows an overview of the V&V activities to be applied in evaluating the PMIS, and the following paragraphs describe each of these activities.

* Note: The activity names shown in parentheses are the names used in NSAC-39.


4.1 System Requirements Verification

System Requirements Verification (Figure 4-1, Activity 1) is a review of the PMIS system requirements documentation against standards and regulations (NUREG-0737, Supplement 1; NUREG-0696; Reg. Guide 1.97; etc.). The object of this evaluation is to determine that the system functions meet the intent of NUREG-0737, Supplement 1 and that the NRC guidelines have been followed. The system requirements are the foundation on which the completed system must be designed, built, and accepted. The completed system must be validated against the System Requirements Document; thus the correctness and completeness of this document are essential. Errors and ambiguity in the System Requirements Document will likely lead to an unsatisfactory system.

System Requirements Verification is perhaps the most important V&V activity. The principal goal of this activity is to independently determine whether the requirements will result in a feasible and usable solution to the entire problem that meets NRC requirements. The requirements are reviewed for correctness, completeness, consistency, understandability, feasibility, testability, and traceability. The Requirements Verification also provides the basis for developing the system Validation Test Plan and reviewing the developer's Final Test Plan.

System Requirements Verification is separated into a two-phased activity to support DISTRICT review and minimize schedule impact.

An informal review is proposed prior to the DISTRICT review and approval of the requirements documentation. The V&V Team's findings will be furnished as comments to the DISTRICT and the Development Team. The formal System Requirements Verification will begin following DISTRICT approval. Figure 4-2 illustrates the flow of information among the V&V Team, Development Team, and DISTRICT Project Team for System Requirements Verification.


During the informal review of System Requirements (Figure 4-2, Activity 1), the V&V Team will collect necessary background information and review the System Requirements. Preliminary system requirements review documentation will be prepared during this System Requirements Verification step. V&V Team comments from the informal review of System Requirements are submitted to NPPD to facilitate its review and approval process (Figure 4-2, Activity 2). The V&V Team comments are also submitted to the SAI Huntsville Development Team for response. These comments should not be treated as formal discrepancies. Resolution of areas of concern prior to the formal evaluation step will help to expedite preliminary error correction and reduce the number of discrepancy reports generated as a result of the formal evaluation of the System Requirements.

Upon NPPD approval of the System Requirements Document, the Development Team may begin the system design phase and the V&V Team may begin the formal System Requirements evaluation (Figure 4-2, Activity 3).

The formal System Requirements evaluation encompasses: (1) evaluation of the System Requirements document against NRC requirements; (2) evaluation of the System Requirements document for technical adequacy to perform its intended function; and (3) evaluation of the System Requirements for testability and traceability.

Areas of concern and functions of importance are identified during the requirements evaluation. This information will be used in development of the Validation Test Plan (discussed in Section 4.3.1).

Discrepancies found during the formal evaluation step will be reported to NPPD and the Development Team, and will be included in the System Requirements Verification documentation. All major discrepancies should be resolved prior to the start of System Design Verification.


A System Requirements Verification Report will be generated to document the System Requirements Verification activities performed and the results of the System Requirements evaluation.

Procedures for performing the System Requirements Verification activity and for discrepancy reporting will be detailed in the V&V Procedures for NPPD CNS PMIS document (Reference 8).

4.2 System Design Verification

System Design Verification (Figure 4-1, Activity 2) is an evaluation of the PMIS hardware and software design against the verified PMIS system requirements.

Design Verification provides assurance that the system design complies with the system requirements; any extraneous functions are noted. Hardware design utilizing off-the-shelf items will not undergo independent design verification; however, vendor-supplied documentation for these components will be used as background information.

The design documentation will contain much more detail than the System Requirements and will provide details on how the requirements are to be met.

One of the key objectives is to determine that the design is consistent with the System Requirements.

Depending on the design documentation, one should correlate each requirement defined in the System Requirements Verification activity with the specific design feature which implements it.

This correlation can be used during System Validation to identify any additional Validation Tests which may be needed.
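To make the correlation concrete, here is a minimal sketch, written in Python and entirely hypothetical, of how a requirement-to-design-feature correlation might be recorded and then used to flag requirements that need additional Validation Tests. The identifiers and structure are assumptions for illustration, not items from this Plan.

# Hypothetical correlation of verified requirements to the design
# features that implement them and the validation tests that cover them.
correlation = {
    "REQ-SPDS-01": {"design_features": ["critical-parameter display task"],
                    "validation_tests": ["VT-07"]},
    "REQ-TSC-04":  {"design_features": ["TSC data link handler"],
                    "validation_tests": []},          # no covering test yet
    "REQ-NSSS-02": {"design_features": [],            # no implementing feature
                    "validation_tests": ["VT-31"]},
}

# Requirements with no implementing design feature are design findings;
# requirements with no covering test are candidates for the additional
# Validation Tests discussed in Section 4.3.
design_gaps = [r for r, c in correlation.items() if not c["design_features"]]
test_gaps = [r for r, c in correlation.items() if not c["validation_tests"]]

print("Design gaps:", design_gaps)             # -> ['REQ-NSSS-02']
print("Additional tests needed:", test_gaps)   # -> ['REQ-TSC-04']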

Design documentation will also be evaluated to determine system maintainability.


One of the objectives of the Design Verification is the independent assessment of the ability of the design to meet performance requirements. Such capabilities as time response, availability, man/machine interface, data validation, operating environment, dynamic range, and testability must be analyzed or considered as part of the design process. The V&V Team reviews the resulting documentation for correctness, feasibility, consistency, traceability, and understandability. Some of the performance requirements may be difficult to evaluate, but identifying these issues during design is much more efficient than discovering them during validation.

Off-the-shelf hardware components which have been proven in the field (i.e., component reliability was proven by long-term use) will not be evaluated except to determine that these components meet the System Requirements.

Design Verification is a two-phased activity to support the DISTRICT and minimize schedule impacts. An informal review is proposed prior to DISTRICT review and approval of the design documentation. The V&V Team's findings will be furnished as comments to the DISTRICT and the Development Team. The formal Design Verification will commence following DISTRICT approval.

Figure 4-3 illustrates the flow of information and interface relationships of the V&V Team, DISTRICT, and Development Team for Software Design Verification.

During the informal system design review step (Figure 4-3, Activity 1), the V&V Team will identify, collect, and review the system design and applicable background information.

The V&V Team will participate in the preliminary and critical design reviews (PDR and CDR).

During these reviews the Development Team presents the preliminary (PDR) and detailed (CDR) design to NPPD.

Results of the PDR and CDR may facilitate the V&V Team's identification of significant areas of concern to be focused upon during formal system design evaluation and stressed during System Validation.


Problems and concerns identified during the informal review step are reported to the Design Team. Informal reporting of problems and concerns in the preliminary stage of System Design Verification may reduce the number of discrepancy reports generated later, during the V&V Team's formal evaluation of the system design.

The Design Team is provided with an opportunity to respond to the problem reports and make any necessary design changes prior to the V&V Team's formal design evaluation activity. Any of these problems and concerns remaining open during the formal design evaluation will be treated as discrepancies.

The V&V Team's comments from the informal design step are submitted to NPPD to facilitate its review and approval of the design (Figure 4-3, Activity 2). Once the system design has been approved by NPPD, the V&V Team may begin the formal design evaluation (Figure 4-3, Activity 3).

The formal Design Verification encompasses: (1) correlation of the design to the requirements, (2) evaluation of the design for technical correctness, and (3) evaluation of the design documentation for completeness. Subsystem and total system performance are addressed during design evaluation.

Design-specific test information will have been acquired as a result of the system design evaluation activities. Areas of concern will be identified and listed for consideration during System Validation.

Problems and variances found during the formal design evaluation will be reported to NPPD and the Development Team and will be included in the System Design Verification Report. Major discrepancies should be resolved prior to the start of the system validation activity.

The System Design Verification activities and evaluation results are summarized and documented in the System Design Verification Report.


Detailed procedures for the System Design Verification steps and discrepancy reporting will be included in the V&V Procedures for NPPD CNS PMIS document (Reference 8).

4.3 System Validation System Validation (Figure 4-1, Activity 3) provides assurance that the final system complies with the system requirerents and thus that the completed g

system correctly performs the intended functions.

Demonstration of acceptable operation of implemented functions is accomplished through a planned testing and evaluation process.

2he objective of validation testing and evaluation is to provide an cnd-to-end check to determine that the system implements the required functions in compliance with the specified system criteria.

System Validation comprises two primary phases: (1) preparation of the Validation Test Plan, and (2) validation testing and evaluation (Figure 4-4). The foundation for this activity lies in the information derived from the System Requirements Verification, the System Design Verification, and the tests performed by the developers.

4.3.1 Validation Test Plan Development

The purpose of the Validation Test Plan is to define the approach to be taken during the validation testing. The test approach is designed based on the insight gained during System Requirements Verification and System Design Verification. The test approach will support demonstration of subsystem and total system functional capabilities. Subsystem interface and interaction are stressed in total system validation testing.


The test approach for validation testing includes:

• static and dynamic testing of the processing and display capabilities of the PMIS
• dynamic tests utilizing time-dependent data
• demonstration of diagnostic capability
• demonstration that SPDS performance is not adversely impacted by other PMIS functions.

The Validation Test Plan will include:

• identification of functions of concern, found during System Requirements Verification and Design Verification, for most thorough testing
• identification of test categories to be covered during the validation testing
• description of procedures to be used during the analysis of the developer's test plans
• description of procedures to be used during the monitoring and evaluation of the developer's tests
• identification of actions to be taken when a test fails
• description of procedures for defining additional tests if needed
• description of provisions for running additional tests if needed.


4.3.2 Validation Testing

Validation Testing activities are shown in Figure 4-5. Validation Testing is integrated with development test activities. The approach to validation testing is to:

• Review the system test plan and procedures to determine the developer's test coverage (Figure 4-5, Activity 1).
• Monitor the developer's formal factory testing to establish that the developer's system test plan is followed and that adequate test records are kept to support test evidence in project records (Figure 4-5, Activity 2).
• Evaluate the formal factory test results to determine system performance (Figure 4-5, Activity 2).
• Define additional test requirements if needed (Figure 4-5, Activity 3).
• Evaluate results of additional tests when available (Figure 4-5, Activity 4).
• Report any discrepancies found during validation testing (Figure 4-5, Activity 5).
• Produce the Validation Test Report (Figure 4-4, Activity 3).

These steps in validation testing reduce redundancy in the testing effort, minimize test schedule impacts, and ensure adequate test coverage to establish confidence in the completed system.
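As a purely illustrative aid, the sketch below shows how the first and fourth of these steps might be supported in software: comparing the developer's test coverage against the functions of concern and deriving additional test requirements. It is a hypothetical Python example; the function names and test identifiers are invented, and nothing here is prescribed by this Plan.

# Functions of concern identified during Requirements and Design
# Verification (hypothetical identifiers).
functions_of_concern = {"SPDS display update", "alarm processing", "data archiving"}

# Functions exercised by the developer's test plan, as determined from
# the review of the test plan and procedures (hypothetical test IDs).
developer_coverage = {
    "FT-01": {"SPDS display update"},
    "FT-02": {"alarm processing"},
}

covered = set().union(*developer_coverage.values())
additional_test_requirements = functions_of_concern - covered

print("Additional test requirements:", additional_test_requirements)
# -> {'data archiving'}: the V&V Team would define a test for this
#    function rather than repeat FT-01 or FT-02.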


Additional Validation Test Requirements defined by the V&V Team will be documented in the Validation Test Report. The V&V Team will define the additional tests and evaluate the results. These additional tests will be run at the earliest possible time which does not adversely impact the project schedule. If possible, the additional validation tests will be run during Factory Testing or Field Installation Testing. If tests must be run after the completion of Field Installation Testing, the results will be published as an appendix to the Validation Test Report.

The results from each of the Validation Testing activities will appear in the Validation Test Report.

4.4 Field Installation Verification

Field Installation Verification (Figure 4-1, Activity 4) is an evaluation of the validated system after it has been installed. It is a verification that the installed system is the one validated during System Validation (Figure 4-1, Activity 3).

As a minimum, Field Verification will ensure that each input signal is properly connected and that the signal range is consistent with the design. Verification that the information displayed is directly correlated with the sensor data input is an objective of field verification testing.
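For illustration only, the following sketch shows the shape such a per-signal field check could take in software. The signal name, engineering range, and tolerance are invented for the example and are not taken from the installation documentation.

def check_signal(name, design_lo, design_hi, sensor_value, displayed_value, tol=0.01):
    """Flag a field-verification finding if the reading falls outside the
    design range or the displayed value does not track the sensor input."""
    findings = []
    if not (design_lo <= sensor_value <= design_hi):
        findings.append(f"{name}: reading {sensor_value} outside design range "
                        f"[{design_lo}, {design_hi}]")
    if abs(displayed_value - sensor_value) > tol * max(abs(sensor_value), 1.0):
        findings.append(f"{name}: displayed {displayed_value} does not match "
                        f"sensor input {sensor_value}")
    return findings

# Hypothetical reactor-vessel level channel, in inches.
for finding in check_signal("RPV level", 0.0, 60.0,
                            sensor_value=35.2, displayed_value=34.1):
    print("Field verification finding:", finding)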

Field Installation Verification is accomplished through the V&V Team's periodic monitoring of the Developer's Site Installation and Acceptance Tests.


Discrepancies found during any of the field tests will be reported to NPPD and the Development Team and included in the Field Installation Verification documentation. Some problems which may be encountered are:

• code execution failure
• hardware component failure
• incompatibilities between PMIS and the field sensors
• environment-related failures
• inability of the operator to understand and use the Operator's Guide or Maintenance Manual.

A Field Installation Verification Report will be prepared to document the results of the installation verification activity. This report will reference the Developer's Test Plan and Test Procedures and the test result documentation referenced during the field tests. The report will include the V&V Team's evaluation remarks and recommendations resulting from the independent observation of the executed tests and the test result analysis.

4.5 Final V&V Report

The purpose of the V&V Final Report (Figure 4-1, Activity 5) is to summarize the V&V activities performed throughout the project and the results of those evaluation activities. The report provides the foundation for discussions of the scope and results of the V&V effort; it will be organized to aid in reviewing the adequacy of the validation effort and in providing confidence in the validated system. Traceability of the Verification and Validation activities throughout the project, identification and resolution of major discrepancies, and references to detailed documentation will be provided in the V&V Final Report.


4.6 V&V Documentation

The specific documentation to be prepared as a result of the V&V effort is identified in Figure 4-6.

A separate validation test plan and validation test report will be provided as a result of NSSS Functions V&V.

The contents of the documentation will be consistent with the typical report contents which are described in NSAC-39.

The V&V reports will be outlined in the V&V Procedures for NPPD CNS PMIS (Reference 8).


[Figure 4-1: Overview of V&V Activity Performance. Under the V&V Plan and Procedures and the applicable standards and regulations, the V&V Team performs Requirements Verification (producing the System Requirements Verification Report), Design Verification (Design Verification Report), System Validation (Validation Test Plan and Validation Test Report), and Field Installation Verification (Field Installation Verification Report), and then develops the V&V Final Report.]

[Figure 4-2: Perform System Requirements Verification. The V&V Team (which includes an NPPD participant) informally reviews the system requirements and furnishes comments; the NPPD Project Team reviews and approves the requirements, authorizing design to commence; the V&V Team then performs the formal System Requirements Verification against standards and regulations, producing the System Requirements Verification Report and discrepancy reports; the Development Team resolves the reported discrepancies.]

[Figure 4-3: Perform Design Verification. Not legible in this copy.]

[Figure 4-4: Perform System Validation. Inputs include the System Requirements Verification Report, the Design Verification Report, and the developer's test plan and procedures; under the V&V Plan and Procedures, the V&V Team prepares the Validation Test Plan, performs validation testing on the PMIS, and prepares the Validation Test Report.]

[Figure 4-5: Perform Validation Testing. The V&V Team reviews the developer's test plan and procedures, monitors factory tests and evaluates results, defines additional test requirements if needed, evaluates additional test results, and issues discrepancy reports.]

FIGURE 4-6: Verification and Validation Documentation

PROJECT
• V&V Plan
• V&V Procedures Manual
• Verification and Validation Final Report

PMIS
• System Requirements Verification Report
• System Design Verification Report
• System Validation Test Plan
• System Validation Test Report
• Field Installation Verification Report

NSSS Functions
• System Validation Test Plan
• System Validation Test Report

e System Validation Test Report

}

l 4-18 l-

SAI-84/1525-264 03/01/84 5.

PMIS VEV Smpe r

L h

A comprehensive V&V effort will be applied for the PMIS, with particular emphasis on the SPDS functions.

All of the activities shown in Figure 4-1 and described in Section 4 will be performed. These activities are:

• System Requirements Verification
• System Design Verification
• System Validation
• Field Installation Verification

This chapter identifies the specific PMIS and SPDS documentation to be evaluated during each V&V activity and the areas of emphasis during validation testing.

5.1 System Requirements Verification

The development documentation to be reviewed during System Requirements Verification includes:

• NPPD Plant Management Information System Statement of Work (document number 1-323-05-766-00)

The NPPD PMIS Statement of Work document represents the baseline for this contract. This document has been approved by NPPD and includes the NPPD specifications as well as the SAI Statement of Work. This document will serve as the System Requirements document.


The System Requirements document will be evaluated against the following NRC documentation:

• NUREG-0737, Supplement 1 (Regulations) (Reference 4)
• NUREG-0696 Guidelines (Reference 5)
• NUREG-0835 Guidelines (Reference 11)
• NUREG-0814 Guidelines (Reference 10)
• NUREG-0700 Guidelines (Reference 9)
• Reg. Guide 1.97, Revision 2 (Reference 6)
• Reg. Guide 1.23 (Reference 7)

NUREG-0737, Supplement 1 provides the regulatory requirements which must be met by PMIS. NUREGs 0696, 0835, 0814, 0700, and Regulatory Guides 1.97, Revision 2, and 1.23 are NRC guidelines which should be considered in design of the PMIS.

The NPPD PMIS Statement of Work will be evaluated against the NRC regulatory requirement and guidance documents. Any deviations from these documents will be treated as discrepancies.

The SPDS Safety Analysis prepared for NPPD CNS Plant Management Information System will be used as background information during SPDS Requirements Verification.

The SPDS Safety Analysis provides the basis for SPDS parameter selection and display design.


5.2 System Design Verification

The system design documentation to be reviewed during Design Verification includes:

• Volume 1, Design Document*
• Volume 2, Design Document*
• PMIS Functional Specification*
• Point I/O Summary List Document*
• Glossary**

The system design documentation will be evaluated against the verified PMIS system requirements which have been stated in the PMIS Statement of Work document.

V&V Team representatives will participate in the PDR and CDR for PMIS as part of the Software Design Verification activity. This will enhance the software design evaluation by allowing the V&V Team to gain an understanding of the system design in a short period of time. The V&V Team may also monitor the system unit tests (builds) to further increase understanding of the system design and performance.

* Note: These reports will be prepared by the SAI Development Team and are described in the NPPD CNS PMIS Software Development Plan.
** Note: The Glossary is anticipated to become an appendix to the PMIS Functional Specification.


The SPDS Safety Analysis prepared for the NPPD CNS Plant Management Information System will be used as background information during the SPDS design evaluation. The SPDS Safety Analysis provides the basis for SPDS parameter selection and display design. This document will be referenced extensively during evaluation of the SPDS design.

5.3 System Validation

The following development test documentation will be evaluated during System Validation for the PMIS:

• Developer's Test Plan*
• Developer's Test Procedures*
• Timing/Sizing Analysis**
• User's Manual*
• Operator's Manual*
• System Build Plan***
• Developer's Test Results*

* Note: These reports will be prepared by the SAI Development Team and are described in the NPPD CNS PMIS Software Development Plan.
** Note: Not a separate document.
*** Note: Not a deliverable.


As discussed in Section 4.3, the approach is to develop a validation test plan based on information derived from the System Requirements Verification and System Design Verification activities. The developer's test plan and test procedure documentation will be evaluated to determine how well the validation approach is supported by the development testing, which should include hardware diagnostic tests, system integration tests, and total system performance tests. The development test plan and procedures documentation will be incorporated into the Validation Test documentation.

The V&V Team will not repeat tests covered by the Developer's tests but will define any additional test requirements needed to support the validation test approach. The Developer's Timing/Sizing Analysis, User's Manual, and Operator's Manual will be reviewed by the V&V Team and referenced in the Validation Test Plan and Validation Test Report as needed.

The V&V Team will periodically monitor the developer's formal system testing to ensure that the developer's test plan and procedures are followed and that test results are documented to provide complete test documentation. The V&V Team will define any additional test requirements needed to assure comprehensive testing. Test results and test report analysis documentation from the developer's tests and additional tests will be summarized in the Validation Test Report.

5.4 Field Installation Verification

The V&V Team will evaluate the Developer's field installation test plan and test procedure documentation. Plans and procedures for the site installation and operational performance tests will be included with the factory tests in the Developer's test plan and test procedures document. The evaluation of field installation plans and procedures will be performed in conjunction with the factory test plan and procedures evaluation. The Developer's test documentation will be summarized in the Field Installation Verification Report.


The V&V Team will periodically monitor site installation and site acceptance tests performed by the Developers. Test results and analysis results from these tests will be summarized in the Field Installation Verification Report.

5.5 Final V&V Report

V&V activities and results from V&V of the PMIS subsystems will be summarized and documented in the V&V Final Report. The V&V Final Report will be organized to illustrate implementation of the V&V Plan and to summarize the results of the evaluations performed. Discrepancies and discrepancy resolutions will be summarized in the V&V Final Report.


6. RELATIONSHIP OF V&V TO PROJECT DEVELOPMENT

6.1 V&V Team Independence

The V&V Team's organizational independence is shown in Figure 6-1. The V&V function is performed by an SAI group which is separate from the SAI Development Team in Huntsville. This ensures objectivity in V&V evaluation, eliminates preconceived bias, and separates development project problems from the technical evaluations performed by the V&V Team.

6.2 V&V Requirements

A disciplined approach to project development, change control, and product approval is required for an efficient and cost-effective V&V program. The roles of each team involved in development and evaluation of the PMIS must be clearly defined to eliminate confusion between the teams and to reduce redundancies in the product development and evaluation process. Specifically, the V&V program is dependent on:

• An orderly development process.
• A Configuration Management program to provide document identification, documentation change control, and product status reporting.
• A clear understanding of the Quality Assurance function, so that evaluations performed by Quality Assurance will not be duplicated by V&V.
• A discrepancy reporting procedure that is integrated with the project development process.


Figure 6-2 shows how the V&V task is integrated into the development process. The following subsections discuss the V&V Team's relationship with Quality Assurance and Configuration Management.

6.3 Development Documentation Required/Supplied for V&V

Products (documentation) from each software development activity are the basis for V&V evaluation. It is important that the V&V Team have access to the most current and complete development documentation. Information transmitted verbally between NPPD and the Development Team, or between Development Team members, cannot be considered by the V&V Team during evaluation. The V&V Team must remain aware of any agreements made which permit deviation from the Statement of Work (Reference 2), and these agreements should be in writing. Incomplete or obsolete documentation complicates the V&V Team's task and may result in the reporting of discrepancies.

The following list identifies the specific PMIS development documentation to be reviewed during each V&V activity (these were also listed in Section 5).

V&V Activity                              PMIS Development Document
System Requirements Verification          NPPD Plant Management Information System
                                          Statement of Work
System Design Verification                Functional Specification
                                          Volume 1 Design Document
                                          Volume 2 Design Document
                                          Point I/O Summary List Document
                                          Timing/Sizing Analysis
Validation Test Plan Development,         System Build Plan
System Validation, and Field              Developer's Test Plan
Installation Verification                 Developer's Test Procedures
                                          User's Manual
                                          Operator's Manual

Other documentation, including Monthly Progress Reports, minutes of meetings, telephone call memos, and estimates to complete, will be used throughout V&V as background and support information.

6.4 Development Documentation Approval and Control

The development documentation products should be approved by QA and Project Management and placed under Configuration Management prior to formal V&V activities being performed. The formal product approval and control process for PMIS is shown in Activities 1, 2, and 3 of Figure 6-2.

In summary, the development products are submitted to Quality Assurance for review and approval. The Quality Assurance-approved document is then submitted to the Configuration Management department for identification and change control. Once the document has been placed under formal control, it is transmitted to the V&V Team for formal evaluation.

6.5 Discrepancy Handling

Problems and inconsistencies found throughout Verification and Validation will be reported by the V&V Team according to a formal procedure.

The V&V Team will review discrepancy resolutions and subsequent system changes.

Discrepancy and discrepancy resolution reports generated throughout V&V will be included in the V&V records.

Some of the typical problems that may be identified and reported as discrepancies are:

• deviations from NRC requirements and the Statement of Work
• deviations from documentation requirements or development procedures
• documentation inconsistencies
• incorrect design logic or implementation
• incorrect results (from system and validation testing)

Standard discrepancy report forms will be available for reporting discrepancies throughout V&V.
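Purely as an illustration of what such a standard form might capture, the sketch below models a discrepancy report and its resolution lifecycle in Python. The field names and status values are invented for the example; they are not the forms prescribed by Reference 8.

from dataclasses import dataclass

@dataclass
class DiscrepancyReport:
    """Illustrative discrepancy record (hypothetical fields)."""
    report_id: str
    activity: str         # e.g. "System Design Verification"
    description: str
    status: str = "open"  # open -> resolved -> closed after V&V review

    def resolve(self, resolution):
        self.resolution = resolution
        self.status = "resolved"

    def close_after_review(self):
        # The V&V Team reviews the resolution and any resulting system
        # changes before the report is closed and filed in the V&V records.
        if self.status == "resolved":
            self.status = "closed"

dr = DiscrepancyReport("DR-014", "System Design Verification",
                       "Design does not trace to requirement REQ-SPDS-01")
dr.resolve("Design document revised, Rev. B")
dr.close_after_review()
print(dr.report_id, dr.status)  # -> DR-014 closed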


Discrepancy reporting and discrepancy resolution evaluation procedures will be detailed in the V&V Procedures for NPPD CNS PMIS (Reference 8).

6.6 Configuration Management and Quality Assurance Documentation

The V&V Team requires access not only to system development documentation but also to Configuration Management and Quality Assurance records.

Change reports and documentation status records maintained by Configuration Management will be used by the V&V Team as supplemental information, allowing the V&V Team to remain cognizant of development status during V&V.

One of the V&V activities is to ensure that the appropriate changes have been implemented and reflected in the documentation and that discrepancies have been resolved.

To reduce duplication of effort, reviews and evaluations performed and documented by Quality Assurance personnel will not be repeated by the V&V Team.

The documentation resulting from Quality Assurance reviews and evaluations should be available to the V&V Team.


[Figure 6-1: V&V and Development Team Organizational Independence. The chart shows the SAI corporate organization (President; Engineering, Energy, Software Systems, and Sciences groups), with the V&V Administrator, Quality Assurance, Utilities Operation, and Project Controls separate from the PMIS Project Manager and the PMIS Development Team. The V&V Team includes a participant from NPPD and reports outside the development chain.]

[Figure 6-2: Relationship of V&V to Project Development. Developers perform a development activity under development procedures; the product is submitted to QA for review (products not approved are returned); the QA-approved product passes to Configuration Management for control; the controlled product is then evaluated by the V&V Team under the V&V Plan, with discrepancy reports reviewed and resolved before Project Management establishes completeness and delivers the product to NPPD.]

7. INTEGRATION OF V&V WITH PROJECT DEVELOPMENT ACTIVITIES

7.1 V&V Milestones and Schedule

The V&V activities have been integrated with the PMIS development and NPPD approval activities. V&V activities have been planned so that there is minimum interference with the system development and testing milestones and schedules. V&V planning takes into account the system development milestones and aligns V&V milestones with them to minimize conflict. The V&V schedule and milestones are shown in Appendix B.

7.2 V&V Schedule Constraints

The PMIS development test schedule and shipment date have the greatest impact on the V&V milestone schedule. System Requirements Verification and System Design Verification activities are scheduled so that they are complete in advance of formal system testing. The results of these V&V activities may affect the PMIS software and hardware design; therefore, time will be allotted for the development team to make the necessary changes to the design, code, and test documentation prior to formal testing and shipment of the system to CNS.


Validation Testing is usually performed prior to shipment of the system to the site. To accommodate the shipment date and prevent conflicts with the PMIS formal test schedule, the validation approach will be to rely on the Developers to perform a majority of the functional tests during factory testing, with the V&V Team assuming a monitoring role. Site Installation and Field Tests will be monitored by the V&V Team as part of Field Installation Verification.


8. REFERENCES

1. Software Development Plan, Plant Management Information System, Cooper Nuclear Station, SAI-83-5082-HU, January 20, 1984.
2. NPPD Plant Management Information System Statement of Work (document number 1-323-05-766-00).
3. NSAC-39, Verification and Validation for Safety Parameter Display Systems, prepared by SAI for the Nuclear Safety Analysis Center, December 1981.
4. Supplement 1 to NUREG-0737, Requirements for Emergency Response Capability (Generic Letter No. 82-33), December 17, 1982.
5. NUREG-0696, Functional Criteria for Emergency Response Facilities, February 1981.
6. Regulatory Guide 1.97, Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant and Environs Conditions During and Following an Accident, Revision 2, December 1980.
7. Regulatory Guide 1.23, Onsite Meteorological Programs, February 17, 1972.
8. Verification and Validation Procedures for Nebraska Public Power District Cooper Nuclear Station Plant Management Information System, SAI-84/1024-264, January 31, 1984.
9. NUREG-0700, Guidelines for Control Room Design Reviews, September 1981.
10. NUREG-0814, Methodology for Evaluation of Emergency Response Facilities, August 1981.
11. NUREG-0835, Human Factors Acceptance Criteria for the Safety Parameter Display System, October 1981.

8-2

APPENDIX A

VVID Diagram

The Verification and Validation Interactive Description (VVID) diagram is one of the tools used in V&V planning and implementation. Figure A-1 illustrates the conventions used in VVID diagrams. VVIDs are effective for illustrating process activities, inputs, outputs, relationships between activities, and external impacts on the activities.

A-1

[Figure A-1: Conventions Used in V&V Interactive Description Diagrams. Inputs enter an activity box (the process, a task or subtask) from the left; outputs (reports or intermediate products) exit to the right. Constraints (external considerations which affect the process) enter from the top, and resources (who contributes to the task) enter from the bottom. An interaction arrow (revision based on review comments) returns review results to the activity. Note: a stripe at the top of an activity box indicates that there is a drawing which shows further detail of the activity.]
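The notation can also be expressed as a simple record. The following Python sketch is a hypothetical illustration of the VVID conventions; the class and field names mirror the labels in Figure A-1 and are not defined by the plan itself.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class VVIDActivity:
        """One activity box in a VVID diagram."""
        name: str                                             # process (task or subtask)
        inputs: List[str]                                     # enter from the left
        outputs: List[str]                                    # reports or intermediate products
        constraints: List[str] = field(default_factory=list)  # external considerations, from the top
        resources: List[str] = field(default_factory=list)    # who contributes, from the bottom
        has_detail_drawing: bool = False                      # stripe at top of the box

    # Example instance patterned on Figure 4-3 (wording assumed, not quoted):
    design_verification = VVIDActivity(
        name="Perform Design Verification",
        inputs=["Design documentation"],
        outputs=["Design Verification Report"],
        constraints=["V&V plan and procedures", "Standards and regulations"],
        resources=["V&V team"],
        has_detail_drawing=True,
    )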

APPENDIX B

NPPD Verification and Validation Schedule

B-1

[Schedule chart, page B-2: NPPD Verification and Validation Schedule. The scanned bar chart plots the V&V activities and milestones by month against the PMIS development schedule; individual task bars and dates are not recoverable from the image.]

APPENDIX C

List of Figures

The following is a list of all figures contained in this report, together with a copy of each for easy reference.

Figure  Title                                                       Page
1-1     Relationship of V&V Activities                              C-2
4-1     Overview of V&V Activity Performance                        C-3
4-2     Perform System Requirements Verification                    C-4
4-3     Perform Design Verification                                 C-5
4-4     Perform System Validation                                   C-6
4-5     Perform Validation Testing                                  C-7
4-6     Verification and Validation Documentation                   C-8
6-1     V&V and Development Team Organizational Independence        C-9
6-2     Relationship of V&V to Project Development                  C-10
A-1     Conventions Used in V&V Interactive Description Diagrams    C-11

C-1

[Figure 1-1, page C-2: Relationship of V&V Activities (copy of the figure from Section 1); the flow diagram is not recoverable from the scanned image.]

[Figure 4-1, page C-3: Overview of V&V Activity Performance (copy of the figure from Section 4). The diagram shows the V&V activities in sequence, each governed by the V&V plan and procedures: perform system requirements verification (producing the System Requirements Verification Report), perform design verification (Design Verification Report), perform system validation (Validation Test Plan and Validation Test Report), perform field installation verification (Field Installation Verification Report), and develop the V&V final report.]

[Figure 4-2, page C-4: Perform System Requirements Verification (copy of the figure from Section 4). The diagram shows an informal V&V review of the system requirements with comments to the developers, NPPD review and approval to commence design, formal requirements verification against standards and regulations (producing the System Requirements Verification Report and discrepancy reports), and resolution of requirements discrepancies by the development team. The V&V team includes an NPPD participant.]

[Figure 4-3, page C-5: Perform Design Verification (copy of the figure from Section 4). The diagram shows preparation for design verification, an informal review of the design documentation (which includes human factors engineering results) with comments to the developers, NPPD approval to commence coding, formal design verification (producing the Design Verification Report and discrepancy reports), and resolution of design discrepancies by the development team. The V&V team includes an NPPD participant.]

[Figure 4-4, page C-6: Perform System Validation (copy of the figure from Section 4). The diagram shows development of the Validation Test Plan from the System Requirements Verification Report, the Design Verification Report, the developer's test plan and procedures, and applicable test standards and regulations, followed by validation testing of the PMIS and preparation of the Validation Test Report.]

[Figure 4-5, page C-7: Perform Validation Testing (copy of the figure from Section 4); the flow diagram is not recoverable from the scanned image.]

FIGURE 4-6

Verification and Validation Documentation

PROJECT
• V&V Plan
• V&V Procedures Manual
• Verification and Validation Final Report

PMIS
• System Requirements Verification Report
• System Design Verification Report
• System Validation Test Plan
• System Validation Test Report
• Field Installation Verification Report

NSSS Functions
• System Validation Test Plan
• System Validation Test Report

C-8

[Figure 6-1, page C-9: V&V and Development Team Organizational Independence. Organization chart showing that the V&V team and the PMIS development team report through separate branches of the company organization under the president. Note: the V&V Team includes a participant from NPPD.]

[Figure 6-2, page C-10: Relationship of V&V to Project Development (copy of the figure from Section 6, described there).]

[Figure A-1, page C-11: Conventions Used in V&V Interactive Description Diagrams (copy of the figure from Appendix A, described there).]