ML20203K737

Rev 1 to S0123-606-1-15-1, DRMS Software Verification & Validation Plan
Person / Time
Site: San Onofre (Southern California Edison)
Issue date: 07/24/1995
From: Edelman M, Piquemal P, Purpura G
SOUTHERN CALIFORNIA EDISON CO.
To:
Shared Package: ML20203K553
References
S0123-606-1-15, S123-606-1-15, NUDOCS 9803050216
Download: ML20203K737 (40)


Text

ENCLOSURE 4
DRMS SOFTWARE VERIFICATION AND VALIDATION PLAN

9803050216 980302
PDR ADOCK 05000361
PDR

[Vendor document review stamp]
1. APPROVED - Work may proceed.
2. APPROVED EXCEPT AS NOTED - Make changes and resubmit.
3. NOT APPROVED - Correct and resubmit for review.

SOUTHERN CALIFORNIA EDISON COMPANY, 1995
PGD-00055 (PGD0055A.DOC)

DRMS SOFTWARE VERIFICATION AND VALIDATION PLAN

Reviewed by: Pascal Piquemal, Mike Edelman, George Purpura

RECEIVED SEP 26 1995 - SITE FILE COPY

MGP INSTRUMENTS
Suite 150, 5000 Highlands Parkway
Smyrna, GA 30082

PGD-00055 / REVISION 1 (PGD0055A.DOC)

REVISION LOG:

Revision #   Date      Revised Pages   Comments
0            3/31/95   N/A             Original issue
1            7/19/95   All

Page -ii-

PGD-00055 / REVISION 1 (PGD0055A.DOC)

TABLE OF CONTENTS

1.0 Appendix A: Software Verification and Validation Plan, MGPI Doc 46120 FA .......... 4

Page -iii-

[Fax header: MGP INSTRUMENTS SA, 95-07-31 13:50, p. 2]

MGP INSTRUMENTS
Radiation Monitoring System

[cover page artwork not reproduced]

Software Verification and Validation Plan

Written by: J. Nadaud          Date:        Visa:
Checked by: S. Dimanche        Date:        Visa:
Approved by: L. Chapelot       Date:        Visa:

INTERNAL DISPATCHING           EXTERNAL DISPATCHING
J. Nadaud                      F. Schulcz, MGP Instruments Inc.
D. Canessa                     F. Chlosta
B. Laisne                      A. Pommier
G. Ruaro                       J.P. Guillemot

Number of pages: 36

Ind.   Date       Modification designation
F      20/07/95   SCE comments
E      31.03.95   MGPI Inc. comments
D      20.02.95   Modification according to customer's comments
C      22.08.94   MGP internal revision for customer's comments
B      16.05.94   MGP internal revision
A      28.02.94   Original edition

46120 FA

Update Table (p 2)

Index/Date   Modified pages                               Origin and designation of the modification        Written by
A/28.02.94                                                Original version                                  S. Dimanche, T. Mottin
B/16.05.94   Pg 8,13,14,17,23,24,26,27,28,33,45,49,51,71  MGP internal resumption                           S. Dimanche
C/22.08.94   All pages                                    English translation modifications                 S. Dimanche
             Pg 6,7,8                                     Replace "safety software" by "related to safety software"
             Pg 9                                         Syntax
             Pg 10                                        Added PQS 46800
             Pg 12,13,18,24,28,29,30,32,44,65             Modify source code verification
             Pg: all of them                              Replace Merlin Gerin Provence by MGP Instruments
             Pg 21,22                                     V&V team
             Pg 64                                        Update document list
D/24.01.95   All pages                                    Reorganized V&V plan according to IEEE 1012-1986 standard   J. Nadaud
E/30.03.95   Pg 5,26,27,30                                Removed Ringhals and SCE references               J. Nadaud
             Pg 14                                        Removed dates
             Pg 36                                        Added LPU/Si, IC, PIPS and SAS CTF
F/17/07/95   p 5,7,17,20,34,35,36                         SCE comments                                      J. Nadaud
             p 13                                         Added new staff members

46120 FA

Table of contents (p 3)

1. Purpose .................................................. 5
2. Referenced documents ..................................... 6
3. Definitions .............................................. 7
   3.1. Definitions ......................................... 7
   3.2. Abbreviations ....................................... 9
   3.3. Acronyms & Notations ................................ 10
   3.4. Documentation naming ................................ 10
4. Verification and Validation Overview ..................... 11
   4.1. Organization ........................................ 11
   4.2. Master Schedule ..................................... 14
   4.3. Resources Summary ................................... 16
   4.4. Responsibilities .................................... 17
   4.5. Tools, Techniques, and Methodologies ................ 17
5. Life-Cycle Verification and Validation ................... 19
   5.1. Management of V&V ................................... 19
        5.1.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 19
        5.1.2. Risks ........................................ 20
   5.2. Concept Phase V&V ................................... 21
        5.2.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 21
        5.2.2. Risks ........................................ 21
   5.3. Model Development Phase V&V ......................... 22
        5.3.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 22
        5.3.2. Risks ........................................ 22
   5.4. Model Operation Phase V&V ........................... 23
        5.4.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 23
        5.4.2. Risks ........................................ 23
   5.5. Requirement Phase V&V ............................... 24
        5.5.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 24
        5.5.2. Risks ........................................ 24
   5.6. Design Phase V&V .................................... 25
        5.6.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 25
        5.6.2. Risks ........................................ 25
   5.7. Implementation Phase V&V ............................ 26
        5.7.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 26
        5.7.2. Risks ........................................ 27
   5.8. Test Phase V&V ...................................... 28
        5.8.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 28
        5.8.2. Risks ........................................ 29
   5.9. Delivery Phase V&V .................................. 29
        5.9.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 29
        5.9.2. Risks ........................................ 29
   5.10. Installation and Operation Phase V&V ............... 30
        5.10.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 30
        5.10.2. Risks ....................................... 30
   5.11. Maintenance Phase V&V .............................. 31
        5.11.1. V&V tasks, Inputs/Outputs, Resources and Responsibilities ... 31
        5.11.2. Risks ....................................... 31
6. Software Verification and Validation Reporting ........... 32
   6.1. Task Reporting ...................................... 32
   6.2. V&V Phase Summary Report ............................ 32
   6.3. Anomaly Report ...................................... 32
   6.4. Final Software Verification & Validation Report ..... 33
7. Verification and Validation Administrative Procedures .... 34
   7.1. Anomaly Reporting and Resolution .................... 34
   7.2. Task Iteration Policy ............................... 34
   7.3. Deviation Policy .................................... 34
   7.4. Control Procedures .................................. 34
   7.5. Standards, Practices, and Conventions ............... 34
APPENDIX A: List of all documents to be generated under this SVVP ... 35

46120 FA


1. Purpose

This Software Verification and Validation Plan was developed by MGP Instruments to establish a program that defines a series of tasks and their associated inputs and outputs. Upon completion, this program will ensure and demonstrate that the safety related RMS software complies with established software practices, meets system requirements specifications, and exhibits a high level of reliability.

For standard software such as MASS, the SV&V process follows MGP Instruments internal rules.

This plan meets the intent of IEEE Std 1012-1986, reference 2.5; however, it deviates from the approach recommended in that standard in the following areas:

• Model Development and Model Operation phases have been added to improve the specification and validation process.

• Installation, Check-out and Operation Phase tasks will be addressed by MGP Instruments Inc. specifically for American projects.

• Testing, documentation, naming, and generation differ in some cases to conform with MGP Instruments internal rules (from QA Manual, reference 2.8).

• The schedule is part of the Software Development Plan, reference 2.9.

• Methods are part of the Software V&V Tool Plan, reference 2.13.



2. Referenced documents

The reference standards used for guiding this document and for SV&V implementation are as follows:

2.1. ANSI/IEEE Std 729-1983, IEEE Standard Glossary of Software Engineering Terminology
2.2. ANSI/IEEE Std 830-1984, Guide to Software Requirements Specifications
2.3. ANSI/IEEE Std 1016-1987, Recommended Practice for Software Design Descriptions
2.4. ANSI/IEEE Std 829-1983, Standard for Software Test Documentation
2.5. ANSI/IEEE Std 1012-1986, Standard for Software Verification and Validation Plans
2.6. ANSI/ANS 10.4-1987, Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry
2.7. IEC 880, Software for Computers in the Safety Systems of Nuclear Power Stations, Edition 1986

The applicable MGPI documents are:

2.8. MGP QI 04.07, Quality Instruction: Software-related Documents
2.9. MGP/RMS PDL 45202, Software Development Plan
2.10. MGP/RMS PQS 46800, RMS Specific Quality Assurance Plan
2.11. MGP/RMS SQAP 45203, Software Quality Assurance Plan
2.12. MGP/RMS RPC 45880, C Programming Rules
2.13. MGP/RMS SVVTP 110835, Software V&V Tool Plan



3. Definitions

3.1. Definitions

application software. Downloaded software which covers most of the functions of an equipment item (acquisition, processing, alarms, display, ...).

acceptance testing. Formal testing conducted to determine whether or not a system satisfies its acceptance criteria and to enable the customer to determine whether or not to accept the system.

anomaly. Anything observed in operation of software that deviates from expectations based on previously verified software products or reference documents.

basic software. Resident software which provides the minimum functions of an equipment item (maintenance mode, serial link).

CHRONOSOFT. MGP Instruments' software management tool: the database used to record every software item generated by MGP Instruments' R&D Department and to trace any change.

component testing. Testing conducted to verify the implementation of the design for one software element (for example, a function or module) or a collection of software elements. Synonymous with unit testing.

concept documentation. For example: statement of need, advance planning report, project initiation memo, system definition documentation, procedures, policies, and customer acceptance criteria/requirements.

development team. Team of qualified engineers in charge of applying the software development life cycle.

developer. Member of the development team.

Independent SV&V team. Team of qualified engineers in charge of complementary software verification and validation tasks.

Interface documentation. Any document that describes how the system is interfaced with external devices. It is made up of parameter tables or technical specifications for the communication protocol.

Integration testing. An orderly progression of testing in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated.

life-cycle phase. Any period of time during software development or operation that may be characterized by a primary type of activity (such as design or testing) that is being conducted. These phases may overlap one another; for V&V purposes, no phase is concluded until its development products are fully verified.


safety related software. Software for RMS safety related equipment.

software component. C language module (set of functions).

software testing. The process of testing an integrated hardware and software sub-system (such as an LPU or an LDU) to verify that the system meets its specified requirements.

software verification and validation plan. A plan for the conduct of software verification and validation.


system integration testing. An orderly progression of testing in which sub-system elements are combined and tested to verify that the complete networked system meets its specified requirements.

test case. Documentation, part of the test file, specifying inputs, predicted results, and a set of execution conditions for a test item.

test procedure. Documentation, part of the test report, specifying a sequence of actions for the execution of a test.

traceability. The degree to which a relationship is established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; for example, the degree to which the requirements and design of a given software component match.

validation. The process of evaluating software at the end of the software development process to ensure compliance with software requirements.

validator. Member of the SV&V team who carries out validation.

verification. The process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.

verifier. Member of the project team who carries out verification.


3.2. Abbreviations

ANS       American Nuclear Society
ANSI      American National Standards Institute
CQ        Quality Contract, as part of the QAM
DU        Display Unit (LDU or RDU)
EPROM     Erasable Programmable Read Only Memory
IC        Ionization Chamber
IEC       International Electrotechnical Commission
IEEE      Institute of Electrical and Electronics Engineers
IO        Input/Output
IQ        Quality Instruction, as part of the QAM
LDU       Local Display Unit
LPU       Local Processing Unit
MASS      Maintenance and Setup Software for Windows (non safety related)
MGPI Inc  MGP Instruments Inc (Atlanta)
MGPI      MGP Instruments SA (Lamanon)
NCS       Non Conformities Sheet
PC        Personal Computer (IBM compatible)
PIPS      Passivated Implanted Planar Silicon
QAM       MGP Instruments SA Quality Assurance Manual
QA        Quality Assurance
RDU       Remote Display Unit
RMS       Radiation Monitoring System
SAS       Spectrum Analyzer System
Si        Silicon
SQAM      Software Quality Assurance Manager
SV&V      Software Verification and Validation
V&V       Verification and Validation

3.3. Acronyms & Notations

DM      Department Manager
PM      Project Manager
QA      Quality Assurance Team
STM     Software Technical Manager (development team manager)
TM      Technical Manager (product manager)
V&V     Software V&V team
V&VM    Software V&V team Manager

3.4. Documentation naming

ATP     Acceptance Test Plan                  | PTA    Plan de Test d'Acceptation
ATR     Acceptance Test Report                | RTA    Rapport de Test d'Acceptation
CPR     C Programming Rules                   | RPC    Règles de Programmation en C
CTF     Component Test File                   | DTU    Dossier des Tests Unitaires
LSD     List of Software Documents            | LDL    Liste des Documents Logiciels
PT      Parameter Table                       | TP     Table des Paramètres
R       Report                                | CR     Compte Rendu
SCCF    Software Change Control File          | DSL    Dossier de Suivi Logiciel
SDD     Software Design Descriptions          | AOG    Analyse Organique Générale
                                              | AOD    Analyse Organique Détaillée
SDP     Software Development Plan             | PDL    Plan de Développement Logiciel
SIF     Software Integration File             | DIL    Dossier d'Intégration du Logiciel
SIR     Software Integration Report           | RIL    Rapport d'Intégration du Logiciel
SITF    System Integration Test File          | DIS    Dossier d'Intégration Système
SITR    System Integration Test Report        | RIS    Rapport d'Intégration Système
SMF     Software Manufacturing File           | DFL    Dossier de Fabrication du Logiciel
SQAP    Software Quality Assurance Plan       | PQL    Plan de Qualité Logiciel
SPF     Software Programming File             | DP     Dossier de Programmation du Logiciel
        (Source Code Listings)                |
SRS     Software Requirements Specifications  | AF     Analyse Fonctionnelle
STF     Software Test File                    | DEL    Dossier d'Essai Logiciel
STR     Software Test Report                  | REL    Rapport d'Essai Logiciel
SVVP    Software V&V Plan                     | PVL    Plan de V&V du Logiciel
SVVFR   Software V&V Final Report             | RFVVL  Rapport Final de V&V du Logiciel
SVVPSR  Software V&V Phase Summary Report     | RPVVL  Rapport de Phase V&V du Logiciel
SVVTP   Software V&V Tool Plan                | MVVL   Méthodes de V&V du Logiciel
TS      Technical Specifications              | ST     Spécification Technique

4. Verification and Validation Overview

4.1. Organization

The overall objective of the V&V Plan for the RMS project is to assure that the program promotes quality and a highly reliable product through an independent process of technical review and evaluation.

The implementation of a computer project involves management, development and qualification actions. The types of actions and their relationships are summarized in the following chart:

MANAGEMENT                     QUALIFICATION
  Organization                   Quality Assurance
  Resources                      Quality control
  Follow up                      Verification/validation

                  PROJECT

DEVELOPMENT
  Concept phase
  Design phase
  Manufacturing phase

The project organization is illustrated below:

Management
  R&D Department Manager (DM)
  Project Manager (PM)
    Software Technical Manager (STM), heading the development team
    Independent SV&V Team Manager (V&VM), heading the SV&V team

The individuals on staff expected to participate in the V&V effort are:

Name                   Function                            Team
Francis Schulcz        R&D Department Manager              DM
Laurent Chapelot       QA Manager                          QA
Laurent Mesisy         Software QA Engineer                QA
Alain Pommier          Project Manager                     PM
Julien Nadaud          Software Technical Manager          STM, DEV
Bruno Laisne           Analyst and software manager        DEV
Pascal Galibert        Analyst Engineer                    DEV
François Conte Devoix  Analyst Engineer                    DEV
George Esteve          Analyst                             DEV
Didier Bonnardel       Analyst                             DEV
Jean Claude Sauvan     Analyst                             DEV
Sylvie Dimanche        Independent V&V Manager             V&VM, V&V
Michel Martin          Analyst Engineer                    V&V
Laurent Battini        Analyst Engineer                    V&V
Marc Essayan           Analyst Engineer                    V&V, DEV
Jean Pierre Vidal      Analyst                             V&V
Philippe Stolpe        Technical manager                   TM
Fabienne Brault        Technical manager                   TM

In addition to the above, independent staff members from Merlin Gerin Grenoble were involved during the concept phase as external consultants:

Name                   Function                            Team
Chantal Carbonnel      Merlin Gerin/SES QA manager         MG/QA
Jean Louis Bergerand   Merlin Gerin/SES Software Expert    MG/SE
Claude Esmenjaud       Merlin Gerin/SES Software Expert    MG/SE
François Miloni        Merlin Gerin/SES Software Expert    MG/SE
Henri Blanc            Merlin Gerin Software Coordinator   MG/SC
Thierry Mottin         V&V Expert Consultant               V&V

The Project Manager verifies V&V tasks and has authority for resolving issues raised by V&V tasks.

Software QA managers have authority for approving V&V products.

4.2. Master Schedule

At the end of the software verification and validation, a formal issue of the SV&V Final Report will be provided. SV&V progress will be reported routinely.

The SVVP overview shown in Figure 1 summarizes the life-cycle model used for the RMS project. It follows the sample model presented in the IEEE 1012 Standard, but the following points have been added:

• A model development and operation phase have been added to increase the probability of doing the product right the first time. Software from the model phase will also be used for hardware testing.
• The installation, checkout and operation phases are assigned to MGP Instruments Inc.

Use the following diagram as an aid to define the bulleted text in Figure 1:

[diagram not reproduced]

[Figure 1 (p 15): SVVP overview of the RMS life-cycle model; content not legible in this copy]

4.3. Resources Summary

The resources required for the SV&V beyond manpower include the following systems:

1 LPU/Si
1 LPU/PIPS
1 LPU/SAS
1 LPU/IC
1 LPU/IO
1 LDU
1 RDU or a 2nd LDU
PMC
MASS

4.4. Responsibilities

The primary focus of the V&V life cycle is to assure that the quality/time/cost criteria defined for the project are accomplished. Quality has primary priority; however, this must also be accomplished on time and within budget.

Here is a summary of responsibilities for SV&V tasks performed over the development life-cycle:

Life-Cycle Phase             SV&V Task                            Responsibility
Concept                      Documentation Evaluation             Department Manager (DM)
Model Development            Model SRS Evaluation                 Software Technical Manager (STM)
                             Interface Evaluation                 STM
Model Operation              Technical Choice Validation          STM
                             Hardware Qualification               TM
Requirements                 SRS Evaluation                       V&V Manager
                             Interface documentation Evaluation   V&V Manager
                             Software Test File Generation        V&V Manager
Design                       Design Evaluation                    DEV
                             Software Test File Revision          V&V Manager
                             Integration Test Generation          DEV
                             Component Test Generation            STM or V&V Manager
Implementation               Code Analysis                        STM or V&V Manager
                             Software Procedure Generation        V&V Manager
                             Software Test File Evaluation        PM, DEV
                             Software Integration Revision        DEV
                             Component Test Execution             DEV, V&V Manager
Test                         Test Procedure Generation            V&V Manager
                             Integration Test Execution           DEV
                             Software Test Execution              V&V Manager
Delivery                     V&V Final Report Generation          STM
Maintenance                  SVVP Revision                        STM
                             Anomaly Evaluation                   PM or STM
                             Proposed Change                      PM
                             Phase Task Reiteration               STM
Installation and Operation                                        MGP Instruments Inc.

4.5. Tools, Techniques, and Methodologies

The methods used in the V&V process include review by cognizant engineers with independent verification and formal review proceedings.

The tools utilized for SV&V tasks are as follows:

Document evaluation:
• Detailed review.
• LOGISCOPE for source code analysis using metrics. This automated tool was originally used to evaluate our source code generation (no longer used).

Document generation:
• Networked PC-DOS/WINDOWS.
• Microsoft Word for Windows version 2.0 or later.

Component testing:
• Networked PC-DOS/WINDOWS.
• Borland C or Microtec C/68000 environment for generation and execution of test programs.

Integration testing:
• Networked PC-DOS/WINDOWS.
• Microtec C/68000 development chain (compiler + linker) for the generation of software to be integrated,
• 68000 emulator,
• target hardware for the software.

Software testing:
• Networked PC-DOS/WINDOWS.
• Microtec C/68000 development chain (compiler + linker) for the generation of software to be accepted by the validator,
• 68000 emulator,
• target hardware for the software to be accepted by the validator,
• simulation means for the operational environment of the target hardware.

Methods are described in the Software Verification and Validation Tool Plan.


5. Life-Cycle Verification and Validation

The schedule is described in the Software Development Plan, reference 2.9. Methods and criteria are described in the Software V&V Tool Plan.

Outputs from phase tasks are used to develop the corresponding V&V phase summary reports and are ongoing inputs to the SVVR. Outputs of V&V tasks become inputs to subsequent life-cycle V&V tasks.

5.1. Management of V&V 5.1.1. V&V tasks, inputs / Outputs, Resources and Responsabill+1es V& V Task Required erput Requtred Ourput Resources Responsab!!ities Software Verir6 cation and Validation Plan (SVVP) Development schedules, SVVP and Updates STM Generation. Generate SVVP for all lifegcle phases based concept documentation, upon available documentation. Update SVVP for each life. SRS, Interface cycle phase, particularly prior to delivery phasi. 'the SVVP documentation, SDD, is a living document. source code listings, executable code, user documentation, proposed changes Seftware VAV Tool Plan (SVVTP) Generation. Produce a Concept documentation. SVVTP and STM, VAVM document that identify the special software tools, techniques, Proposed changes Updates and methodologies employed by the V&V efrort. This is in complement of MOP Instruments Quality Assurance Manual.

( Basellae Change Apeasment. Evaluate proposed software Proposed changes Updated SVVP changes (anomaly correction, performance enhancements, requirements changes, clarifications) for effects on previously completed VAV tasks. When changes are made, plan iteration of affected tasks which includes reperforming previous V&V tasks or initiating new VAV tasks to address tLe software changes created by the cyclic or iterative development process.

Managemeni Review. Condue; periodic review of V&V Development Schedules, Task Reporting, PM, STM, VAVM cffort, technical accomplishments, resource utilization, future VAV Outputs VAV Phase planning, nsk management. Support daily management of Summary Reports V&V activities.


46120 FA

5.1.2. Risks

The risks identified to date are:

• The V&V schedule directly depends upon the product software development schedule.

Impact: Any change to the development schedule has a direct impact on the V&V schedule.

Action: Periodically (at least once a month), carry out the following:

- assign a person to update the workload of the development and V&V tasks remaining to be performed,

- update the development and V&V schedules, making sure they are properly synchronized.

• V&V personnel require capabilities and attitudes that differ from those encountered during software development.

Impact: The departure and/or redeployment of a verifier/validator may have an adverse effect on the schedule. A reduction in the motivation of the verifier/validator may have a negative effect on the quality of the product.

Action: Periodically (each week), a meeting is held between the members of the V&V team and the project manager. This meeting promotes teamwork:

- each member of the V&V team reports work progress and expresses any technical and personal communication problems encountered,

- events are anticipated before they occur, thus avoiding technical and motivation problems.

• The projection of the workload involved in the V&V tasks may be incorrect (over- or underestimated, workload not well distributed).

Impact: Adverse effect on the schedule.

Action: The periodic monitoring (monthly) described earlier detects these shortcomings and defines corrective actions.


5.2. Concept Phase V&V

5.2.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Concept Documentation Evaluation. Evaluate the concept documentation to determine if the proposed concept satisfies user needs and project objectives (performance and safety goals). Identify major constraints of interfacing systems and constraints or limitations of the proposed approach. Assess the criticality of each software item.
  Required Input: Concept documentation (system SRS, planning, IEC 880, procedures, customer requirements)
  Required Output: Task reporting
  Resources/Responsibilities: DM, PM, STM, MG/QA, MG/SE, MG/SC

V&V Task: Project Initialization Review. Initialize the project according to the development plan and to the system requirements.
  Required Input: Concept documentation, Software Development Plan
  Required Output: Review report
  Resources/Responsibilities: DM, PM, STM, MG/QA, MG/SE, MG/SC

5.2.2. Risks

• As the software requirements are being implemented, we explore alternate and improved ways of accomplishing the same.

Impact: This continuous improvement of knowledge requires modification and revision of documents which have already been written.

Action: The use of electronic documentation should ease the modification of these documents.

• Implementing new technologies induces feasibility risk.

Impact: Be aware during development that initial performance may fall short of meeting all customer expectations.

Action: A software prototype was developed based upon the general specification.

Experimenting with the prototype facilitates refining and improving the software until it meets all software requirement specifications.


5.3. Model Development Phase V&V

5.3.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Model System SRS Evaluation. Evaluate the SRS for correctness, consistency, completeness, accuracy, and readability.
  Required Input: Concept documentation, system SRS, interface documentation
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, DEV, TM

V&V Task: Model LPU and DU SRS Evaluation. Evaluate the LPU and DU SRS for correctness, consistency, completeness, accuracy, and readability.
  Required Input: System SRS, LPU SRS, DU SRS, interface documentation
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, DEV, TM

V&V Task: Model Specific LPU SRS Evaluation. Evaluate the specific LPU SRS for correctness, consistency, completeness, accuracy, and readability.
  Required Input: System SRS, LPU SRS, DU SRS, LPU/SI SRS, LPU/PIPS SRS, LPU/SAS SRS, interface documentation
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, DEV, TM

V&V Task: Model Interface Documentation Evaluation. Evaluate interface documentation requirements for correctness, consistency, completeness, accuracy, and readability.
  Required Input: RMS Network TS, DU PT, LPU/SI PT, LPU/PIPS PT, LPU/SAS PT
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, DEV, TM

5.3.2. Risks

Not applicable.

5.4. Model Operation Phase V&V

5.4.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Technical Choice Validation. Check that the technical choices taken from the concept phase are applicable.
  Required Input: Model SRS, interface documentation, executable code
  Required Output: Task reporting, model anomaly reports
  Resources/Responsibilities: DEV, TM

V&V Task: Hardware Qualification. This hardware-oriented phase allows technical managers to use the complete product (hardware and software together); they can detect anomalies and evaluate product ergonomics. Detected "bugs" are corrected immediately. Improvements are taken as SRS revisions.
  Required Input: Model SRS, interface documentation, executable code
  Required Output: Model anomaly reports
  Resources/Responsibilities: TM

5.4.2. Risks

• Delays related to the use of the prototypes may result in anomaly reports during the development phase.

Impact: Many problems with the development cycle, involving many different documents.

Action: The anomaly reports should be generated very strictly, but the procedure for managing changes during the development cycle should be optimized in order to minimize the impact of those anomalies.


5.5. Requirement Phase V&V

5.5.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: System SRS Evaluation. Evaluate SRS requirements for correctness, consistency, completeness, accuracy, and readability.
  Required Input: Concept documentation, system SRS, interface documentation, standards
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, V&V, DEV, TM

V&V Task: LPU and DU SRS Evaluation. Evaluate the LPU and DU SRS for correctness, consistency, completeness, accuracy, and readability.
  Required Input: System SRS, LPU SRS, DU SRS, interface documentation, standards
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, V&V, DEV, TM

V&V Task: Specific LPU SRS Evaluation. Evaluate the specific LPU SRS for correctness, consistency, completeness, accuracy, and readability.
  Required Input: System SRS, DU SRS, LPU SRS, LPU/SI SRS, LPU/PIPS SRS, LPU/SAS SRS, LPU/IC SRS, LPU/IO SRS, interface documentation, standards
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, V&V, DEV, TM

V&V Task: Interface Documentation Evaluation. Evaluate interface documentation for correctness, consistency, completeness, accuracy, and readability.
  Required Input: RMS Network TS, DU PT, LPU/SI PT, LPU/PIPS PT, LPU/SAS PT, LPU/IC PT, LPU/IO PT, Algorithm Interface SRS
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DM, PM, V&V, DEV, TM

V&V Task: Software Test File Generation. Plan software testing to determine if the software satisfies system objectives.
  Required Input: Concept documentation, SRS, interface documentation
  Required Output: DU STF, LPU STF, LPU/SI STF, LPU/PIPS STF, LPU/SAS STF, LPU/IC STF, LPU/IO STF, anomaly reports
  Resources/Responsibilities: V&V

5.5.2. Risks

Not applicable.

5.6. Design Phase V&V

5.6.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: LPU and DU SDD Traceability and Evaluation. Evaluate the LPU and DU SDD for correctness, consistency, completeness, accuracy, and testability.
  Required Input: LPU SDD, DU SDD, interface documentation
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: V&V, DEV

V&V Task: Specific LPU SDD Traceability and Evaluation. Evaluate the specific LPU SDD for correctness, consistency, completeness, accuracy, and readability.
  Required Input: LPU SDD, DU SDD, LPU/SI SDD, LPU/PIPS SDD, LPU/SAS SDD, LPU/IC SDD, LPU/IO SDD, interface documentation, standards
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: V&V, DEV

V&V Task: Component Test Generation. Plan component testing to determine if critical software elements correctly implement component requirements. Components are selected for testing because they include algorithms or calculations, they cannot be tested easily in integration testing, and they cannot be validated easily. For example, hardware interface functions are part of the integration tests and are not part of the component tests.
  Required Input: SRS, SDD, interface documentation
  Required Output: DU Component Test File (CTF), LPU CTF, LPU/SI CTF, LPU/PIPS CTF, LPU/SAS CTF, LPU/IC CTF, LPU/IO CTF, task reporting, anomaly reports
  Resources/Responsibilities: V&V, DEV

5.6.2. Risks

Not applicable.

5.7. Implementation Phase V&V

5.7.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Source Code Analysis. Cross-reading to ensure that the source code is clear, well commented, and understandable. Check that the source code complies with development standards. This evaluation includes detailed design and headers. Headers will include the name of the reader and the date of verification to trace this task.
  Required Input: SDD, C programming rules, DU/Base, DU/Appli, LPU/SI, LPU/PIPS, LPU/IC, LPU/SAS, LPU/IO source code listings
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: DEV

V&V Task: Source Code Evaluation. Cross-reading of the detailed design description that is part of the source code to ensure that the detailed design is clear and understandable and that headers are correct. Assess source code quality using metrics; the LOGISCOPE software will be used for automatic evaluation. This evaluation will be done only on the first release of the LPU/SI source code.
  Required Input: LPU/Base and common LPU/Appli source code listings (SPF110512, SPF110519, SPF110403, SPF110516)
  Required Output: Task reporting, anomaly reports
  Resources/Responsibilities: V&V

V&V Task: Component Test Case and Procedure Generation. Develop test cases and procedures for component testing.
  Required Input: SDD, interface documentation, source code listings
  Required Output: DU Component Test File (CTF), LPU CTF, LPU/SI CTF, LPU/PIPS CTF, LPU/SAS CTF, LPU/IC CTF, LPU/IO CTF, task reporting, anomaly reports
  Resources/Responsibilities: V&V, DEV

V&V Task: Software Test File Case Generation. Develop test cases for software testing.
  Required Input: SDD, interface documentation, source code listings
  Required Output: DU STF, LPU STF, LPU/SI STF, LPU/PIPS STF, LPU/SAS STF, LPU/IC STF, LPU/IO STF, anomaly reports
  Resources/Responsibilities: V&V

V&V Task: Integration Test Case and Procedure Generation. Develop test cases and procedures for integration testing.
  Required Input: SDD, interface documentation, source code listings
  Required Output: DU SIF, LPU SIF, LPU/SI SIF, LPU/PIPS SIF, LPU/SAS SIF, LPU/IC SIF, LPU/IO SIF, anomaly reports
  Resources/Responsibilities: DEV

V&V Task: System Integration Test File Generation. Plan system testing to determine if the software satisfies system objectives. Criteria for this determination are: (a) compliance with functional requirements as a complete software end item in the system environment; (b) performance at hardware, software, user, and operator interfaces; (c) adequacy of user documentation; (d) performance at boundaries (network full loading) and under stress conditions.
  Required Input: Concept documentation, SRS, interface documentation, user documentation, SDD
  Required Output: SITF
  Resources/Responsibilities: DEV

V&V Task: Acceptance Test Plan Generation. Plan acceptance testing to determine if the software correctly implements system and software requirements in the operational environment.
  Required Input: Concept documentation, SRS, interface documentation, user documentation, SDD
  Required Output: Acceptance Test Plan
  Resources/Responsibilities: DEV

V&V Task: Component Test Execution. Perform component testing as required by the component test procedures. Analyze results to determine that the software correctly implements the design.
  Required Input: SDD, interface documentation, source code listings
  Required Output: DU CTF, LPU CTF, LPU/SI CTF, LPU/PIPS CTF, LPU/SAS CTF, LPU/IC CTF, LPU/IO CTF, task reporting, anomaly reports
  Resources/Responsibilities: V&V, DEV

5.7.2. Risks

• The list of available hardware for system tests and for acceptance tests may not be defined at this time.

Impact: The test plans may not be completed during this phase.

Action: Envision the maximum number of tests to be performed and, if necessary, postpone the completion of the test plans to the next phase.


5.8. Test Phase V&V

5.8.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Software Test Procedure Generation. Develop test procedures for software testing. These procedures are part of software test report preparation.
  Required Input: SDD, interface documentation, source code listings
  Required Output: DU/Base STR, DU/Appli STR, LPU/Base STR, LPU/Appli STR, LPU/SI STR, LPU/PIPS STR, LPU/SAS STR, LPU/IC STR, LPU/IO STR, anomaly reports
  Resources/Responsibilities: V&V

V&V Task: Integration Test Execution. Perform integration testing in accordance with test procedures. Analyze results to determine if the software implements the software requirements and design and that software components function correctly together.
  Required Input: Source code listings, executable code
  Required Output: DU SIR, LPU SIR, LPU/SI SIR, LPU/PIPS SIR, LPU/SAS SIR, LPU/IC SIR, LPU/IO SIR, anomaly reports
  Resources/Responsibilities: DEV

V&V Task: Software Test Execution. Perform software testing in accordance with the software test procedures described in the software test report documents. Analyze results to determine if the software satisfies requirements.
  Required Input: Source code listings, executable code, user documentation
  Required Output: DU/Base STR, DU/Appli STR, LPU/Base STR, LPU/Appli STR, LPU/SI STR, LPU/PIPS STR, LPU/SAS STR, LPU/IC STR, LPU/IO STR, anomaly reports
  Resources/Responsibilities: V&V

V&V Task: System Integration Test Execution. Perform system testing in accordance with test procedures. Analyze results to determine if the software satisfies system objectives. Document and trace all testing results as required by the test plan.
  Required Input: Source code listings, executable code, user documentation
  Required Output: System Integration Test Report, anomaly reports
  Resources/Responsibilities: DEV, TM

V&V Task: Acceptance Test Execution. Perform acceptance testing in accordance with test procedures under formal configuration control. Analyze results to determine if the software satisfies acceptance criteria. Document and trace all testing results as required by the Test Plan.
  Required Input: Source code listings, executable code, user documentation
  Required Output: Acceptance Test Reports, anomaly reports
  Resources/Responsibilities: DEV, TM


5.8.2. Risks

• Late availability of hardware can slow down the test and iteration phase as well as the validation test phases.

Impact: Planning schedule slip.

Action: Plan to have hardware available as soon as possible. If hardware from other projects is available, use it until the final hardware is available.

• If anomalies are found in the course of system validation, the development process will need to be reiterated.

Impact: High cost & slip on delivery dates.

Action: Make as many tests as possible in the prototype stage to identify possible problems. Critical points, network performance, etc. should be tested freely by the developers during the prototype phase. This risk has to be scheduled from the beginning and should not cause delays.

5.9. Delivery Phase V&V

5.9.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: V&V Final Report Generation. Summarize all V&V activities and results, including the status and disposition of anomalies, in the V&V final report (see 6.3 of this plan).
  Required Input: All V&V phase summary reports
  Required Output: V&V Final Report
  Resources/Responsibilities: DEV

5.9.2. Risks

Not applicable.

5.10. Installation and Operation Phase V&V

5.10.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Installation and Operation. These tasks are assigned to MGP Instruments Inc.
  Required Input: Installation package, concept documentation, SRS, interface documentation, source code listings, executable code, user documentation, SVVP, SVVR
  Required Output: Anomaly reports
  Resources/Responsibilities: MGP Instruments Inc.

5.10.2. Risks

• MGP Instruments Inc. may encounter difficulties installing and supporting software bought from a subcontractor.

Impact: Installation time too long.

Action: MGP Instruments will allocate and train enough USA-based personnel to support the software V&V and installation program. MGPI SA personnel support to American operations must be accounted for, planned, and exercised.


5.11. Maintenance Phase V&V

5.11.1. V&V Tasks, Inputs/Outputs, Resources and Responsibilities

V&V Task: Software V&V Plan Revision. For software verified and validated under this plan, revise the SVVP to comply with new constraints based upon available documentation.
  Required Input: Concept documentation, SRS, interface documentation, source code listings, executable code, user documentation, proposed changes, SVVR
  Required Output: Updated SVVP
  Resources/Responsibilities: STM

V&V Task: Anomaly Evaluation. Evaluate the severity of anomalies in software operation. Analyze the effects of anomalies on the system.
  Required Input: Anomaly reports
  Required Output: Task reporting
  Resources/Responsibilities: DEV

V&V Task: Proposed Change Assessment. Assess all proposed modifications, enhancements, or additions to determine the effect each change would have on the system. Determine the extent to which V&V tasks would be iterated.
  Required Input: Proposed changes
  Required Output: Task reporting
  Resources/Responsibilities: DEV, V&V

V&V Task: Phase Task Iteration. For approved software changes, perform the V&V tasks necessary to ensure that planned changes are implemented correctly, all documentation is complete and up to date, and no unacceptable changes have occurred in software performance.
  Required Input: Approved changes from iterated tasks, anomaly reports, required phase outputs of iterated tasks
  Required Output: Task reporting
  Resources/Responsibilities: DEV, V&V

5.11.2. Risks

• During the maintenance phase, the developers may be assigned to other projects and may not be readily available to assist.

Impact: Lack of resources for immediate response to problems.

Action: Plan that resources used for development stay on the project long enough to complete the maintenance phase work.


6. Software Verification and Validation Reporting

This section describes how the results of implementing the Plan will be documented.

6.1. Task Reporting

A report for each of the tasks/sub-tasks performed in the SVVP shall be developed and issued as they are completed. Listed below are the different reports to be generated:

Management: Progress reporting and internal notes
Documentation Evaluation: Documentation checking forms with review reports
Source Code Analysis: Source file header update with history
Component Testing: Component Test File (CTF)
Software Integration: Software Integration Report (SIR)
Software Testing: Software Test Report (STR)
System Integration Testing: System Integration Test Report (SITR)
Acceptance Testing: Acceptance Test Report (ATR)
Others: Meeting reports or internal notes

6.2. V&V Phase Summary Report

A Phase Summary Report shall summarize the results of V&V tasks performed in each of the following life-cycle phases: Concept, Model, Model Operation, Requirements, Design, Implementation, and Test. Each V&V Phase Summary Report shall contain the following:

1. Description of SV&V tasks performed
2. Summary of task results
3. Summary of anomalies and resolution
4. Recommendations

6.3. Anomaly Report

An anomaly report shall document each anomaly detected in the SV&V effort. The content of the report and the administrative controls for its use are provided in section 7.1.
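The authoritative content of an anomaly report is defined by the SQAP referenced in section 7.1. Purely as an illustrative sketch, with field names that are assumptions rather than taken from that procedure, an anomaly record and a simple severity triage might look like this:

```python
from dataclasses import dataclass

# Hypothetical anomaly record; field names are illustrative only.
# The authoritative report content is defined in the SQAP (section 7.1).
@dataclass
class AnomalyReport:
    report_id: str
    phase: str          # life-cycle phase where the anomaly was detected
    description: str
    severity: str       # e.g. "critical", "major", "minor" (assumed scale)
    resolved: bool = False

def unresolved(reports):
    """Return the anomalies still open, most severe first."""
    order = {"critical": 0, "major": 1, "minor": 2}
    open_reports = [r for r in reports if not r.resolved]
    return sorted(open_reports, key=lambda r: order.get(r.severity, 99))
```

Such a triage would let the phase summary reports list open anomalies in order of severity, consistent with the evaluation task described in section 5.11.1.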


6.4. Final Software Verification & Validation Report

The final report shall include a summary of SV&V activities and results. Any deviation from the SV&V Plan will be noted. Summaries shall include positive findings as well as references to discrepancies.

The final report will also contain conclusions and recommendations based on the above summaries.

The SV&V results reported in each SV&V Phase Summary Report will be summarized in an SV&V final report. The report will be organized to facilitate evaluation of the quality of the RMS software and of the SV&V program itself. It shall be in the following format:

- Summary of all SV&V Phase Summary Reports with task results, anomalies, and resolutions

- Assessment of Overall Software Quality

- Conclusions and Recommendations
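As a minimal sketch of how per-phase summary data could roll up into the final-report totals described above (the function name and the structure of a "phase summary" record here are assumptions for illustration, not part of the plan):

```python
# Hypothetical roll-up of phase summary data into final-report totals.
# Each phase summary is assumed to carry anomaly counts; real summaries
# follow the four-item format given in section 6.2.
def summarize_phases(phase_summaries):
    """Aggregate per-phase anomaly counts for the SV&V final report."""
    totals = {"phases": 0, "anomalies": 0, "resolved": 0}
    for summary in phase_summaries:
        totals["phases"] += 1
        totals["anomalies"] += summary["anomalies"]
        totals["resolved"] += summary["resolved"]
    # Open anomalies are those reported but not yet resolved.
    totals["open"] = totals["anomalies"] - totals["resolved"]
    return totals
```

A non-zero "open" count at delivery would be the kind of finding the final report must carry forward as a discrepancy reference.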


7. Verification and Validation Administrative Procedures

7.1. Anomaly Reporting and Resolution

Refer to 2.11, MGP/RMS SQAP 45203.

7.2. Task Iteration Policy

A change request regarding a version results in the following processing with respect to the SV&V life cycle:

- analysis of the impact of the change (identification of the items involved and the degree of modification),

- repetition of the V&V cycle on the items which change, in order to check that the modifications have been taken into account in version n+1.

When the change request occurs during development, the software version index does not change, but a sub-version index is managed by the software configuration tool.

When it occurs after acceptance of a version, the software version index is incremented.
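The indexing rule above can be sketched as follows (a minimal illustration only; in practice the indices are managed by the software configuration tool, and the names used here are assumptions):

```python
# Sketch of the version-index rule in section 7.2 (names are illustrative).
# During development: the version index is unchanged and the sub-version
# index advances. After acceptance of a version: the version index is
# incremented (the sub-version is assumed to restart at zero).
def next_index(version, subversion, accepted):
    """Return the (version, sub-version) pair after a change request."""
    if accepted:
        return version + 1, 0
    return version, subversion + 1
```

For example, a change request against version 2 during development yields sub-version 2.4, while the same request arriving after acceptance yields version 3.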

7.3. Deviation Policy

When a deviation from the SVVP has been detected or is predicted during future activities, the MGP Instruments Quality Contract CQ 02 03 is applicable.

A deviation report is written and examined by the quality assurance and management departments. This report is attached to the SVVR.

7.4. Control Procedures

As indicated in the SQAP, the items produced by the V&V life cycle (documents, test programs, test sets, and results) are stored using the same procedures and in the same place as the other software products.

7.5. Standards, Practices, and Conventions

See paragraph 4.

APPENDIX A: List of all documents to be generated under this SVVP

Software Test Files (STF) and Test Reports (STR)

DU STF                        46432     DU STR                        46639
LPU/Base STF                  46430     LPU/Base STR                  46637
LPU Common Application STF    46631     LPU Common Application STR    46636
LPU/SI STF                    46633     LPU/SI STR                    46640
LPU/PIPS STF                  46634     LPU/PIPS STR                  46641
LPU/IC STF                    46635     LPU/IC STR                    46642
LPU/SAS STF                   46636     LPU/SAS STR                   46643
LPU/IO STF                    110496    LPU/IO STR                    110497

Software Integration Files (SIF), Software Integration Reports (SIR), and Component Test Files (CTF)

DU/Base SIF                   46702     DU/Base SIR                   44704
DU/Application SIF            46703     DU/Application SIR            46706
LPU/Base SIF                  46673     LPU/Base SIR                  46476
LPU Common Application SIF    46674     LPU Common Application SIR    46676
LPU/Algorithm Interface SIF   110662    LPU/Algorithm Interface SIR   110663
LPU/SI CTF                    111224
LPU/SI SIF                    46677     LPU/SI SIR                    46676
LPU/PIPS CTF                  111226
LPU/PIPS SIF                  46662     LPU/PIPS SIR                  46663
LPU/IC CTF                    111227
LPU/IC SIF                    46667     LPU/IC SIR                    46666
LPU/SAS CTF                   1112M
LPU/SAS SIF                   46697     LPU/SAS SIR                   46696
LPU/IO SIF                    110493    LPU/IO SIR/CTF                110494

Others

Software V&V Final Report         110834
Software V&V Tool Plan            110838
System Integration Test File      110SM
System Integration Test Report    110837
