ML100550680

UFTR-QA1-06.1, UFTR Digital Control System Upgrade, Software Test Plan - SIVAT Test
Person / Time
Site: 05000083
Issue date: 02/17/2010
From: Dugan E, Ghita G, Haghighat A (Univ. of Florida)
To: Office of Nuclear Reactor Regulation
References: UFTR-QA1-06.1
Download: ML100550680 (28)


Text

UF/NRE Project ID: QA-1    UFTR QUALITY ASSURANCE DOCUMENT    Revision 0    Copy 1    Page 1 of 28

Title: UFTR DIGITAL CONTROL SYSTEM UPGRADE
UFTR-QA1-06.1, Software Test Plan - SIVAT Test

Prepared by: Prof. Ed Dugan          (Signature)     Date:
Reviewed by: Dr. Gabriel Ghita       (Signature)     Date:
Approved by: Prof. Alireza Haghighat (Signature)     Date:

THE DISTRIBUTION LIST OF THE DOCUMENT

No.  Name  Affiliation  Signature  Date
1.
2.
3.
4.
5.
6.

THE LIST OF THE REVISED PAGES OF THE DOCUMENT

Revision No.  Reviewed by  Approved by  The Modified Pages  Date

TABLE OF CONTENTS

1. Purpose and Scope ........ 6
2. Reference ........ 7
   2.1 UFTR Documents ........ 7
   2.2 AREVA NP Inc. Documents ........ 7
   2.3 Industry Standards ........ 7
   2.4 NRC Documents ........ 7
3. Abbreviations and Acronyms ........ 8
4. Test Items ........ 9
   4.1 Functions to be Tested ........ 9
   4.2 Features to be Tested ........ 10
   4.3 Features Not to be Tested ........ 10
5. Approach ........ 11
   5.1 Test Specification ........ 11
   5.2 Test Procedure ........ 11
   5.3 Test Preparation ........ 11
   5.4 Test Execution ........ 12
      5.4.1 Test of Input Submodules ........ 12
      5.4.2 Test of I&C Functions ........ 12
      5.4.3 Test of Output Submodules ........ 13
   5.5 Comprehensiveness ........ 13
6. Item Pass/Fail Criteria ........ 14
7. Suspension Criteria and Resumption Requirements ........ 15
8. Test Deliverables ........ 16
   8.1 Test Specification ........ 16
   8.2 Test Procedure ........ 18
   8.3 Test Log ........ 19
   8.4 Test Incident Report ........ 20
   8.5 Test Summary Report ........ 20
9. Testing Tasks ........ 22
10. Environmental Needs ........ 23
   10.1 Hardware ........ 23
   10.2 Software ........ 23
      10.2.1 Operating System(s) ........ 23
      10.2.2 Communications Software ........ 23
      10.2.3 Security ........ 23
      10.2.4 Tools ........ 23
11. Responsibilities ........ 25
   11.1 Project Coordinator ........ 25
   11.2 Software Development Group ........ 25
   11.3 IV&V Team ........ 25
12. Staffing and Training ........ 26
   12.1 Staffing ........ 26
   12.2 Training ........ 26
13. Schedule ........ 27
14. Risk and Contingencies ........ 28


1. Purpose and Scope This document provides the SIVAT Test Plan for the UFTR Reactor Protection System (RPS) application software. SIVAT testing is performed prior to Factory Acceptance Testing (FAT), in order to verify that the functional requirements of the Software Requirements Specification (SRS) and the software design in the Software Design Description (SDD) are properly implemented in the SPACE application. For more information about SIVAT, refer to the SIVAT User Manual, /12/.

The introduction of the test plan shall provide an overview of the entire testing process. It shall at a minimum list the following plan objectives:

1. To define the software test items to be tested.
2. To detail the approach required to prepare for and conduct SIVAT testing.
3. To define the resources needed to perform the testing.
4. To communicate to all responsible parties the tasks that they are to perform and the schedule to be followed in performing the tasks.
5. To define the test tools and environment needed to conduct SIVAT testing.
6. To describe the acceptance (pass/fail) criteria for the Software Test Items being tested.

The form and content of this document follows IEEE Std. 829-1983, /14/, endorsed by RG 1.170, /16/. The methods for planning, preparation, execution, and evaluation of the SIVAT test follow IEEE Std. 1008-1987, /15/, endorsed by RG 1.171, /17/. This plan and additional SIVAT testing documents shall comply with the UFTR "Quality Assurance Program (QAP)," /1/, UFTR "Conduct of Quality Assurance," /2/, UFTR "Quality Assurance Project Plan (QAPP)," /3/, UFTR "Software Quality Assurance Plan (SQAP)," /4/, UFTR "Software Configuration Management Plan (SCMP)," /5/, and UFTR "Software Verification and Validation Plan (SVVP)," /6/, as applicable.


2. Reference

2.1 UFTR Documents

/1/ UFTR-QAP, "Quality Assurance Program (QAP)"

/2/ UFTR-QAP-01-P, "Conduct of Quality Assurance"

/3/ UFTR QA1-QAPP, "Quality Assurance Project Plan (QAPP)"

/4/ UFTR-QA1-01, "Software Quality Assurance Plan (SQAP)"

/5/ UFTR-QA1-02, "Software Configuration Management Plan (SCMP)"

/6/ UFTR-QA1-03, "Software Verification and Validation Plan (SVVP)"

/7/ UFTR-QA1-05, "Software Safety Plan (SSP)"

/8/ UFTR-QA1-109, "Software Library and Control"

/9/ UFTR-QA1-100, "Functional Requirements Specification (FRS)"

/10/ UFTR Technical Specifications

2.2 AREVA NP Inc. Documents

/11/ AREVA NP Inc., 43-10272, "Software Program Manual, TELEPERM XS Safety Systems"

/12/ AREVA NP Inc., 01-5044046-01, "TELEPERM XS SIVAT-TXS Simulation Based Validation Tool User Manual TXS-1047-76-2.1 (Version 1.5.0 and Higher)"

/13/ AREVA NP Inc. Document No. 38-9033245-000, "Safety Evaluation by the Office of Nuclear Reactor Regulation, Siemens Power Corporation Topical Report EMF-2110(NP), 'TELEPERM XS: A Digital Reactor Protection System,' Project No. 702"

2.3 Industry Standards

/14/ IEEE Std 829-1983, "IEEE Standard for Software Test Documentation"

/15/ IEEE Std 1008-1987, "IEEE Standard for Software Unit Testing"

2.4 NRC Documents

/16/ Regulatory Guide 1.170, Rev. 0, September 1997, "Software Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants"

/17/ Regulatory Guide 1.171, Rev. 0, September 1997, "Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants."


3. Abbreviations and Acronyms

CATS-SDE - Code Adaptation Tool for SDE Simulation Environment
CRC - Cyclic Redundancy Check
FAT - Factory Acceptance Test
AREVA GmbH - AREVA Germany Division
ID - Identification
I/O - Input/Output
LSSS - Limited Safety System Settings
MSI - Monitoring and Service Interface
NRC - Nuclear Regulatory Commission
Oracle - Database Management System Used by SPACE
PC - Personal Computer
RCS - Reactor Coolant System
RPS - Reactor Protection System
RTE - Run Time Environment
SDD - Software Design Description
SDE - Simulation Development Environment (used by SIVAT)
SER - Safety Evaluation Report
SIVAT - Simulation Based Validation Tool
SPACE - Specification And Coding Environment (TXS engineering system)
SRS - Software Requirements Specification
TXS - TELEPERM XS
V&V - Verification & Validation


4. Test Items

The Test Item is the UFTR RPS Application Software that is compiled and linked using the SIVAT simulation environment of the TELEPERM XS (TXS) system.

The Test Item is created using the SIVAT-Tool CATS-SDE, /12/. Sources for this simulation environment are the SPACE project database, the simulation software (SDE), and the results of the code generation.

All software modules that make up UFTR RPS Application Software will be tested.

SIVAT testing as described in this plan will verify the correct implementation of all the functions and requirements, as described in the SRS. Additionally, SIVAT testing will verify the software design in the SDD was properly implemented in the SPACE project database as shown in the Application Software Code document.

The SPACE project database (i.e., Application Software) is identified by a version number and release date as described in UFTR "Software Library and Control," /8/. The code configuration is documented and checked through the use of the Cyclic Redundancy Check (CRC) checksums of the database tables.

4.1 Functions to be Tested

The following RPS Application Software functions adapted from the Limited Safety System Settings (LSSS) shall be tested per UFTR "Functional Requirements Specification (FRS)," /9/:

1. RPS Trip #1: Reactor period < 3 sec
2. RPS Trip #2: Reactor Power > 119% of full power
3. RPS Trip #3: Loss of chamber high voltage (> 10%)
4. RPS Trip #4: Loss of electrical power to control console
5. RPS Trip #5: Primary cooling system

- Loss of primary pump power

- Low water level in core (<42.5")

- No outlet flow

- Low inlet water flow (< 41 gpm)

6. RPS Trip #6: Secondary cooling system (> 1kW)

- Loss of flow (well water ≤ 60 gpm)

- Loss of secondary well pump power

7. RPS Trip #7: High primary coolant inlet temperature (≥ 99°F)
8. RPS Trip #8: High primary coolant outlet temperature (> 155°F)
9. RPS Trip #9: Shield tank low water level (6" below established normal level)
10. RPS Trip #10: Ventilation system

- Loss of power to stack dilution fan

- Loss of power to core vent fan

Justification for the Limited Safety System Settings (LSSS) is established in the UFTR Technical Specifications, /10/. The LSSS are established from operating experience and safety considerations.

NOTE: The Functions listed above include any associated supporting software modules.
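The trip functions above reduce to setpoint comparisons that the Test Specifications pair with expected results. The following sketch is illustrative only: it is not SIVAT command syntax, and the function and signal names are hypothetical; it simply shows how boundary-spanning sample points for RPS Trip #2 (power > 119% of full power) could be mapped to expected trip states.

```python
# Illustrative only: hypothetical names, generic setpoint check (not SIVAT syntax).

def power_trip_expected(power_percent: float, setpoint: float = 119.0) -> bool:
    """Expected RPS Trip #2 state for a given indicated power (% of full power)."""
    return power_percent > setpoint

# Sample points spanning the trip boundary, as a test case might specify them.
for power in (100.0, 118.9, 119.0, 119.1, 125.0):
    print(f"power = {power:6.1f} %  ->  trip expected: {power_trip_expected(power)}")
```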

4.2 Features to be Tested

The Application Software functionality that is specified in the RPS UFTR SDD will be tested to determine if software elements (for example: modules, submodules, and functions) correctly implement software requirements. As a minimum, the criteria for this determination are:

1. Compliance with functional requirements.
2. Performance at boundaries, interfaces, and under stress and error conditions.

In addition to the proper functionality of the Application Software according to the RPS UFTR SDD, the following characteristics must be verified:

1. Signals to output boards must have no fault status at all times, even under error and stress conditions.
2. Test results must be verified from start of test until the completion of the test in order to verify that no unexpected intermediate results are present.
3. Race conditions must be handled in order to ensure spurious alarms are not generated by the software.

NOTE: UFTR-specific RPS Application Software versions that have been fully tested with SIVAT do not need to be fully retested when the Application Software version is updated. The Application Software will only be tested for differences in logic between the two versions. Signal names, clear text descriptions, parameter changes, and Function Diagram name changes of the version update can be verified via visual inspection.

4.3 Features Not to be Tested

The testing of TXS system software components (Operating System, I/O Drivers, Communication Software, RTE, and Function Blocks) is not within the scope of this test.

These system software components are considered "reusable" software and have been tested and qualified by AREVA NP GmbH. These system software components were previously deemed acceptable by the NRC within the scope described in TXS SER (Safety Evaluation Report), /13/.


5. Approach

5.1 Test Specification

The test personnel will use the software design defined in the SDD to prepare the Test Specifications for each Input, Function, and Output Module or Submodule. The Test Specifications shall ensure that all functionality as defined in the SDD is tested. This shall be accomplished by specifying test cases that test the combinational logic of the individual software modules and submodules (Input, Function, and Output). To ensure that the boundaries and interfaces of the software modules are fully tested, overlapping tests shall be used.

The Test Specifications are to consist of test design and test cases that contain the detailed instructions for testing as well as the features or test case acceptance criteria (i.e., Expected Results) to be employed during the testing effort.

The inputs, tasks, and outputs of SIVAT testing shall be determined, refined, and designed in this phase. The format and outline of the Test Specifications are described in Section 8.1.

5.2 Test Procedure

The Test Procedures shall contain test scripts that implement the test cases defined in the Test Specifications. These test scripts shall be created by utilizing the SDD and the Application Software Code document. For further information regarding the form and function of the commands that are to make up the test scripts, refer to the SIVAT User Manual, /12/.

The Test Procedure shall also verify the version/revision of the UFTR RPS Application Software, SIVAT Software and test scripts that are to be tested. Verification shall be done by comparing the CRC Checksums between the versions entered into the Software Library against the version specified in the Test Procedure.
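As a concrete illustration of this comparison, the sketch below computes the CRC-32 checksum of a checked-out file and compares it against the value recorded for the library version. This is a minimal Python sketch using the standard zlib CRC-32; the actual checksums used by SPACE/SIVAT are tool-specific, and the file name and expected value shown here are hypothetical.

```python
import zlib

def crc32_of_file(path: str) -> str:
    """Return the CRC-32 checksum of a file as an 8-digit hex string."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            crc = zlib.crc32(chunk, crc)
    return f"{crc & 0xFFFFFFFF:08X}"

# Hypothetical example: compare a checked-out test script against the checksum
# recorded in the Software Library and specified in the Test Procedure.
expected_crc = "1A2B3C4D"                         # value from the Test Procedure (hypothetical)
actual_crc = crc32_of_file("rps_trip2_test.tcl")  # checked-out script (hypothetical name)
if actual_crc != expected_crc:
    raise RuntimeError(f"CRC mismatch: expected {expected_crc}, got {actual_crc}")
```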

The format and outline of the Test Procedure is described in Section 8.2.

5.3 Test Preparation

The Test Engineer shall check out the following software from the Software Library (UFTR "Software Library and Control," /8/):

" The UFTR RPS Application Software

" SWAT test scripts ( that correspond to the specified Test Procedure)

The Test Engineer shall perform a pre-job brief.

SIVAT testing shall be performed after the approved software design has been implemented in the SPACE project database and the Application Software has been generated for the SIVAT simulation without any error. Code generation is performed for two reasons:

1. The code generator will check the SPACE application software database for syntax completeness and testability.


2. The code generator will prepare the code for SIVAT simulation.

5.4 Test Execution

The test tasks should be executed in the sequence of:

1. Input Submodule Tests
2. Function Module Tests
3. Output Submodule Tests

NOTE: The Signal Monitoring Submodules shall be tested together with their corresponding functions and modules.

In the Test Procedures, the SIVAT Simulation commands shall be used to form scripts. These scripts are to be used to completely automate the test execution by using the StartSim.tcl configuration file within the SIVAT-Tool. This file can be modified by the user to specify the order in which the scripts are to be run. The location and names of the script files must be entered into the StartSim.tcl file. The StartSim.tcl file is automatically called when the simulator is started. SIVAT runs the test scripts and then terminates the simulation. For further information refer to the TXS Simulation Based Validation Tool (SIVAT) User Manual, /12/.

The automated test scripts shall generate data files capturing the results of the test runs. The data files should be entered in the Software Library in accordance with UFTR "Software Library and Control," /8/. The data contained in the data files shall then be plotted using the SIVAT plot conversion tool. These plots are reviewed against the expected results; discrepancies shall be logged for disposition by the IV&V Group and captured in the Test Incident Report (see Section 8.4). Additionally, the Test Logs (see Section 8.3) and a summary of the executed tests, including discrepancies affecting the software design and implementation, shall be documented in the Test Summary Report (see Section 8.5).
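To make the review step concrete, the sketch below compares recorded result samples against expected values within a tolerance and collects mismatches for logging. It is a hedged Python illustration, not the SIVAT plot conversion tool or its data format; the data shape, signal names, and tolerance are assumptions.

```python
# Hedged illustration of the results-vs-expected comparison (not SIVAT tooling).
# Assumed data shape: {signal name: list of (time, value) samples}.

expected = {"RPS_TRIP_2": [(0.0, 0.0), (1.0, 0.0), (2.0, 1.0)]}  # from the Test Specification (hypothetical)
recorded = {"RPS_TRIP_2": [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]}  # from a SIVAT data file (hypothetical)

TOLERANCE = 1e-6  # assumed acceptance band for analog values

discrepancies = []
for signal, exp_samples in expected.items():
    rec_samples = dict(recorded.get(signal, []))
    for t, exp_value in exp_samples:
        rec_value = rec_samples.get(t)
        if rec_value is None or abs(rec_value - exp_value) > TOLERANCE:
            discrepancies.append((signal, t, exp_value, rec_value))

# Each discrepancy would be logged and dispositioned per Sections 7 and 8.4.
for signal, t, exp_value, rec_value in discrepancies:
    print(f"{signal} at t={t}s: expected {exp_value}, recorded {rec_value}")
```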

5.4.1 Test of Input Submodules

Input Submodules read the input information from the field device(s), perform signal validation and failure annunciation functions, and distribute the input signals to the I&C Function(s).

These tests verify the correct data is sent to the related I&C Function(s).

5.4.2 Test of I&C Functions

I&C Functions obtain data from the Input Submodule(s), then perform the logic functions and send actuation signals to the Output Submodule(s) or actuation device(s).

These tests verify the implemented logic and the correct output to the annunciators, actuators, and to the related Output Submodule(s).

5.4.3 Test of Output Submodules

Output Submodules obtain the outputs from the I&C Functions and then send the actuation signals to the actuation devices. Output Submodules also process checkback signals from the field device and provide component test logics.

These tests verify the implemented logic, the actuation of the field device, and the correct status of the actuated field devices.

5.5 Comprehensiveness

The SIVAT Test Plan shall ensure that the functional requirements of the software design detailed in the SDD are properly implemented in the SPACE application. The comprehensiveness of the testing effort shall ensure that all functionality as defined in the SDD is tested.


6. Item Pass/Fail Criteria

The Test Item shall be considered successfully passed when the results of the test match the predicted results described in the Test Specification without unexpected intermediate results.

Any Test Item containing unexpected results can still be considered successfully passed if:

1. The disposition of the unexpected result determines that the UFTR Application Software is functioning correctly.
2. Disposition/justification of the item is documented and preserved in the Test Incident Report for future reference.

Under these conditions, a retest of the item shall not be necessary.

The Test Item shall be considered failed if the test script has a syntax error that prevents the script from running, or if the test script or the Test Specification is found to be in error (i.e., the results of the test do not match the predicted results described in the Test Specification). Any errors encountered while performing the test shall be documented in the Test Log and Test Incident Report.
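A hedged sketch of the resulting decision logic is shown below; the function and argument names are hypothetical and only restate the criteria above in code form.

```python
# Hypothetical encoding of the Section 6 pass/fail criteria; names are illustrative.

def item_verdict(script_ran: bool, matches_predicted: bool,
                 unexpected_results: bool, dispositioned_as_correct: bool,
                 documented_in_tir: bool) -> str:
    if not script_ran:
        return "fail"  # syntax error prevented the script from running
    if matches_predicted and not unexpected_results:
        return "pass"
    if unexpected_results and dispositioned_as_correct and documented_in_tir:
        return "pass (no retest required)"  # disposition shows the software is correct, documented in the TIR
    return "fail"      # test script or Test Specification in error

print(item_verdict(script_ran=True, matches_predicted=True,
                   unexpected_results=True, dispositioned_as_correct=True,
                   documented_in_tir=True))
```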


7. Suspension Criteria and Resumption Requirements

If a discrepancy is found during test execution, the error shall be documented in the Test Log and the Test Incident Report and, where possible, the automation is continued. A disposition of the discrepancies logged shall determine whether the discrepancy affects the Test Specification, Test Procedure, SDD, or the Application Software.

If a discrepancy is found while comparing the plot data to the expected results, the discrepancy shall be recorded and dispositioned by the Hardware & Testing and Development Groups. The discrepancy shall be recorded in the Test Incident Report and the disposition of the results continues.

When a discrepancy is detected that affects the SDD or the Application Software, an "Open Item" shall be created and corrected following the methods described in UFTR-QAP-01-P, "Conduct of Quality Assurance," /2/.

During the review of the test results, all discrepancies shall be recorded in the Test Incident Report as described in Section 8.4.

Test reruns shall start after the changes to the SDD and Application Software have been implemented and the Test Specifications and Test Procedures have been updated to the new design. Test reruns shall be performed on all sections of the Test Specification deemed necessary and recorded in the Test Incident Report and the Test Summary Report.


8. Test Deliverables

The following types of documents shall be generated:
1. Test Specifications
2. Test Procedures
3. Test Log (Attached to Test Summary Report)
4. Test Incident Report
5. Test Summary Report

All documents shall be prepared, reviewed, and approved in accordance with the UFTR QAP, /1/, and UFTR-QAP-01-P, "Conduct of Quality Assurance," /2/.

8.1 Test Specification

The Test Specification shall specify refinements of the test approach and identify the features to be tested by creating test cases. It also may be produced for subsets of the complete UFTR RPS Application Software.

The outline and content of the Test Specification shall have the following structure:

- Test Specification Identifier The Test Specification shall be identified by a UFTR Document ID number.

- Features to Be Tested Identify the components of the Test Item and describe the features and combination of features that are the object of this design specification. Other features may be exercised but need not be identified. For each feature or feature combination, a reference to its associated requirements in the requirements specification or design specification should be included.

- Approach Refinements Specify refinements to the approach described in the Test Plan. Include specific test techniques to be used. The method of analyzing test results shall be identified. Specify the results of any analysis that provides a rationale for test case selection. Summarize the common attributes of any test cases. This may include:

  • Input constraints that must be true for every input in the set of associated test cases
  • Shared environmental needs

" Shared special procedural requirements

  • Shared case dependencies

- Test Identification List the identifier of each test case associated with this design.


- Pass/Fail Criteria Specify the criteria to be used to determine whether the feature or feature combination has passed or failed.

- Test Items Identify the components of the Test Item and features to be exercised by this test case.

For each item consider supplying references to the following Test Item documentation:

" Software Design Description

  • Application Software Code document

- Input Specifications Specify each input required to execute the test case. Some of the inputs will be specified by value (with tolerances where appropriate) while others will be specified by name.

Specify all required relationships between inputs.

- Output Specifications Specify all the outputs required of the Test Items (i.e., expected results). Provide exact value (with tolerances where appropriate) for each required output.

- Environmental Needs

" Hardware Specify the characteristics and configurations of the hardware required to execute this test case.

  • Software Specify the system and application software required to execute this test case.

This may include software such as operating systems, compilers, simulators, and test tools. In addition, test items may interact with application software.

" Other Specify any other requirements such as unique facility needs or specially trained personnel.

- Special Procedural Requirements Describe any special constraints on the Test Procedures that execute the test case. These constraints may involve special set up, operator intervention, output determination procedures, and special wrap up.

- Intercase Dependencies List the identifiers of test cases that must be executed prior to this test case. Summarize the nature of dependencies.
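The input and output specification items above map naturally onto a structured test case record. The sketch below is a hypothetical Python rendering of such a record, not a SIVAT or IEEE 829 data format; the field names, signal names, and identifiers are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """Hypothetical test case record mirroring the Test Specification outline above."""
    identifier: str                       # test case ID (hypothetical numbering scheme)
    feature: str                          # feature under test, traced to the SDD
    inputs: dict                          # input signal name -> value
    expected_outputs: dict                # output signal name -> expected value
    tolerances: dict = field(default_factory=dict)   # per-output tolerance, where appropriate
    depends_on: list = field(default_factory=list)   # intercase dependencies

# Hypothetical example for the reactor period trip boundary.
tc = TestCase(
    identifier="TC-RPS-01-003",
    feature="RPS Trip #1: reactor period < 3 sec",
    inputs={"REACTOR_PERIOD_SEC": 2.9},
    expected_outputs={"RPS_TRIP_1": 1.0},
    tolerances={"RPS_TRIP_1": 0.0},
    depends_on=["TC-RPS-01-001"],
)
print(tc.identifier, tc.expected_outputs)
```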

8.2 Test Procedure

Test Procedures shall consist of steps identified by the respective Test Specification. The outline and content of the Test Procedure shall have the following structure:

- Test Procedure Identifier The Test Procedure shall be identified by a UFTR Document ID number.

- Purpose Describe the purpose of the Test Procedure.

- Special requirements Identify any special requirements that are necessary for the execution of the Test Procedure.

- Procedure Steps The procedure steps shall be documented as part of the attached test script.

" Log Describe any special methods or formats for logging the results of the test execution, the incidents observed, and any other events pertinent to the test.

  • Set Up Describe the sequence of actions necessary to prepare for the execution of the procedure.

" Start Describe the actions necessary to begin execution of the procedure.

" Proceed Describe any actions necessary during the execution of the procedure.

  • Measure Describe how the test measurements will be made.

" Shutdown Describe the actions necessary to suspend testing, when unscheduled events dictate.

  • Restart Identify any procedural restart points and describe the actions necessary to restart the procedure at each of these points.
  • Stop Describe the steps necessary to bring execution to an orderly halt.

" Wrap Up Describe the actions necessary to restore the environment.

" Contingencies Describe the actions necessary to deal with anomalous events that may occur during execution.


- Attachment For each Test Procedure, the test scripts shall be attached. The StartSim.tcl configuration file (see Section 8.3) shall also be attached to each Test Procedure. The test scripts shall be written using the SIVAT command syntax and shall contain the following information:

" Version identification of the test script

  • Test purpose (including Functions/Modules tested)

8.3 Test Log

The Test Log shall provide a chronological record of relevant details about the execution of tests. The Test Log shall be attached to the Test Summary Report. The outline and content of the Test Log shall have the following structure:

- Description Information that applies to all entries in the log, except as specifically noted in another log entry, shall be noted here. The test log shall:

" Identify the items being tested, including their version/revision levels.

" Identify the attributes of the environment under which testing is conducted, by including facility identification, hardware being used, and system software used.

- Activity and Event Entries For each event, including the beginning and end of activities, record the occurrence date and time along with the identity of the author. The following information shall be considered:

" Execution Description Record the identifier of the Test Procedure being executed and supply a reference to its specification. Record all personnel present during the pre-job brief and test execution including testers, operators, and observers. Also indicate the function of each individual.

" Procedure Results For each execution, record the visually observable results (for example, error messages generated, aborts, and requests for operator action). Also record the location of any output. Record the successful or unsuccessful execution of the test.

  • Environmental Information Record any environmental conditions specific to this entry (for example, hardware substitutions).
  • Anomalous Events Record what happened before and after unexpected events. Record the circumstances surrounding the inability to begin execution of a Test Procedure or failure to complete a Test Procedure.

  • Incident Report Identifiers Record the identifier of each Test Incident Report whenever one is generated.

8.4 Test Incident Report

The Test Incident Report shall document any event that occurs during the testing process that requires investigation. The outline and content of the Test Incident Report shall have the following information:

- Test Incident Report Identifier The Test Incident Report shall be identified by a UFTR Document ID number.

- Summary Summarize the incident. Identify the Test Items involved, indicating their version/revision level. References to the appropriate Test Specifications or Test Procedures shall be supplied.

- Incident Description Provide a description of the incident. Provide the name of the person that identified the incident and the date that the incident was identified. The incident description shall use the discrepancy resolution process defined in Section 5.4 of this document.

- Impact If known, indicate what impact this incident will have on Test Plans, Test Specifications, Test Procedures, SDD, or Application Software.

8.5 Test Summary Report

The Test Summary Report shall contain test summary information. Some or all of the content of a section may be included by reference to the document that contains the information. The outline and content of the Test Summary Report shall have the following structure for each test run:

- Test Summary Report Identifier The Test Summary Report shall be identified by a UFTR Document ID number.

- Test Summary Summarize the evaluation of the test items. Identify the items tested, indicating their version/revision level. Indicate the environment in which testing activities took place.

For each test item provide references to the following documents if they exist:

" Test Plans

" Test Specifications

" Test Procedures

" Test Incident Reports


  • Test Logs

- Variances Report any variances of the Test Items from their design specifications. Indicate any variances from the Test Plan, Test Specifications, or Test Procedures. Specify the reason for each variance.

- Comprehensive Assessment Evaluate the comprehensiveness of the testing process against the comprehensiveness criteria in the Test Plan if it exists. Identify features or feature combinations that were not sufficiently tested and explain the reasons.

- Summary of Results Summarize the results of the testing. Identify all resolved incidents and summarize their resolutions (i.e., reference to the Test Incident Report). Identify all unresolved incidents.

- Evaluation Provide an overall evaluation of each test item including its limitations. The evaluation must be based on the test results and the item level pass/fail criteria. An estimate of failure risk may be included.

- Summary of Activities Summarize the major testing activities and events referencing the Test Log.


9. Testing Tasks

The following steps shall be executed in sequence to verify that the functional requirements of the SRS and the software design in the SDD are properly implemented in the SPACE application:

" Prepare the Test Specification (with expected results) in accordance with the software specification documents.

" Prepare the Test Procedure (test scripts) in accordance with the software specification documents.

" Checkout applicable Software (i.e., Application Software & test scripts) from the Software Library.

" Complete pre-job brief prior to running the Test Procedures.

" Run the SIVAT Test Procedures for the appropriate Test Item.

" Plot the test results using the SIVAT plot conversion tool

  • Compare the test results against the expected results listed in the Test Specification to check for implementation correctness.

If discrepancies are identified by the Independent Verification & Validation (IV&V) Group, the steps as described in Section 7 shall be followed for corrective actions.

The procedures for change control are defined in UFTR SCMP, /5/.


10. Environmental Needs

The physical characteristics of the hardware shall be provided in sufficient detail to ensure the simulation tool (SIVAT) and the Application Software tests can be executed as desired. Also specify the level of security that must be provided for the test facilities, system software, and proprietary components such as software, data, and hardware. Identify any other testing needs. Identify the source for all needs that are not currently available to the test group. Identify any special test tools or supporting software packages needed.

10.1 Hardware

The computers used for the PC clients/Engineering Servers shall have the following minimum performance capabilities:

" Minimum clock frequency of the processor: 600M1-lz

  • Minimum graphics card/monitor resolution: 1280 x 1024 pixels

" Minimum memory capacity: 512 MB

" Minimum hard disk capacity: 20GB

" Monitor, Mouse (PS/2), Keyboard (PS/2) ports

  • USB port

10.2 Software

10.2.1 Operating System(s)

For the Engineering Server, the operating system shall be SuSE Linux 8.2.2 or higher. For the PC clients, the operating system shall be able to support a network configuration and use network connection interface software to connect to the Engineering Server.

10.2.2 Communications Software

For the PC client, network connection interface software (i.e., Exceed 8.0.0.0 or higher) is required for multiple users to gain remote access to the Engineering Server.

10.2.3 Security

A password-protected login to the Engineering Server is used by the Test Engineer to perform the test. Software versions are controlled during the test by using the process defined in UFTR "Software Library and Control," /8/.

10.2.4 Tools

The following software tools are required to be installed on the Engineering Server prior to starting testing:

1. Oracle Database version 1.3.1
2. SPACE version 3.0.7a
3. SIVAT version 1.5.3


4. SDE version 1.2.3
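A minimal sketch of how this prerequisite check could be recorded and automated is shown below; only the required versions come from the list above, and the version-detection function is a stub (in practice each tool reports its version through its own interface).

```python
# Hypothetical pre-test environment check; detection of installed versions is stubbed.
REQUIRED_TOOLS = {
    "Oracle Database": "1.3.1",
    "SPACE": "3.0.7a",
    "SIVAT": "1.5.3",
    "SDE": "1.2.3",
}

def installed_version(tool: str) -> str:
    """Stub: in practice each tool reports its version via its own interface."""
    return REQUIRED_TOOLS[tool]  # stubbed so the sketch runs as-is

for tool, required in REQUIRED_TOOLS.items():
    found = installed_version(tool)
    status = "OK" if found == required else f"MISMATCH (found {found})"
    print(f"{tool:16s} required {required:8s} -> {status}")
```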


11. Responsibilities

The overall project team organization and responsibilities for the design activities for the RPS project are defined in UFTR QAPP, /3/. The individuals and groups identified in the following sections will be involved in SIVAT testing.

11.1 Project Coordinator

The Project Coordinator controls the overall testing process and makes all decisions relevant to project personnel and assignments (including delegation of responsibilities). The Project Coordinator also acts as the interface to other groups for deviation management and documentation version control.

11.2 Software Development Group

The Software Development Group is responsible for the complete software design and the software code generation using the TXS engineering tool SPACE. The Software Development Group is also responsible for the identification and control of software Configuration Items. All modifications to the software design and software code in all phases are performed by this group.

11.3 IV&V Team

The IV&V Team uses the SIVAT simulation tool for software testing in order to enhance the quality of the software design prior to the FAT effort.

The Test Administrator is designated by the Project Coordinator to oversee the testing process of each Test Item. This includes preparing the pre-job brief and answering any questions the Test Engineer has during the execution of the Test Item.

The Test Engineer(s) prepares and executes the test. The Test Engineer(s) also prepares the data, in the form of plots, which are reviewed independently for identification of incidents. The IV&V Group is responsible for creating the Test Incident Report and the Test Summary Report.

The IV&V Team is responsible for the independent V&V during the performance of all phases of the software life cycle described in UFTR SVVP, /6/, including this plan.


12. Staffing and Training 12.1 Staffing The following staff is needed to carry out this testing project:
1. Project Coordinator
2. Test Administrator
3. Test Engineer(s)

Per Regulatory Guide 1.171, /17/, requirement (noted below), the IV&V personnel should be independent from the Software Development Group. This means that the design engineer(s) shall not be a participant in the Test Plan, Test Specifications, or Test Summary Report development unless an independent review of these documents is performed. Also, the design engineer(s) shall not be involved in the testing unless an independent review of the test results is performed.

12.2 Training

The Test Engineer(s) shall be trained in the use of the SPACE system and the SIVAT testing tool.

The test personnel shall receive the following training prior to performing Application Software testing:

1. Software Configuration Management Plan, /5/
2. Software Verification and Validation Plan, /6/
3. Software Library and Control, /8/
4. Software Safety Plan, /7/
5. Open Item Process
6. TXS Software
7. Quality Assurance Training


13. Schedule

The SIVAT Test Plan and implementing activities are governed by UFTR QAPP, /3/, and SVVP, /6/.


14. Risk and Contingencies

SIVAT testing may not begin if there are Design Open Items to be resolved, unless a conscious project decision is made by Project Management. This project decision must be identified and recorded in the Test Log with sign-offs by Project Management.

Design Open Items that are known at the time of the test start are to be recorded in the Test Log and are to be clearly identified in the Test Summary Report. Actions must be taken to ensure that these Design Open Items are resolved and tested before FAT commences.