ML20078P110

Person / Time
Site: Clinton
Issue date: 10/19/1983
From: Hall D., Illinois Power Co.
Shared Package: ML20078P098
References: RTR-NUREG-0737, RTR-NUREG-737, PROC-831019, NUDOCS 8311030199
CLINTON POWER STATION
VERIFICATION AND VALIDATION PLAN
FOR SAFETY PARAMETER DISPLAY SYSTEM

Recommended Approval: [signature]    Power Plant Manager               Date
Recommended Approval: [signature]    Manager, Nuclear Station Engrg.   Date
Approved:             [signature]    Vice President                    Date
TABLE OF CONTENTS

1.0 Purpose
2.0 Document Approval
3.0 Program Overview
    3.1 Verification
    3.2 Validation
    3.3 Verification and Validation Activities
        3.3.1 SPDS Requirements Review
        3.3.2 SPDS Design Review
        3.3.3 Validation Test
        3.3.4 Field Verification Test
        3.3.5 Validation Report
4.0 Responsibilities
    4.1 Validation and Verification Activity Team Members
    4.2 Program Coordination and Control
5.0 System Requirements Review
    5.1 General
    5.2 Purpose
    5.3 Requirements Review Overview
    5.4 Requirements Review Activities
        5.4.1 Requirements Matrix
        5.4.2 Source Document Review
        5.4.3 V&V Plan Consistency Assessment
        5.4.4 SPDS Requirements Review
        5.4.5 Testing Requirements Review
        5.4.6 Human Factors Requirements Review
        5.4.7 Requirements Review Report
    5.5 Subsequent Changes
6.0 System Design Review
    6.1 General
    6.2 Purpose
    6.3 Design Review Overview
    6.4 Materials Required
    6.5 Design Review Activities
        6.5.1 Design vs. Requirements Consistency
        6.5.2 Evaluation/Analysis of Documentation
        6.5.3 Data Interface Design Review
        6.5.4 Performance Assessment
        6.5.5 Design Specific Test Information
        6.5.6 Design Verification Report
    6.6 Subsequent Changes
7.0 Validation Test
    7.1 Overview
    7.2 Validation Test Plan
        7.2.1 General
        7.2.2 Test Case Definition
        7.2.3 Test Environment Definition
        7.2.4 Format
    7.3 Validation Test Execution and Results Analysis
8.0 Field Verification Test
    8.1 Objective
    8.2 Purpose
    8.3 Test Plan and Report
9.0 Validation Report
10.0 References
1.0 PURPOSE

The purpose of this document is to specify the requirements and procedures for the implementation of Safety Parameter Display System (SPDS) Validation and Verification (V&V) at Clinton Power Station. This activity is performed to meet the SPDS V&V requirements of NUREG-0737, Supplement 1.
2.0 DOCUMENT APPROVAL

The Validation and Verification Program is the method through which Illinois Power Company will assure the implementation of an SPDS which is a useful and effective tool for plant operators. Therefore, it is vital that the contents of this document be approved by the appropriate levels of management. Each revision of this document shall be a complete reissue bearing a title page with recommendations for approval by the CPS Power Plant Manager and the Manager of Nuclear Station Engineering. The signature of the Vice President shall indicate formal approval of this document's contents.
3.0 PROGRAM OVERVIEW

3.1 Verification

Verification is the review of the requirements to see that the right problem is being solved, and then the review of the design to see that it meets the requirements. Thus, verification is based on the communication of project activities and results. The most frequent means of communication is through documentation. Documentation creates a traceable and systematic approach to the development of a product. The translation of information from one phase of development to the next must be sufficient to be understood by knowledgeable persons other than the originator. If it is determined that an accurate translation has been performed, then that step has been verified. Verification activities do not replace industry practices, compliance with industry standards, design guidelines, or other project management or configuration management activities.
3.2 Validation

Validation is the test and evaluation of the integrated hardware and software system to determine compliance with the functional, performance, and interface requirements. Thus, the validation process provides an overall assurance that the capabilities specified in the system requirements are implemented in the hardware and software. Another goal is to ensure that problems or potential deficiencies that may have occurred during development have been corrected.
3.3 Validation and Verification Activities

3.3.1 SPDS Requirements Review

The SPDS Requirements are the foundation on which the completed system must be designed, built, and accepted. This is perhaps the most important activity of the V&V Team. The completed system is validated against the SPDS Requirements. The main objective of the independent SPDS Requirements Review is to determine if the requirements are unambiguous, correct, complete, consistent, feasible, and testable.
3.3.2 SPDS Design Review

The review of the hardware and software design is focused on determining if the design is a correct implementation of the requirements. Items such as system architecture, input/output interfaces, operating sequences, information flow, testability, human factors engineering, etc., will be reviewed.
The operational objective of the SPDS is to provide the control room operators with concise and unambiguous data which will characterize the overall plant safety status. Therefore, an objective is to review the bases for the selection of variables that comprise the SPDS parameter set and to apply methods of validation to ensure that the appropriate parameters have been chosen.
3.3.3 Validation Test

A V&V review of the SPDS Preoperational Phase Tests will be performed to determine if the completed system meets the requirements stated in the SPDS Requirements Document. Test plans and procedures will be prepared prior to validation testing. Test execution and results analysis complete the validation testing activity, with any identified discrepancies documented for resolution.
3.3.4 Field Verification Test

The objective of this activity is to verify that SPDS was properly installed. As a minimum, field verification will consist of verifying that each input is correctly connected and that the signal range is consistent with the design. Startup Checkout & Initial Operation (C&IO) Testing on SPDS may fulfill much of this testing.
3.3.5 Validation Report

The validation report is a summary document of the project V&V activities to aid in the traceability of the activities performed.
4.0 RESPONSIBILITIES

4.1 Validation and Verification Activity Team Members

Verification and Validation activities shall be performed by available personnel who are independent of those who participated in the design and development of the SPDS. The SPDS V&V Team shall be comprised of members from various disciplines that are familiar with CPS design in order to ensure maximum effectiveness and credibility.

The SPDS V&V Team Members are drawn from the departments listed below:

Department
NSED-Technical Assessment
CPS-Technical Staff
NSED-Construction Engineering
CPS-Startup
CPS-Operations
Sargent & Lundy Engineers

4.2 Program Coordination and Control

The SPDS V&V Team Leader is responsible for the overall implementation of the Validation and Verification program for SPDS. The Emergency Response Capabilities Implementation (ERCIP) Program coordinator is responsible for coordinating these activities with the remainder of ERCIP.
5.0 SYSTEM REQUIREMENTS REVIEW

5.1 General

The SPDS Requirements are the foundation on which the completed system must be designed, built, and accepted. The completed system must be validated against the SPDS Requirements Document.
5.2 Purpose

The principal goal of this activity is to determine if the requirements will result in a reasonable and usable solution to the entire problem. One objective is to determine if the right problem is being solved. The requirements are reviewed for correctness, consistency, understandability, feasibility, testability, and traceability. The Requirements Review also provides the basis for developing the SPDS Validation Test Plan.
5.3 Requirements Review Overview

The review will consist of an evaluation of the material in the requirements document as well as an identification of any deficiencies in the definition of the system. As a minimum, the review will evaluate the following items:
a. Complete and correct specification of the performance requirements and operational capabilities and concepts of the system. Determine if the right problem is being solved and if the requirements are consistent with emergency operating procedures (EOPs), etc.
b. Complete and correct system definition and interfaces with other equipment/systems.
c. Concise, correct and consistent description of the interfaces and performance characteristics of each major function.
d. Establishment of a reasonable and achievable set of test requirements. These requirements should be related to performance goals and also define acceptability criteria.
e. Definition of the necessary logistics, personnel, and training requirements and considerations.
f. Definition of input and output signals, and establishment and management of the data base.
g. Treatment of man/machine interface requirements.
h. Definition of subsystems and integration requirements.
One of the primary attributes that will be assessed is the testability of each requirement. The test requirements define the test objectives and enable the system to be validated after system integration. In addition to identifying the types of tests to be performed, acceptance criteria must also be defined. Specific design requirements to facilitate integration testing or installation testing should be identified in the requirements. It is realized that some requirements cannot be tested and/or acceptance criteria cannot be defined. This lack of definition should not be ignored but should be recognized in the requirements document.
5.4 Requirements Review Activities

5.4.1 Requirements Matrix

The first step in the SPDS Requirements Review will be the development of an SPDS Requirements Matrix for tracking progress through design into validation testing. This is simply a list of the requirements with columns used to indicate the portion of the SPDS Design Document or Validation Test Plan addressing the requirement.

The Requirements Matrix is a tool which enables cross-referencing of tests to the requirements. It also aids in identifying where a requirement is implemented and, if changes or failures occur, helps identify affected areas. A minimal sketch of one possible representation follows.
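As an illustration only (the plan prescribes a document, not software), the sketch below shows one hypothetical way such matrix entries could be represented. Every identifier and field name here is an assumption, not part of the CPS design:

```python
# Hypothetical sketch of a requirements matrix entry; the actual CPS
# matrix was a document, and these field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class RequirementEntry:
    req_id: str                  # identifier from the SPDS Requirements Document
    description: str             # short statement of the requirement
    design_sections: list = field(default_factory=list)  # SPDS Design Document sections addressing it
    test_cases: list = field(default_factory=list)       # Validation Test Plan cases covering it
    testable: bool = True        # False where no acceptance criteria can be defined

matrix = [
    RequirementEntry("REQ-001", "Display reactor water level (wide range)",
                     design_sections=["4.2.1"], test_cases=["VT-07"]),
    RequirementEntry("REQ-002", "Update all displayed parameters continuously"),
]

# A testable requirement with no test case is a coverage gap to resolve.
gaps = [e.req_id for e in matrix if e.testable and not e.test_cases]
print("Requirements lacking test coverage:", gaps)  # ['REQ-002']
```

Whatever its form, the matrix makes a testable requirement with no associated design section or test case surface immediately as a gap.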
5.4.2 Source Document Review

This activity is simply a review of the source documents (i.e., NUREGs, Reg. Guides, etc.) to determine if the requirements as specified in the SPDS Requirements Document are consistent with the source documents and are complete. Each source document should be specified, and an analysis of each requirement shall be performed to determine how the SPDS Requirements Document supports each requirement.
5.4.3 V&V Plan Consistency Assessment

This activity is a review of the SPDS Requirements Document to determine if the format and content are consistent with the performance of the Design Review, Validation Testing and Field Verification Testing as specified in this document.
5.4.4 SPDS Requirements Review

This activity is a review of the system requirements for the following attributes:

a. Is the right problem being solved?
b. Are the requirements consistent with EOPs?
c. Are the requirements complete and correct in system definition?
5.4.5 Testing Requirements Review

This activity is an analysis of the testability of the requirements and a determination of how well the acceptance criteria are defined. This review will determine what testing will be needed and how it will be judged.
5.4.6 Human Factors Requirements Review

This activity analyzes how well human factors considerations have been addressed to ensure operators will accept and use SPDS.

5.4.7 Requirements Review Report

The results of the SPDS Requirements Review will be documented. The outline for this report is identified in Table 1. Actual contents of this report will be appropriately categorized when the report is written.
5.5 Subsequent Changes

Revisions that are made to the SPDS Requirements Document subsequent to the completion of the Requirements Review require an independent review to ensure the change does not invalidate the verification process. A summary of this review will be appended to the Requirements Review Report for each revision of the SPDS Requirements Document.
6.0 SYSTEM DESIGN REVIEW

6.1 General

The objective of this activity is the verification of the hardware and software design against the SPDS Requirements. This review covers both the hardware and software specifications as well as the design. The SPDS Design Document should be supplemented by an SPDS Hardware Design Specification and an SPDS Software Design Specification during normal system development. The specifications and design description may be integrated into a single document as a result of iteration in items such as display formats, data structures, hardware interfaces, etc.
6.2 Purpose

The purpose of the Design Review is to examine the design of the software and hardware to ensure it is complete and that there are no ambiguities or deficiencies.
6.3 Design Review Overview

This V&V activity examines the design of the software in terms of its logical integrity, ability to satisfy the performance requirements, data manipulations, and timing requirements. There are many other items to be considered, such as operator response, hardware interface design, data base design, control structures, task allocation and programming languages. The design review will address the following items:
a. Architecture for both hardware and software.
b. Input/Output interfaces.
c. System and Executive Control.
d. Operating sequence - initialization, startup, error detection, restart, etc.
e. Testability - use of test equipment such as data tapes, simulations, etc.
f. Timing Analysis - sampling rates, response time, etc.
g. Availability of Hardware (i.e., Mean Time Between Failure, Mean Time to Repair).
h. Algorithm Design and Data Verification.
i. Information flow - communication between hardware subsystems, data management, and signal conversion into engineering units.
j. Human Factors Engineering
   - What analysis was performed to determine response to displays?
   - Has Reactor Operator input to format development been documented?
k. Definition of physical characteristics, reliability and maintainability objectives, operating environment, transportability constraints, and design and construction standards, including those intended for the software.
l. Are the right plant parameters being monitored?
6.4 Materials Required

The following items will be used as input to the Design Review and should be available and complete prior to the performance of this event:

a. SPDS Requirements Document
b. SPDS Design Document
c. SPDS Hardware Design Specification
d. SPDS Software Design Specification
e. SPDS Requirements Matrix (see Requirements Review)
f. SPDS Safety Analysis Report

6.5 Design Review Activities

This section outlines various activities which should be performed during the design review. It is not intended that this section limit the scope of activities performed by the assigned V&V team, but to give guidance and direction to their effort. Prior to the initiation of the Design Review, the V&V team should review the documents in Section 6.4 to ensure they are complete and approved, and to determine what additional inputs should be made available.
6.5.1 Design vs. Requirements Consistency

The SPDS Design Document (including the SPDS Hardware and Software Design Specifications) will contain much more detail than does the SPDS Requirements Document and will provide details on how the requirements are to be met. One of the key objectives of the Design Review is to determine that the design is consistent with the system requirements. Depending on the design documentation, one should correlate each requirement defined in the Requirements Review with the specific design feature. This correlation will be used to incorporate design specific test information in the Validation Test Plan. The Requirements Matrix should be used as a tool for the performance of this correlation.

6.5.2 Evaluation/Analysis of Documentation

The major method used in performing the Design Review is the evaluation and analysis of the documentation. A design walk-through will be used to supplement the design documentation, and such supplemental information shall be added to the formal documentation. Another analysis tool that should be utilized is the assessment of the design hierarchy from the top down. The objectives of the top-down analysis are the assessment of the adequacy of subsystem interfaces and the identification of potential deficiencies in meeting all the requirements.

6.5.3 Data Interface Design Review

This activity is a review of signal set point tables and similar essential design information, and of the methods established to ensure hardware changes are accurately reflected. This will include an analysis of the subsystem interfaces for completeness and accuracy.
6.5.4 Performance Assessment

Another key objective of the Design Review is the independent assessment of the ability of the design to meet performance requirements. Such capabilities as time response, availability, man/machine interface, data validation, operating environment, dynamic range, testability, etc., must be analyzed as part of the design documentation. The V&V team reviews this information for correctness, feasibility, and consistency. This portion of the Design Review includes an assessment of the pertinent human engineering aspects to determine the usability and learnability of the displays from an operator viewpoint. Some of the performance requirements may be difficult to meet (or prove), but the identification of these issues during design is more efficient than discovering them during validation testing.
6.5.5 Design Specific Test Information

Another result of the Design Review is the identification of the design specific test information. This information is required for the development of the test objectives, test environment, and test procedures to be used during validation testing. The involvement of the V&V team in the Design Review ensures that the test plan can be efficiently developed.
6.5.6 Design Verification Report

The results of the design verification activity will be documented, with any deficiencies identified for resolution. An outline of the information to be provided in the Design Verification Report is given in Table 2. The actual contents of this report will be appropriately categorized when the report is written.
6.6 Subsequent Changes

Revisions that are made to the SPDS Design Document subsequent to the completion of the Design Verification Review require an independent review to ensure the change does not invalidate the verification process. A summary of this review will be appended to the Design Verification Review Report for each revision of the SPDS Design Document.
7.0 VALIDATION TEST

7.1 Overview

This activity is performed to demonstrate that the integrated system meets the requirements. There are two distinct subactivities: (1) Test Plan Development, and (2) Test Execution and Results Analysis. The foundation for this activity lies in the information derived from the Requirements Review, the Design Review, and the hardware, software, and system tests performed during acceptance and Startup testing. The SPDS Validation Tests follow the system integration tests (Preoperational Phase Tests) performed by Startup to demonstrate that the hardware and software function acceptably. The Preoperational Phase Tests may fulfill many of the requirements of the SPDS Validation Test.
7.2 Validation Test Plan

7.2.1 General

The most important step in this activity is the development of the Validation Test Plan. The purpose of the Validation Test is to demonstrate that the completed system meets all of the SPDS Requirements. Some requirements may not be testable; these were identified in the Requirements Review and specifically analyzed in the Design Verification. Test cases must be defined to demonstrate that each testable requirement has been met. Tests which require steady state input signals (static tests) may be used to test some requirements. On the other hand, dynamic tests which utilize time dependent input data may be needed to demonstrate that the system meets some of the requirements. The ability to utilize data input tapes or digital or analog simulations is one of the key requirements of the test equipment. Items to be covered in the Test Plan include:
- Test Requirements
- Test Philosophy
- Test Environment
- Test Specifications
- Detailed Test Descriptions
- Test Procedures
- Test Evaluation Approach
7.2.2 Test Case Definition

The Requirements Matrix developed during the Requirements Review will be used to define the test requirements. Based on each capability to be tested, static and/or dynamic tests will be specified; a sketch of one possible test case structure follows this subsection.

As part of test case identification, the Preoperational Phase Tests performed by Startup will be reviewed and the results independently analyzed. It is expected that some of the Preoperational Phase Test results will alleviate the need for independently performing the same or similar Validation Tests. However, the performance of tests by Startup or others does not eliminate the need for validation testing; it can only improve the efficiency of the validation effort.
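As a minimal sketch only, the following shows one hypothetical way a static and a dynamic test case could be recorded; all identifiers, signal names, and values are illustrative assumptions, not items from the CPS test plan:

```python
# Hypothetical test case structure; identifiers, signal names, and
# values are illustrative, not taken from the CPS test plan.
from dataclasses import dataclass

@dataclass
class TestCase:
    test_id: str      # unique test identifier
    req_ids: list     # requirements from the Requirements Matrix covered by this test
    kind: str         # "static" (steady state inputs) or "dynamic" (time dependent inputs)
    inputs: dict      # signal -> value (static) or signal -> [(time_s, value), ...] (dynamic)
    acceptance: str   # acceptance criterion stated in the test plan

static_case = TestCase(
    test_id="VT-07", req_ids=["REQ-001"], kind="static",
    inputs={"reactor_level_in": 35.0},  # hold a steady level signal
    acceptance="Displayed level agrees with the input within one display unit",
)

dynamic_case = TestCase(
    test_id="VT-12", req_ids=["REQ-002"], kind="dynamic",
    inputs={"reactor_level_in": [(0.0, 35.0), (5.0, 20.0), (10.0, 20.0)]},  # simulated transient
    acceptance="Display tracks the transient within the specified response time",
)
```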
7.2.3 Test Environment Definition

The definition of the test cases will define the test environment that is required. The test environment includes support hardware and software, special test interfaces and diagnostic equipment. Of special concern for the SPDS is the definition of dynamic test data to drive the displays. Compatibility between the integrated system and the test environment is one of the issues covered in the Design Review.

7.2.4 Format

A formal Test Plan will be prepared prior to Preoperational Phase Testing and modified as necessary based on Preoperational Phase Test results. Table 3 provides an outline of the contents of the Validation Test Plan and may be modified during plan development.
7.3 Validation Test Execution and Results Analysis

This activity includes validation testing, recording of test results, and the analysis of results for acceptability. Records must be kept during each validation test to ensure that the test is identified, and that the inputs and outputs are archived with sufficient detail so that the same test could be repeated by others and the results confirmed. A hypothetical sketch of such a record follows.
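The plan does not prescribe any particular record medium or format; purely as an illustration of the level of detail required for repeatability, a test record might capture:

```python
# Hypothetical validation test record, archived with enough detail
# that another team could repeat the test and confirm the results.
import json

record = {
    "test_id": "VT-07",                                # unique identifier for the test
    "executed": "1983-10-19",                          # date of execution (illustrative)
    "configuration": "SPDS integrated with DCS/PMS",   # system configuration tested (illustrative)
    "inputs": {"reactor_level_in": 35.0},              # archived test inputs
    "outputs": {"displayed_level_in": 35.2},           # archived test outputs
    "acceptance_met": True,
    "nonconformances": [],                             # discrepancies documented for resolution
}

# Writing the record out (here as JSON) preserves the inputs and
# outputs so the same test can be rerun and the results compared.
with open("VT-07_record.json", "w") as f:
    json.dump(record, f, indent=2)
```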
Particular attention must be paid to the analysis of acceptability of results during testing. The procedures defined in the test plan for making modifications to the hardware and software during validation testing to resolve nonconformances, and for the restart/continuation of tests, must be followed. Documentation that such procedures were followed must be maintained.
The test results, analysis, and nonconformances to acceptability criteria must be documented in the Validation Test Report. The outline of such a report is given in Table 4. The actual contents of this report will be appropriately categorized when the report is written.
8.0 FIELD VERIFICATION TEST

8.1 Objective

The objective of this activity is to verify that the validated system was properly installed. The installation procedures (e.g., C&IO Tests) will be reviewed to assess the completeness and thoroughness with which the system was installed and checked. An Installation Verification Test Plan will be prepared. The tests to be performed and any special test equipment will be defined in this plan.
8.2 Purpose

The purpose of the Field Verification Test is not to validate the system but to ensure its proper installation. As a minimum, field verification will consist of verifying that each input signal is properly connected and that the signal range is consistent with the design. It must be verified that the information displayed is directly correlated with the sensor data being input. It is expected that an independent review of Startup C&IO tests may fulfill a portion of the Installation Verification Test Plan. A sketch of one such range check appears below.
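As a minimal sketch only, assuming hypothetical signal names and design ranges (none of these figures come from the CPS design), a connection and range check of the kind described could look like:

```python
# Hypothetical field verification range check; the signal names and
# design ranges below are assumptions, not CPS design values.
DESIGN_RANGES = {
    "reactor_level_in": (-150.0, 60.0),      # inches, assumed wide-range span
    "drywell_pressure_psig": (-5.0, 15.0),   # assumed narrow-range span
}

def verify_input(name, reading):
    """Disposition one field-verified input signal."""
    if reading is None:
        return f"{name}: FAIL - no signal (check connection)"
    lo, hi = DESIGN_RANGES[name]
    if lo <= reading <= hi:
        return f"{name}: PASS - {reading} within design range [{lo}, {hi}]"
    return f"{name}: FAIL - {reading} outside design range [{lo}, {hi}]"

print(verify_input("reactor_level_in", 35.0))        # connected, in range
print(verify_input("drywell_pressure_psig", None))   # disconnected input
```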
8.3 Test Plan and Report

The Installation Verification Test Plan should identify the test philosophy, test methods, and test environment. For each test, the test inputs, test outputs, and acceptability criteria must be specified. The Installation Verification Test Plan is similar to the Validation Test Plan (see Table 3). Likewise, the Field Verification Test Report should contain the same type of information as that provided in the Validation Test Report.
9.0 VALIDATION REPORT

The purpose of this activity is to summarize the V&V activities performed throughout the project. Since no new information is presented, this report is based on project documentation generated previously. This document will provide the foundation for discussions on the scope and results of the V&V effort and will be prepared to aid the NRC in reviewing the adequacy of the system validation effort. Emphasis will be placed on summarizing the activities performed, and the findings and corrective actions which occurred, in the following V&V activities:
1. Requirements Review
2. Hardware and Software Design Review
3. Validation Test
4. Field Verification Test
Traceability of the verification and validation activities throughout the project, identification and resolution of discrepancies, and reference to detailed documentation will be provided in the Validation Report. Table 5 provides an outline of the contents of the Validation Report and may be modified as needed during plan development.
10.0 REFERENCES

10.1 E. A. Straker, et al., "Verification and Validation for Safety Parameter Display Systems," Nuclear Safety Analysis Center, NSAC-39, December 1982.

10.2 S. H. Saib, et al., "Validation of Real-Time Software for Nuclear Plant Safety Applications," Electric Power Research Institute, NP-2646, November 1982.

10.3 "Application Criteria for Programmable Digital Computer Systems in Safety Systems for Nuclear Power Generating Stations," ANSI/IEEE-ANS-7-4.3.2, July 1982.

10.4 "Guidelines for an Effective SPDS Implementation Program," Nuclear Utility Task Action Committee, INPO 83-003, January 1983.

10.5 "Plant Modification Control Program," Illinois Power Company, Corporate Nuclear Procedure CNP 4.05, Rev. 0, July 1983.
TABLE 1
Outline for SPDS Requirements Review Report

BACKGROUND
    Identify SPDS Requirements Sources
    Identify Other Material Required
    Review Participants
    References

REVIEW SUMMARIES
    Source Document Review
        source document listing
        requirements consistency
    Consistency Assessment
        format evaluation
        capacity for supporting
            Design Review
            Validation Testing
            Field Verification Testing
    Design Requirements Review (attributes of each)
        correctness
        completeness
        consistency with EOPs
        understandability
        testability
    Test Requirements and Acceptance Criteria
        what tests are needed
        test evaluation
    Human Factors Review
        documentation completeness

CAPABILITIES AND/OR REQUIREMENTS TRACEABILITY
    Listing of Key Capabilities and/or Requirements
    Listing of Specific Tests to be Performed

SUMMARY OF DEFICIENCIES TO BE RESOLVED
TABLE 2
Outline of the SPDS Design Review Report

BACKGROUND
    Identification of SPDS Design Documents Reviewed
    Identification of Other Material Considered
    Assessment of Documentation
    Review Participants
    References

CONSISTENCY ASSESSMENT
    Correlation of Design Information with System Capabilities Requirements
        (is it possible to identify design characteristics with requirements defined in the SPDS Requirements Document)

DOCUMENTATION EVALUATION/ANALYSIS (review for adequacy of:)
    System Hardware/Software Architecture
    Input/Output Interfaces
    Hardware Subsystem and Interface Definition
        (includes isolation, signal type & rates, availability allocation, testing approach, etc.)
    Hardware Subsystem Design
    Software Subsystem Definition
        (includes operating system, applications, programs, utilities, etc.)
    Software Subsystem Design
        (includes unit conversion, priorities, algorithms, error control/recovery, etc.)

SUBSYSTEM & DATA INTERFACE REVIEW
    Results of set point review
    Results of other data table reviews
    Subsystems defined for completeness

PERFORMANCE ASSESSMENT
    Hardware Integration Approach
        (includes subsystem interconnections/communication, testing approach, etc.)
    Software Integration Approach
        (includes timing, interrupts, data management, input/output, scaling, testing approach, etc.)
    Human Engineering
        (includes ease of use as well as ease of learning)

TESTABILITY
    System Hardware and Software Integration
        (includes test facility definition and testing approach, human factors, special testing simulators or method to be used for generating dynamic test data, etc.)

SUMMARY OF DEFICIENCIES TO BE RESOLVED (is the design:)
    Complete (does it address all factors or are there major design decisions yet to be made)
    Consistent (is the design for subsystems consistent with the system design, the interfaces, the hardware with the software, performance requirements, etc.)
    Testable (can the hardware and software be tested as subsystems and/or the complete system, i.e., can test signals be input with results recorded)
    Acceptable (can the operators understand and use it)
TABLE 3
Outline for the Validation or Field Verification Test Plan

Introduction
    Purpose of the Tests
    Summary of Test Requirements
    Reference Documents
    Deliverable Materials
    Statement of Pre-Test Activity
    Test Philosophy

Test Plan
    System Description
    Test Milestone Chart
    Environment
        Hardware/Software
        Diagnostics
    Personnel Training Plan
    Test Materials and Equipment

Test Specifications (General)
    Requirements
        System Functions to be Tested
        Test/Function Requirement Relationship
    Test Methods
        Methodology and Test Conditions
        Test Control
        Data Recording
        System Test Constraints
        Nonconformance Procedures
    Number and Type of Tests

Test Description
    Test Sequences
    Test Description (may be grouped by series)
        Purpose
        Test Inputs
        Test Outputs
        Test Conditions
        System Conditions
    Test Control/Execution
        Input Data
        Input Commands
        Output Data
        Output Recording
        Documentation

Test Procedures (may be different for different tests)
    Test Setup
    Test Initialization
    Test Steps
    Test Termination

Test Evaluation (during the testing)
    Data Analysis
    Acceptability Analysis
    Treatment of Nonconformances

Other Tests...
TABLE 4
Outline of the Validation or Field Verification Test Report

Introduction
    Purpose of Tests
    Summary of Test Plan
    Reference Documents

Test Analysis
    Test Identifier (each test has a unique identifier)
    Test Objective
    Test Environment/Configuration
    Data Collected
    Data Analysis
    Acceptance Criteria (requirements)
    Actions Taken
    Conclusions
    Other Tests...

System Function Analysis
    System Function Identifier
    Functional Performance
    Performance Acceptance Criteria
    Actions Taken
    Conclusions
    Other Functions...

Summary
    Capabilities Demonstrated
    System Deficiencies
    System Refinements
    Recommendations
TABLE 5
Outline of the Validation Report

Overview
    Summary of V&V Plan
    Identification of V&V Documentation
    Other Project References
    Overview of V&V Activities

System Requirements Review
    Objectives
    Summary of Activities
    Summary of Results
    Deficiencies Identified/Resolved
    References

Design Review
    Objectives
    Summary of Activities
    Summary of Results
    Deficiencies Identified/Resolved
    References

Validation Test
    Summary of Test Plan
    Summary of Test Execution
    Summary of Test Results
    Deficiencies Identified/Resolved
    References

Field Verification Test
    Summary of Test Plan
    Summary of Results
    Deficiencies Identified/Resolved
    References

Summary and Conclusions
    Development of Traceable/Auditable Quality Product
ILLINOIS POWER COMPANY
SAFETY PARAMETER DISPLAY SYSTEM
SAFETY ANALYSIS REPORT

1.0 PURPOSE

This Safety Analysis Report is intended to provide the design base justification for the Safety Parameter Display System as implemented at Clinton Power Station.
2.0 SYSTEM OVERVIEW

The Safety Parameter Display System is being implemented as a part of the Display Control and Performance Monitoring Systems (DCS/PMS) currently installed in the Main Control Room. The Number 5 CRT in the NUCLENET (Principal Plant Console) has been designated as the SPDS Display.
The purpose of the SPDS is to assist control room personnel in evaluating the safety status of the plant. The SPDS is to provide a continuous indication of plant parameters or derived variables representative of the safety status of the plant. The primary function of the SPDS is to aid the operator in the rapid detection of abnormal operating conditions.
Implementation of the SPDS Display will be accomplished through the installation of a permanent display in the DCS (Display Control System) and a temporary AID display that will automatically be displayed on all nine NUCLENET-DCS displays when any of the parameters alarm.
Chapter 15 of the Clinton Power Station Final Safety Analysis Report divides the postulated accident or transient events into 8 individual categories. These categories of events have been considered in the selection of SPDS parameters.
One of the 8 FSAR categories, Increase in Reactor Coolant Inventory, does not lead to any significant consequences and, therefore, is not part of the bases for the SPDS. The remaining 7 categories in the FSAR are incorporated in SPDS. Each of these 7 categories of FSAR events can be associated with an SPDS parameter group listed in this report.
FSAR Chapter 15 Category                          SPDS Group

Decrease in Core Coolant Temperature        \
Reactivity and Power Distribution Anomalies  >    Reactivity Control (see Section 3.1)
Anticipated Transients Without Scram        /

Increase in Reactor Pressure                      Reactor Coolant System Integrity
                                                  (see Section 3.3)

Decrease in Reactor Core Coolant Flow Rate  \     Reactor Core Cooling and Heat Removal
Decrease in Reactor Coolant Inventory       /     from the Primary System (see Section 3.2)

Radioactive Release from a Subsystem        \     Radioactivity Control (see Section 3.4)
or Component                                /     Containment Conditions (see Section 3.5)
Information necessary to monitor the radioactivity status and control in the plant is provided through the Area and Process Radiation Monitor (ARM/PRM) System. The ARM/PRM System is a separate computer system with a control terminal CRT permanently mounted in the control room.
3.0 SPDS PARAMETER GROUPS AND ASSOCIATED SELECTION BASES

3.1 Reactivity Control

Above the Source Range, once a stable period has been established, changes in neutron flux can be used to determine the reactivity of the core. Even if a stable period has not been established, changes in flux can be used to determine if reactivity is positive, negative or zero. Control Rod insertion provides secondary information about reactivity control, and the ability to support a scram is determined by the Scram Discharge Volume (SDV) level.
The Average Power Range Monitor (APRM) channels monitor neutron flux (expressed as percent power) during normal operation and indicate if the fission process is terminated upon receiving a Reactor Protection System (RPS) Trip. The four APRM channels shall be averaged. Any APRM found to be more than 10% from the average shall be ignored, and the average of the remaining APRMs shall be a permanent part of the SPDS Display. The APRM display shall be a percentage of full power indication. A sketch of this screening rule follows.
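As a minimal sketch only, assuming the 10% screen is applied relative to the first-pass average of all four channels (the basis of the 10% figure is not stated explicitly above), the rule could be expressed as:

```python
# Sketch of the APRM screening rule: average all four channels,
# discard any channel more than 10% from that first-pass average,
# then average the remaining channels. Sample values are made up.
def screened_aprm_average(channels):
    """Average APRM readings after discarding >10% outliers."""
    first_pass = sum(channels) / len(channels)
    kept = [c for c in channels if abs(c - first_pass) <= 0.10 * first_pass]
    return sum(kept) / len(kept)

aprm = [98.5, 99.1, 97.8, 72.0]  # percent power; fourth channel reads low
print(round(screened_aprm_average(aprm), 1))  # 98.5 - only the three consistent channels
```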
The Source Range Monitors (SRMs) provide neutron flux indication during reactor startup/shutdown or low-flux level operation. The SRMs allow the operator to confirm long term net negative reactivity by observation of power at steady state source levels. This ensures safe shutdown and the ability to detect potential restart events. The four SRM signals shall be averaged and indicated as a permanent part of the SPDS Display. SRM information shall be presented as counts per second, with rate information provided as period (the time for power to change by a factor of e) in seconds.
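The period definition above corresponds to P(t) = P0 * e^(t/tau), so tau = dt / ln(P2/P1) for two count-rate samples taken dt seconds apart. As an illustration only, with made-up count rates:

```python
# The period tau satisfies P(t) = P0 * exp(t / tau), so for two
# count-rate samples dt seconds apart: tau = dt / ln(P2 / P1).
import math

def reactor_period(rate1_cps, rate2_cps, dt_seconds):
    """Reactor period in seconds from two count-rate samples."""
    return dt_seconds / math.log(rate2_cps / rate1_cps)

# Count rate rising from 100 to 150 cps over 30 s (made-up numbers):
print(round(reactor_period(100.0, 150.0, 30.0), 1))  # ~74.0 s, a positive period
```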
Scram Discharge Volume (SDV) indication is provided so the operator is alerted whenever the ability to insert control rods may be jeopardized. An indication of SDVs A and B, in gallons, has been provided on the Alarm Initiated Display portion of the SPDS Display.
3.2 Reactor Core Cooling and Heat Removal From the Primary System

The primary indication of the ability to remove heat from the reactor vessel is the amount of water in the vessel, whatever the source. Directly related to the removal of heat is the flow of water within the vessel and the pressure of the vessel.

Reactor Wide Range water level, in inches, and Reactor Steam, Feed, and Total Core Flow, in millions of pounds per hour, are provided as a permanent part of the SPDS Display. Because of its importance, Reactor Water Level is also provided as a single value in the Alarm Initiated Display portion of the SPDS Display.
3.3 Reactor Coolant System Integrity

During normal plant operations, core coolant is monitored via Reactor Feed Flow, Reactor Recirc System A and B Flow, and Total Core Flow. Monitoring these flows will provide a primary integrity indication. Excess flow in the Drywell Floor Drain Sump is an early indication of minor cracks or other unidentified leakage paths, as are Reactor and Drywell Pressure. Drywell equipment sump flow provides information useful to the operator in protecting equipment but not necessarily reactor integrity.

Reactor Recirc Flow A and B, in thousands of gallons per minute, shall be a permanent part of the SPDS Display.

Wide Range Reactor Pressure and Narrow Range Drywell Pressure, in pounds per square inch gauge, shall be a permanent part of the SPDS Display.

Drywell Floor Drain Sump Flow, in gallons per minute, shall be a permanent part of the SPDS Display. Because this flow is the most sensitive indicator of leakage, it is a standard technical specification basis for shutdown on small breaks and shall also be displayed on the Alarm Initiated Display portion of the SPDS Display.

Drywell Equipment Drain Sump Flow, in gallons per minute, shall be indicated on the Alarm Initiated Display portion of the SPDS Display.
3.4 Radioactivity Control

Forty-six fixed digital Area Radiation Monitors (ARMs) are located throughout the plant to monitor gamma dose rate. Twelve portable ARMs are available for connection to other ports in the plant. Fourteen fixed digital Constant Air Monitors (CAMs) measure airborne radioactivity within the station, with ten portable sample CAMs available. Two Process Radiation Monitors (PRMs) sample the common station HVAC exhaust, and two monitor the Standby Gas Treatment System (SGTS). Monitors are also located as follows: one in Pre-treatment and two in Post-treatment Air Ejector off-gas, and one in Liquid Radwaste Effluent discharge. Six PRMs monitor various liquid streams to detect inter-system leakage of heat exchangers. There are 12 safety related PRMs, with control functions to initiate SGTS, that monitor HVAC ducts on Containment Building Exhaust, Containment Building Fuel Transfer Vent Plenum, and Fuel Building Exhaust. Finally, there are 4 safety related PRMs in the Main Control Room air intake.
Status of all 90 of the permanent monitors, and of as many of the 22 portable monitors as are connected to system communication ports, shall be provided on the ARM/PRM Status Grid. The following conditions shall be indicated by color changes of the monitor unit number:

light blue    communications fail
blue          uninitialized monitor
red           high alarm
yellow        alert/trend alarm
dark blue     not initialized
white         calibration/source check/maintenance/flush/local control
green         normal/alarm off
purple        monitor in standby
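Purely as an illustration of the color convention above (this report does not describe how the status grid is actually encoded), the mapping could be tabulated as:

```python
# Illustrative encoding of the status-to-color convention; the status
# keys are invented here, and the fallback color is an assumption.
STATUS_COLORS = {
    "communications_fail": "light blue",
    "uninitialized_monitor": "blue",
    "high_alarm": "red",
    "alert_trend_alarm": "yellow",
    "not_initialized": "dark blue",
    "calibration_maintenance_local": "white",  # calibration/source check/maintenance/flush/local control
    "normal_alarm_off": "green",
    "standby": "purple",
}

def grid_color(status):
    """Color of the monitor unit number on the ARM/PRM Status Grid."""
    return STATUS_COLORS.get(status, "white")  # assumed fallback for an unrecognized status

print(grid_color("high_alarm"))  # red
```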
3.5 Containment Conditions

In addition to those parameters already specified, it is essential the operator be aware of Containment Pressure and the status of containment isolation in order to ascertain conditions within containment. Additional information that is useful to the operator includes Drywell temperature, Suppression Pool level and temperature, containment temperature, and containment hydrogen concentration.
Containment pressure, in pounds per square inch gauge, shall be a permanent part of the SPDS Display.

Containment Isolation, both inboard (I) and outboard (O), for each of the 11 containment isolation valve groups specified in CPS No. 10N4001.02S, Automatic Isolation, shall be indicated as a single character (I or O) for each group, with green indicating isolation has occurred. Containment isolation shall be a permanent part of the SPDS Display.
The Alarm Initiated Display portion of the SPDS Display shall provide drywell, containment, and suppression pool temperatures, in degrees Fahrenheit, suppression pool level, in feet, containment and drywell pressures, in pounds per square inch gauge, and containment hydrogen concentration expressed as a percentage.