ML081550145

| Person / Time | |
|---|---|
| Site: | Oconee |
| Issue date: | 05/28/2008 |
| From: | Baxter D, Duke Energy Carolinas, Duke Energy Corp |
| To: | Document Control Desk, Office of Nuclear Reactor Regulation |
| References | TSC-07-009 |
| Download: | ML081550145 (13) |

Text
Duke Energy
Dave Baxter
Vice President, Oconee Nuclear Station
Duke Energy Corporation
ON01VP / 7800 Rochester Highway
Seneca, SC 29672
864-885-4460
864-885-4208 fax
dabaxter@dukeenergy.com

May 28, 2008

U.S. Nuclear Regulatory Commission
Washington, D.C. 20555
Attention: Document Control Desk
Subject:
Duke Energy Carolinas, LLC
Oconee Nuclear Station, Units 1, 2, and 3
Docket Numbers 50-269, 50-270, and 50-287
License Amendment Request for Reactor Protective System/Engineered Safeguards Protective System Digital Upgrade, Technical Specification Change (TSC) Number 2007-09, Supplement 4

On January 31, 2008, Duke Energy Carolinas, LLC (Duke) submitted a License Amendment Request (LAR) to address replacement of the existing Oconee Nuclear Station (ONS) analog-based Reactor Protective System (RPS) and Engineered Safeguards Protective System (ESPS) with a digital, computer-based RPS/ESPS. By letter dated April 24, 2008, the NRC requested that Duke provide additional information associated with five issues. Duke provided the requested information for the first three issues by letters dated April 29, 2008, and May 15, 2008. Enclosure 1 provides the requested information for the remaining two issues. Enclosure 2 provides a list of regulatory commitments being made as a result of this LAR Supplement.
If there are any questions regarding this submittal, please contact Boyd Shingleton at (864) 885-4716.
Very truly yours,

Dave Baxter, Vice President
Oconee Nuclear Station
Enclosures:
- 1. Duke Response to Remaining NRC Issues
- 2. List of Regulatory Commitments
cc:

Mr. L. N. Olshan, Project Manager
Office of Nuclear Reactor Regulation
U.S. Nuclear Regulatory Commission
Mail Stop O-14 H25
Washington, D.C. 20555

Mr. L. A. Reyes, Regional Administrator
U.S. Nuclear Regulatory Commission - Region II
Atlanta Federal Center
61 Forsyth St., SW, Suite 23T85
Atlanta, Georgia 30303

Mr. G. A. Hutto, Senior Resident Inspector
Oconee Nuclear Station

S. E. Jenkins, Manager
Infectious and Radioactive Waste Management Section
2600 Bull Street
Columbia, SC 29201
Dave Baxter affirms that he is the person who subscribed his name to the foregoing statement, and that all the matters and facts set forth herein are true and correct to the best of his knowledge.

Subscribed and sworn to before me:

Date

Notary Public

My Commission Expires:

Date

SEAL
Enclosure 1
Duke Response to Remaining NRC Issues

NRC Issue 5:

Regulatory Guide 1.168, Revision 1, dated February 2004, endorses IEEE 1012-1998 and IEEE 1028-1997 with the exceptions stated in the Regulatory Position of the regulatory guide, and describes a method acceptable to the NRC staff for complying with parts of the NRC's regulations for promoting high functional reliability and design quality in software used in safety systems. SRP Table 7-1 and Appendix 7.1-A identify Regulatory Guide 1.168 as SRP acceptance criteria for reactor trip systems (RTS) and for engineered safety features (ESF) systems. The LAR and other associated documents have described certain exceptions to IEEE 1012-1998. In particular, IEEE 1012 makes the generation of various test plans the duty of the V&V organization. The AREVA V&V plan, document 51-9010419-005, "Oconee Nuclear Station Unit 1 RPS/ESFAS Controls Upgrade Software Verification and Validation Plan,"
makes this test plan generation the responsibility of the design or test organizations. The information provided in the LAR and in AREVA document 51-9047317-009, "Position Paper: Conformance of TELEPERM XS Application Software with IEEE Std 1012-1998," does not contain sufficient detail to allow the staff to determine the acceptability of this deviation. The staff will require additional information to determine if the proposed alternative to the requirements of IEEE 1012 will provide an equivalent confidence in a high quality test process, and therefore an equivalent confidence in the safety of the resultant system. The additional information to be provided should include the information provided to the licensee for its determination that this alternative to IEEE 1012 was acceptable, and may include the following:
- Documentation of the independent V&V group's assessment of testing
- Documentation of the V&V group's role and interaction with the Test group
- Documentation of how problems identified by the testing were resolved
- Documentation of Duke's review of the V&V testing practices

Response to NRC Issue 5:

The details associated with the above issue on Verification and Validation (V&V) were discussed with the NRC during a meeting on April 29, 2008, and during subsequent conference calls on May 5 and 9, 2008. Following these discussions and Duke's review of the ONS RPS/ESPS project V&V activities, Duke and AREVA decided to modify the V&V approach to testing. The following response addresses the changes to the AREVA V&V efforts and will be incorporated into the next revision of the Software Verification and Validation Plan (SVVP).
The revised SVVP will conform to the applicable recommendations of IEEE Std 1012-1998, as endorsed by Regulatory Guide 1.168, Revision 1, with one exception related to V&V responsibilities. This exception is the use of matrixed support personnel from the development groups to assist with some of the validation test activities. The V&V group is responsible for V&V test development; however, it may receive assistance from the Software Design group, Hardware Design group, and the Test group. The assistance includes the preparation of validation test specifications, procedures, and reports, and the performance of test tasks. While assisting with V&V activities, these support personnel work under the supervision of the V&V group. These support personnel may not assist with the preparation of software test documents for design work they prepared. These matrixed personnel bring special skills that supplement the validation activities (e.g., test setup and safety). The test document workflow and control points are shown in Figure 5-1 and ensure that the independence of validation testing is maintained.
Verification and Validation Organization

The V&V group is responsible for the independent V&V activities described in the SVVP. The V&V group performs independent V&V reviews and validation tests, and develops V&V reports and documents.
The V&V group is technically, managerially, and financially independent from the design organizations, as required by IEEE Std 1012-1998 and endorsed by Regulatory Guide 1.168, Revision 1. V&V personnel may not participate in any design preparation activities; however, the V&V group can perform the Appendix B independent design review for the Software Design group, Hardware Design group, and the Test group, provided the requirements for Appendix B Design Control are satisfied. The V&V group manager has a dotted-line responsibility to the Quality Assurance (QA) group. From a project management perspective, the V&V group is considered part of the overall project team.
The V&V group has sufficient authority and resources (i.e., budget and staff) to ensure V&V activities are not adversely affected by commercial and schedule pressures.
Responsibilities for Testing

The V&V group manager reviews and approves all products from V&V activities, including Factory Acceptance Test (FAT) plans, specifications, procedures, and reports. The V&V group manager may delegate the approval authority to a V&V lead engineer.
The V&V group has technical competence equivalent to that of the Software Design group. V&V personnel are trained on the TXS system software and hardware as well as the provisions of the software development process. Commensurate with their assigned responsibilities, V&V personnel are sufficiently proficient in software engineering to ensure that software V&V activities are adequately implemented, and they are knowledgeable regarding nuclear safety applications. V&V personnel are familiar with the design principles and features of the Teleperm XS (TXS) system. V&V personnel are trained in the use and the output of the SPACE tool for verification of the SPACE function diagrams.
V&V personnel are familiar with acceptance test procedures, predicting test results, and the form of the generated outputs from the TXS System for validation testing.
Factory Acceptance Testing

FAT is a formal project milestone that is attended by both AREVA NP QA and Duke personnel. The FAT fulfills the requirement for system integration and acceptance validation testing. Additional application software integration test cases are added to the scope of FAT to satisfy IEEE Std 1012-1998 validation requirements for application software integration testing, since software simulation testing is not being credited as a V&V tool.
The generic TXS platform software and hardware integration is subject to the generic qualification process described in the TXS topical report. The generic qualification approach provides a very high degree of validation independence commensurate with the importance of generic system qualification. In addition, the generic qualification activities ensure proper operation of the function blocks.
The FAT is performed under the responsibility of the V&V group. The software related FAT test plans, specifications, procedures, and reports are prepared in accordance with the Software V&V Plan and 10 CFR Part 50 Appendix B requirements.
The V&V group develops the software requirements traceability matrix to ensure that both system functional requirements and specific software requirements have been tested during FAT. The V&V group independently verifies that the software versions being tested match those listed in the Software Configuration Management Plan.
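For illustration only, the kind of coverage check that a requirements traceability matrix supports can be sketched as a mapping from each requirement to the FAT test cases that exercise it, with any unmapped requirement flagged as a coverage gap. The requirement identifiers, test case names, and code below are hypothetical examples and are not part of the AREVA or Duke tool set.

```python
# Hypothetical sketch of a requirements-traceability coverage check;
# the requirement IDs and FAT test case names are invented examples.
from typing import Dict, List

def untested_requirements(rtm: Dict[str, List[str]]) -> List[str]:
    """Return requirement IDs that are not traced to any FAT test case."""
    return [req_id for req_id, test_cases in rtm.items() if not test_cases]

if __name__ == "__main__":
    # Each requirement maps to the FAT test cases that exercise it.
    rtm = {
        "SRS-001": ["FAT-TC-101", "FAT-TC-102"],
        "SRS-002": ["FAT-TC-103"],
        "SRS-003": [],  # coverage gap: no FAT test case traces to this requirement
    }
    gaps = untested_requirements(rtm)
    if gaps:
        print("Requirements lacking FAT coverage:", ", ".join(gaps))
    else:
        print("All requirements are traced to at least one FAT test case.")
```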
The SVVP and FAT plan will be revised to reflect the modified approach outlined above for validation testing prior to performance of FAT.
Differences for Oconee Unit 1

Because the decision to modify the V&V process for validation testing was recently made, some of the Unit 1 FAT test documents had already been prepared by the Test group. The V&V group performs the independent verification (including requirements tracing) and the Appendix B design review of these FAT test documents to ensure both application software functionality and system functionality. As part of this review, the V&V group independently assesses the test approach, test coverage criteria, and test result acceptance criteria. The V&V group documents problems identified during the review of these Unit 1 FAT test documents as Open Items. The V&V group ensures that resolution of Open Items generated during these reviews is acceptable.
As indicated above, some Unit 1 FAT specifications and procedures were prepared by a test group (comprised of hardware and software personnel). This approach utilized the specialized skills of these personnel in an integrated fashion to develop FAT documents. These documents were prepared in accordance with 10 CFR Part 50 Appendix B quality assurance requirements.

The V&V group defined criteria and performed independent verification (including requirements tracing) and an Appendix B design review of these FAT documents to ensure system functionality. The V&V group performed the verification work in accordance with the Software Verification and Validation Plan. The V&V group has the authority to require additional test cases and analysis, as deemed necessary. While this approach to validation testing has less independence in the preparation activities, it has the benefit of two diverse groups addressing testing methods and results. The approach has a strong element of independence in the overall acceptance of the test documents, since the V&V group performed a full verification and had the authority to require additional test cases and analysis, when necessary.
Duke Oversight of RPS/ESPS Design and Testing Activities

In addition to the AREVA V&V activities listed above, Duke performs additional oversight of the RPS/ESPS design and Factory Acceptance Testing project activities. The following outlines the actions that Duke performs as part of a major engineering design change.
Duke personnel are required to review the design requirements of a significant engineering change to ensure that the design requirements are included in the system design and functionally tested as part of the engineering change process. For the RPS/ESPS project, Duke personnel from the Oconee Site Engineering and Oconee Major Project organizations reviewed the Requirements Traceability Matrix (RTM) as part of the design review prior to the submittal of the RPS/ESPS LAR. Currently, the RPS/ESPS FAT Procedures are being reviewed to ensure that adequate coverage is being provided in the FAT to test all of the specified design functions.
In order to perform these reviews, Duke personnel are tracing the FAT procedures against the Software Design Description drawings to ensure that adequate overlap exists in the FAT.

Comment documentation sheets are being generated for each comment that needs to be addressed by AREVA due to apparent discrepancies. These comment documentation sheets are being utilized to obtain AREVA's response to Duke's comments. If a significant deviation or discrepancy is identified, then Duke generates a Corrective Action Process document in the Oconee database or has AREVA generate a Corrective Action Process document in their database.
Once the comments have been provided to AREVA, comment resolution meetings or conference calls are held to discuss the comments to ensure that the comments are understood by the test preparers at AREVA. As part of these discussions, resolution of the comments is obtained between Duke and AREVA. Satisfactory resolution of the comments is required before Duke provides formal approval of the FAT procedures.
An additional part of the oversight provided by Duke for the RPS/ESPS project includes visits by Duke personnel to AREVA offices to review software and test development activities.
These visits have been with the AREVA V&V team to review its involvement with RTM development, FAT procedure development, and personnel qualifications, and to review V&V comments generated during document reviews.
As part of the RPS/ESPS project team, the Duke QA group assigned dedicated resources to the AREVA Alpharetta, Georgia offices to ensure that the QA processes used for the development of AREVA documents supporting the RPS/ESPS LAR, as well as for the development of the application software, are being followed. A Procurement Quality "Restriction" was applied to the Alpharetta offices after a Duke QA audit performed in October 2006. This restriction required significant oversight activities, including a review of the LAR documents prior to AREVA's submittal to Duke for further technical review. During this period (October 2006 until May 2008), Duke QA spent a significant amount of time at the AREVA Alpharetta offices. Oversight activities included a review of the document development process, qualification and training of project personnel, and an evaluation of the AREVA corrective action program. Also during this time period, Duke QA conducted surveillance activities at the AREVA locations in Lynchburg, VA, and Erlangen, Germany, which support AREVA Alpharetta.

Duke QA conducted the triennial qualification QA audit in April 2008. Based on the audit results, Duke QA determined that AREVA has implemented significant improvements to the structure of its engineering and QA functions. Duke QA will continue to perform oversight at the Alpharetta offices, including the supporting locations throughout AREVA. This oversight effort will include monitoring the corrective action process and conducting surveillance of hardware and software manufacturing, Basic Hardware Testing, and FAT.
Estimated Availability Date for V&V Test Reports

| Oconee Unit 1 Software V&V Reports | Estimated Availability Date |
|---|---|
| Design V&V Activity Summary Report | March 2009 |
| Test V&V Activity Summary Report (Acceptance Test Plan Verification, Acceptance Test Design Specification Verification, Acceptance Test Case Specification Verification, Acceptance Test Procedure Verification, Acceptance Test Verification) | March 2009 |
Figure 5-1 - Test Document Workflow and Control Points

[Figure 5-1 is a two-column flowchart. The left column, "Matrix Support Personnel (under the supervision of the V&V group)," covers initiation of test documents, packaging of test documents, modification of documents, release of test documents, and test execution. The right column, "V&V Group (responsible for testing)," covers specification of the test approach, definition of coverage criteria and test result acceptance criteria, comparison of documents against the V&V criteria, specification of modifications needed to match the V&V criteria, V&V approval, initiation of the test performance sequence, control of test execution, analysis of test results, and result acceptance according to the V&V test criteria.]
NRC Issue 6:

The LAR documentation indicates that use of the SIVAT tool (Simulation and Validation Tool) makes component and integration tests unnecessary. This approach is unfamiliar to the staff and does not appear to be consistent with industry standards and regulatory guidance. The use of the SIVAT tool was not identified in the TXS topical report, and the software tested by SIVAT is not the actual compiled operational code, but is rather an adapted version of the application code. It appears that the first time the actual operational code is tested is during the Factory Acceptance Test (FAT), and this test was developed by the design and test groups, not the Verification and Validation (V&V) group.
The LAR states that SIVAT is a qualified tool for testing. The NRC has not reviewed and approved SIVAT, and therefore does not understand why SIVAT is considered qualified. Topical Report EMF-2110, "TELEPERM XS: A Digital Reactor Protection System," and the staff SER on the topical report do not mention SIVAT. The validation tool mentioned in the TXS topical report is RETRANS (report section 2.4.3.3.3, page 2-61). The report states:

"As a diverse measure to detect potential software faults not found by the means described in Section 3.2.1, the verification tool 'RETRANS' developed by GRS-ISTec is used as an independent testing tool."
An additional issue which the staff does not understand is that page 11 of the AREVA Software V&V Plan states, "The test verifies that the requirements have been translated, without errors, into function diagrams, and that the software automatically generated from these function diagrams provides the functionality required in terms of I/O response." The staff does not understand how the proposed software testing using SIVAT can demonstrate that the system requirements specification has been correctly translated into the code.
The staff believes that testing performed by unit and integration tests should be performed on the actual operational code, and therefore it may be necessary to perform additional software testing such as the following:
- 1. Perform unit and integration tests on the actual operational code instead of simulation. This would require developing test procedures, test results, and V&V reports on the testing of the actual operational code.
or
- 2. Expand the FAT to include testing that is normally done during unit and integration testing. This may include a) fault injection by deliberately passing bad information from one software unit to another, b) simulating hardware failures, c) communications errors, and d) diagnostic failures.
This is only a short example of the types of testing the staff would expect to be added to the FAT.
The licensee may choose to provide such additional information as needed for the staff to reach a conclusion that the SIVAT testing already planned will provide an equivalent confidence in a high quality test process, and therefore an equivalent confidence in the safety of the resultant system. One of the items required for this determination would be confirmation that the SIVAT tool was qualified in a manner similar to that required for software performing safety-related functions, and that the software lifecycle process of the SIVAT tool development meets the requirements for that type of software.
Response to NRC Issue 6:

During the April 29, 2008, Duke/NRC meeting and during subsequent conference calls with the NRC Staff on May 5 and 9, 2008, the NRC indicated that a detailed review of the SIVAT tool would be required before the staff could accept crediting the SIVAT tool for use in V&V testing. After reviewing the SIVAT testing activities, Duke has decided to make the Unit 2 and 3 FAT Plans consistent with the Unit 1 FAT Plan in that software integration testing will be performed as part of the FAT rather than crediting SIVAT for V&V testing.
Enclosure 2
List of Regulatory Commitments

The following commitment table identifies those actions committed to by Duke Power Company LLC d/b/a Duke Energy Carolinas, LLC (Duke) in this submittal. Other actions discussed in the submittal represent intended or planned actions by Duke. They are described to the Nuclear Regulatory Commission (NRC) for the NRC's information and are not regulatory commitments.
| Commitment | Completion Date |
|---|---|
| The SVVP and FAT plan will be revised to reflect the modified approach outlined above for validation testing prior to performance of FAT. | Prior to FAT |
| Duke has decided to make the Unit 2 and 3 FAT Plans consistent with the Unit 1 FAT Plan in that software integration testing will be performed as part of the FAT rather than crediting SIVAT for V&V testing. | Concurrent with issuance of Unit 2 and 3 FAT Plans |