ML083290481

Simulator Scenario Based Testing Methodology White Paper
ML083290481
Person / Time
Issue date: 12/30/2008
From: Frederick Brown
Division of Inspection and Regional Support
To: Roe J
Nuclear Energy Institute
Vick L, NRR/DIRS/IOLB, 415-3181


Text

December 30, 2008

Jack W. Roe, Director
Operations Support
Nuclear Generation Division
Nuclear Energy Institute
1776 I Street, NW, Suite 400
Washington, DC 20006-3708

SUBJECT: SIMULATOR SCENARIO BASED TESTING METHODOLOGY WHITE PAPER

Dear Mr. Roe:

Thank you for your letter (ADAMS Accession No. ML082770688) of September 18, 2008, regarding the Nuclear Energy Institute's (NEI's) proposed implementation and guidance white paper on simulator scenario based testing (SBT) (ML082770691). The purpose of this letter is to inform you that, overall, the staff agrees that the white paper provides a workable approach for conducting and documenting simulator SBTs as prescribed in paragraph 4.4.3.2 of ANSI/ANS-3.5-1998, Nuclear Power Plant Simulators for Use in Operator Training and Examination. The staff also agrees that the white paper supports pending requirements for SBT in the proposed revision of ANSI/ANS-3.5 currently being considered for adoption by the American Nuclear Society.

The staff's detailed comments and recommendations regarding the white paper are enclosed for your consideration. Although the evaluation criteria in the white paper are generally consistent with the regulatory requirements and industry guidance with respect to simulator fidelity, the staff noted that many of the criteria were not incorporated in the associated SBT checklist. The NEI checklist focused on exam validity criteria from NUREG-1021, Operator Licensing Examination Standards for Power Reactors, instead of simulator performance criteria from the industry's consensus standard. Therefore, we have proposed a revised checklist that better reflects the SBT methodology discussed in the evaluation section of your white paper. We would be glad to discuss this further with you. Assuming the remaining details are resolved, we plan to incorporate, as appropriate, the SBT methodology into the next revision of Regulatory Guide 1.149, Nuclear Power Plant Simulation Facilities for Use in Operator Training and License Examinations.

The NRC appreciates the opportunity to work with NEI and the industry stakeholders on this important simulator issue. If you require additional information, please contact Larry Vick at 301-415-3181 or Nancy Salgado at 301-415-2942.

Sincerely,

/RA/

Frederick D. Brown, Director
Division of Inspection and Regional Support
Office of Nuclear Reactor Regulation

Enclosure:

As stated

Distribution:

LVick, NSalgado, FBrown, MCheok, BBoger, Public

ADAMS Accession No. ML083290481

Office: IOLB       IOLB:BC     DIRS
Name:   LVick      NSalgado    FBrown
Date:   12/24/08   12/24/08    12/30/08

Official Record Copy

NRC STAFF COMMENTS AND RECOMMENDATIONS REGARDING NEI WHITE PAPER ON NUCLEAR POWER PLANT-REFERENCED SIMULATOR SCENARIO BASED TESTING METHODOLOGY

The NRC staff's technical review of the white paper (WP) generated the following comments and recommendations.

Comment 1 The staff generally agrees with the Attachment 1 checklist concept but recommends that the checklist be limited to simulator performance-related items. The performance testing of a plant-referenced simulator is a separate technical matter not to be confused with the qualitative and quantitative scenario attributes required in Appendix D, Simulator Testing Guidelines, of NUREG-1021, Operator Licensing Examination Standards for Power Reactors. The staff recommends that the WP Attachment 1 be replaced with the Recommended Revised Attachment 1 found at the end of these comments since it resolves many of the technical concerns related to implementation of SBT as a simulator performance test.

Comment 2 Section 2.0, page 2, second paragraph, next-to-last sentence: The staff recommends this sentence be deleted or modified to reflect that proper conduct of SBT is intended to alleviate the need for post-scenario evaluation since the performance of the simulator is being evaluated (compared to actual or predicted reference plant performance) during the conduct of the SBT.

Comment 3 Section 2.0, page 2, second paragraph, the last sentence: The staff recommends that this sentence be deleted since it is highly speculative in nature. It remains to be seen whether or not the SBT methodology will actually identify and correct more problems than any other form of simulator performance testing. To date, the proposed SBT methodology has only been demonstrated on one plant-referenced simulator (i.e., Robinson, in July 2006).

The staff acknowledges that the methodology holds the potential to uncover simulator fidelity issues otherwise undetected since an integrated plant response is being evaluated rather than a specific response in a stand-alone testing scheme.

Comment 4 Section 3.2.3, page 4: The staff recommends that the language make clear that simulator scenarios used for performing control manipulations that affect reactivity to establish eligibility for an operators license are those associated with 10 CFR 55.31(a)(5). For example:

Scenarios used to satisfy the reactivity control manipulation requirement in 10 CFR 55.31(a)(5).

Comment 5 Section 3.3, page 4, the first sentence: The staff is concerned that exclusive use of unlicensed test personnel (e.g., SRO certified instructors only with no licensed operators) during the conduct of SBT could result in less than adequate validation and confirmation of the simulator's fidelity. The staff recommends that the wording be changed from "and/or" to "and" to ensure that licensed operators participate in the conduct of SBT.

Comment 6 Section 3.4, page 4, the first sentence: The staff recommends that key parameters be defined in the WP as: Key parameters are the parameters necessary for a full understanding or explanation of the expected plant response to which the simulator has been designed to respond. Although ANSI/ANS-3.5-1998, Appendix B, Guidelines for the Conduct of Simulator Operability Testing, provides examples of parameters to be recorded, the actual number of key parameters to be recorded during the conduct of SBT is predetermined by the nature of the event(s).

Comment 7 Section 3.5, page 4, the first sentence: The 1998 standard requires, in paragraph 3.1.1 (Real Time and Repeatability), the simulator to operate in real time while conducting any of the evolutions required by the standard. This applies to SBT evolutions as well. The simulator scenario must be run in real time until the scenario termination point is reached. The staff recommends that the sentence be modified to reflect that the simulator must run in real time during the conduct of simulator SBT since time-based relationships, durations, rates, and dynamic performance are important for demonstrating fidelity. The staff does not object to using the simulator's freeze feature to stop the simulation to evaluate parameters and performance, as long as the simulator continues to support the SBT in a continuous manner, without any mathematical model or initial condition changes. Bear in mind that subsequent test results from performance of the same SBT without using freeze must be repeatable.

Comment 8 Section 3.7, page 5, the third sentence: The staff recommends that the sentence be modified, for simulator SBT purposes, to state that test personnel, as a minimum, (1) must verify parameters, alarms, and automatic actions directly related to a scenario event, a malfunction, and/or an operator input, and (2) are not expected to verify non-relevant alarms and automatic actions, unless they are unexpected.

Comment 9 Section 3.9, page 5: The staff recommends that the sentence be modified to reflect that the response of the simulator resulting from operator action, no operator action, improper operator action, automatic reference unit controls, and inherent operating characteristics shall be realistic and shall not violate the physical laws of nature, such as conservation of mass, momentum, and energy, within the limits of the verification, validation, and performance testing criteria of the standard.

Comment 10 Section 3.10, page 5: To prevent any confusion, the staff recommends that the sentence be modified to reflect paragraph 4.4.3.2, Simulator Scenario-Based Testing, of the 1998 standard. The standard does not include the term "tasks." The inclusion of the term "tasks" could be misconstrued to be more limiting than that allowed on the reference plant since there may be more than one predetermined way for an operator to perform a given task.

Furthermore, the regulation speaks to the performance of significant control manipulations being completed without procedural exceptions rather than tasks being completed without exceptions.

During the conduct of SBT, it is the simulator's fidelity and scope, rather than the operator's performance and ability, that are being assessed as sufficient and adequate for use.

Comment 11 Section 3.12, page 5, the first sentence: The staff recommends that the sentence be modified to state that test personnel must verbally communicate expected plant response, trends, parameter/set point values, and primary alarms they observe throughout each event of the scenario. This is a critical implementation element of the SBT methodology since clear and deliberate verbal communications at all times ensure that expected test results are observed and validated. The staff observed strong communications among simulator test personnel during the conduct of the Robinson SBT demonstration. The staff recommends this expectation be carried forward as guidance in the SBT methodology.

Comment 12 Section 3.15, page 6: The staff recommends that the terminology be changed to SBT Test Results Package since the documentation must be retained for four years after the completion of each performance test or until superseded by updated test results.

Comment 13 Section 3.16, page 6: The staff recommends the sentence be modified to reflect that an electronic copy of the SBT Test Results Package is acceptable for record retention purposes.

Comment 14 Section 3.19, page 7, the first sentence: The staff recommends the sentence be modified to state that test personnel must document discrepancies in accordance with site simulator configuration management procedures.

Comment 15 Section 4.0, page 7: The staff recommends updating the WP references once NRC staff comments have been resolved. One minor editorial comment for Item 4.4: the term "Use" needs to be capitalized in the standard's title. Also, regarding Item 4.5, the correct ADAMS number is ML073240964.

RECOMMENDED REVISED ATTACHMENT 1

NUCLEAR POWER PLANT-REFERENCED SIMULATOR SCENARIO BASED TESTING METHODOLOGY CHECKLIST

Scenario Number: _______  Revision: _______  IC: _______  Date Validated: _______

Item  Simulator Performance  Initials

1 Simulator performance supported scenario objectives.

2 Simulator initial conditions (IC) agreed with the reference plant with respect to reactor status, plant configuration, and system operation.

3 Simulator operated in real time during conduct of SBT.

Note: Use of freeze allowed when evaluating specific performance.

4 Simulator demonstrated expected plant response to operator input and to normal, transient, and accident conditions to which the simulator has been designed to respond.

5 Simulator permitted use of the reference plant's procedures so that the scenario was completed without procedural exceptions, simulator performance exceptions, or deviation from the scenario sequence.

6 Simulator did not fail to cause an expected alarm or automatic action and did not cause an unexpected alarm or automatic action. Note: Attach simulator alarm summary (versus time) to SBT Test Results record.

7 Observable changes in simulated parameters corresponded in trend and direction to those expected from the actual or best-estimate response of the reference plant. Note: Attach predetermined Monitored Parameter List (versus time) to SBT Test Results record.

8 Reference plant design limitations were not exceeded.

9 Each scenario malfunction demonstrated expected plant response to its initiating cause.

10 SBT conducted in a manner sufficient (i.e., meets requirements of ANSI/ANS-3.5-1998) to ensure that simulator fidelity has been demonstrated and met for this scenario. Note: Attach relevant as-run marked-up plant procedures and/or procedure portions/pages utilized to support this assertion.

11 Modeling and hardware discrepancies identified during the conduct of SBT are documented and entered in accordance with the site simulator configuration management procedures. Note: Discrepancies that directly affect operator response (or action) or expected plant response must be resolved before the SBT test results can be judged as satisfactory.

12 Simulator SBT performance test results: ___ SATISFACTORY / ___ UNSATISFACTORY

Date (mm/dd/yyyy) and Signature: ________________

Note: Attach list of SBT test personnel (include name, job title, and level of effort).

Technical comments attached: