ML18102A666

Rev 2 to PSE&G Simulation Facility Certification Program
Person / Time
Site: Salem, Hope Creek  
Issue date: 11/19/1996
From:
Public Service Enterprise Group
To:
Shared Package
ML18102A665 List:
References
PROC-961119, NUDOCS 9612160438


Text

PSE&G SIMULATION FACILITY CERTIFICATION PROGRAM

Simulation Support Supervisor:


Date:

11/19/96

Revision 02

REVISION HISTORY

Revision   Date
00         10/07/96
01         10/25/96
02         11/19/96

Description

- Initial version; Simulator Procedures (SPs) retired; submitted to Licensing (Brian Thomas) on 10/08/96 for transmittal to NRC.
- More details were provided on the differences between the Salem Simulation and the plants with respect to Radiation Monitoring (Section III.E.4).
- The wording in Section IV.A was clarified with respect to the 10% parameter grouping and the relationship to Appendix B of ANSI/ANS-3.5-1993.
- More details were provided on the impact of the overlapping certification cycles on the testing schedules (Sections V and VI).
- More details were provided on the plant status code (4BA) that represents "operable" for simulation purposes (Section III.D.4, Appendix L).

TABLE OF CONTENTS

I. EXECUTIVE SUMMARY                               1
II. DEFINITIONS                                    3
III. CERTIFICATION METHODOLOGY                     3
   A. Scope of Simulation                          5
   B. Simulation Interface                         5
   C. Simulation Testing                           6
      1. Verification Testing                      7
      2. Validation Testing                        7
      3. Performance Testing                       7
      4. Testing Benchmarks                       11
      5. Testing Requirements                     11
      6. Testing Acceptance Criteria              11
   D. Configuration Management                    12
      1. Simulation Training Database             12
      2. Simulation Modification Database         13
      3. Simulation Action Request Database       13
      4. Plant Modification Database              15
      5. Work Control                             15
      6. Simulation Difference Database           16
      7. Simulation Revision Control              16
      8. Customer Oversight                       16
      9. Records                                  16
      10. Reporting                               18
      11. Audits                                  18
   E. Salem Simulation                            18
   F. Hope Creek Simulation                       20
IV. EXCEPTIONS                                    20
V. COMPLETED PERFORMANCE TESTING                  22
VI. PLANNED PERFORMANCE TESTING                   23
VII. REFERENCES                                   24

APPENDICES

A  Normal Evolutions                              25
B  Operability Tests                              27
C  Steady State Parameters                        29
D  Transient Parameters                           31
E  Performance Test Schedules                     33
F  PTD Sample Entries                             35
G  Test Credit Example                            36
H  Cause and Effect Example                       37
I  SAR Form                                       38
J  SAR Priorities                                 39
K  SDD Sample Entries                             41
L  PMD Sample Entries                             42

2

I. EXECUTIVE SUMMARY

The Simulation Facility complies with 10 CFR 55.45 and meets the guidance contained in the standard ANSI/ANS-3.5-1993 as endorsed by NRC Regulatory Guide 1.149 Revision 2, with exceptions. The Simulation Facility is tied directly to the INPO Accredited Operator Training Programs and these exceptions deal generally with the items in the standard's lists that are not applicable to these training programs.

The Simulation Facility is used for the training and examination of operators for Salem Unit 1, Salem Unit 2, and Hope Creek. The Salem units share a common simulation because the current differences between the units do not have a significant impact on these activities. Salem Unit 2 is used as the reference plant for this common simulation.

The Simulation Facility currently consists only of the Salem and Hope Creek Full-Scope Simulators.

Part-task and classroom simulations are being considered for future usage. This certification methodology covers any computer configuration running plant-referenced simulation software that is used for the training and examination of operators.

II. DEFINITIONS A. Simulation Facility A collection of computer configurations running software that models the reference plants and which is used for the training and examination of operators. Depending on its function, individual configurations may be interfaced to control panels, plant computers, or other components. Examples include full-scope, part-task, and classroom plant-referenced simulators.

B. Certification Year The period based on the certification date on the latest NRC Form 474 and the subsequent twelve months. It is not related to the calendar year (January to December). For example, if the certification date is November 22, 1996, then the certification year runs from November 22, 1996 to November 21, 1997.

C. Certification Cycle The period based on the certification date on the latest NRC Form 474 and the subsequent four years. For example, if the certification date is November 22, 1996, then the certification cycle runs from November 22, 1996 to November 21, 2000.
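The certification year and cycle definitions above are simple date arithmetic. As an illustrative sketch only (the program contains no software for this, and the function names below are invented), in Python:

```python
from datetime import date, timedelta

def _add_years(d: date, years: int) -> date:
    # Shift a date forward by whole years; February 29 falls back to February 28.
    try:
        return d.replace(year=d.year + years)
    except ValueError:
        return d.replace(year=d.year + years, day=28)

def certification_year(cert_date: date) -> tuple[date, date]:
    # Certification date through the subsequent twelve months.
    return cert_date, _add_years(cert_date, 1) - timedelta(days=1)

def certification_cycle(cert_date: date) -> tuple[date, date]:
    # Certification date through the subsequent four years.
    return cert_date, _add_years(cert_date, 4) - timedelta(days=1)

# The document's example: a certification date of November 22, 1996 gives a
# certification year ending November 21, 1997 and a cycle ending November 21, 2000.
```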

D. Simulation Function A single action initiated from the Instructor Station or the control panels that impacts the Simulation Facility software. Examples include a spurious main turbine trip (Instructor Station) and a manual condensate pump trip (control panels). Simulation Functions are normally checked by Functional Tests.

E. Simulation Operation A group of simulation functions that usually includes a mix of Instructor Station and control panel actions. Examples include a main steam line rupture and a runback. Simulation Operations are normally checked by Operability Tests.

3

F. Simulation Support Group (SSG)

The work group that is responsible for maintaining the Simulation Facility.

G. Simulation Action Request (SAR) Database The database that contains information on all submitted Simulation Action Requests (detailed in Section III.D.3). Simulator Action Requests (SARs) are the work control mechanism for the Simulation Support Group (SSG) and are used to request any type of work (enhancements, discrepancy resolutions, plant modification incorporation, testing, phone changes, Emergency Planning support, etc.). SARs can be submitted by anyone and there is no requirement for supporting documentation. The SSG assumes the responsibility for gathering the information necessary to resolve the request. In the case of possible discrepancies, SARs should be submitted whenever the customer has any question about the Simulation Facility; proof is not required.

H. Performance Test Database (PTD)

The database that includes those simulation functions and operations that are required to be periodically tested (detailed in Section III.C.3.a).

I. Simulation Difference Database (SDD)

The database that contains the differences between the reference plants and the Simulation Facility within the Scope of Simulation that have the potential to impact operator actions (detailed in Section III.D.6).

J. Plant Modification Database (PMD)

The database that contains the impact review information for each plant modification (detailed in Section III.D.4).

K. "Applicable" NRC Regulatory Guide 1.149 Revision 2 (Section 1.5) describes "applicable" as denoting items or issues that are incorporated in the "planned training and examination scenarios and exercises". "Applicable" is used in the PSE&G Simulation Facility Certification Program to denote items or issues that are part of the INPO Accredited Operator Training Programs.

L. Simulation Revision A collection of software, databases, and documentation that forms a coherent, accurate, and reliable "picture" of the reference plant simulation for usage by the customers. "Load" has been used commonly to describe a similar grouping, but "revision" is being used here as part of the process to improve the configuration management and to align with standard software engineering practices.

4

M. Customer An individual or group of individuals that uses the Simulation Facility. Since the purpose of the facility is to support the training and examination of operators, the primary customers are Operations and Operations Training. However, since the facility is a major resource for PSE&G, there may be secondary customers such as Emergency Planning. Operations Training shall function as the designated customer representative for day-to-day review and acceptance of work per Section III.D.3.

III. CERTIFICATION METHODOLOGY A. Scope of Simulation The response of the simulation shall be a realistic replication of the reference plant and shall not violate the physical laws of nature within the criteria specified in Section III.C.6. It shall operate in real-time and shall provide controls to alert the customers when it reaches any simulation limits such as gross core degradation. It shall be capable of supporting normal and abnormal/emergency evolutions as required by the INPO Accredited Operator Training Programs (initial and requalification of RO/SROs).

The Performance Test Database (Section III.C.3) contains the list of simulation functions necessary to support these evolutions.

1. Normal Evolutions Appendix A lists the normal evolutions that are applicable to the INPO Accredited Operator Training Programs. It encompasses the list contained in ANSI/ANS-3.5-1993 (Section 3.1.3) with the applicable operator conducted surveillance and unit performance tests individually identified.

2. Abnormal/Emergency Evolutions Simulation functions are provided for the abnormal/emergency evolutions that are applicable to the INPO Accredited Operator Training Programs. This group includes the applicable functions from ANSI/ANS-3.5-1993 (Section 3.1.4) plus additional ones that are used in the training programs.

ANSI/ANS-3.5-1993 (Section 3.1.4) lists the simulation functions that are required to be part of the Scope of Simulation. NRC Regulatory Guide 1.149 Revision 2 (Section 1.5) discusses this list with respect to testing requirements but adds the qualification of "applicable".

B. Simulation Interface The simulation interface shall provide sufficient capabilities to allow the customer to initiate, control, monitor, and critique the simulation usage. It shall allow access to all the simulation functions necessary to support the evolutions described in Section III.A.

5

C. Simulation Testing Testing is divided into three major types: Verification, Validation, and Performance. The first two types are integral to the modification process. Performance Testing is periodic testing that confirms the maintenance of fidelity. It is subdivided into two groups: Operability and Functional Testing. Operability Tests are conducted each certification year. Functional Tests are conducted over the certification cycle.

SIMULATION TESTING
+-- VERIFICATION
+-- VALIDATION
+-- PERFORMANCE
    +-- OPERABILITY
    |   +-- STEADY STATE
    |   +-- TRANSIENT
    +-- FUNCTIONAL
        +-- USAGE-BASED
        +-- CYCLE-BASED

The Simulation Facility is capable of evolutions currently not applicable to the training programs. Reasons for this "extra" capability include inherent attributes of the simulation models, additional features required for testing, and changes to the training programs. The below figure shows how the Scope of Simulation is a subset of the reference plant with the training programs operating within the Scope of Simulation. The SSG maintains the simulation fidelity within the Scope of Simulation. Verification and Validation Testing are used to check modifications within the Scope of Simulation. Operability Testing is used to check the overall response of the simulation within the Scope of Simulation by running selected operations.

Functional Testing monitors the health of the simulation within the area applicable to the training programs.

[Figure: nested regions showing the area Applicable to Training Programs within the Scope of Simulation, within the Reference Plant]

6

1. Verification Testing Verification Testing confirms that the particular modification meets the design specification.

The design specification includes the simulation response (explicit requirements) and any work standards such as electrical loading, structured programming, model executive balancing, database contents, and documentation (implicit requirements). This testing is usually done by the individual engineer, technician, or analyst and shall be documented as necessary on the Simulation Action Request (SAR) as described in Section III.D.3.c.

Example:

The plant has modified a tank level switch setpoint from 35% to 47%. The SSG software engineer reviews the SAR, confirms that the change is operable in the plant, finds the constant in the database, and modifies the database definition per the software standard (description, value, units, reference). The engineer tests that the associated variable changes state when the level reaches the new setpoint of 47%. This testing information is recorded on the SAR. This type of testing is "Verification Testing".

2. Validation Testing
Validation Testing confirms that the Simulation Facility accurately replicates the reference plant following modification. This testing is usually done by a Subject Matter Expert (SME) and shall be documented as necessary on the SAR as described in Section III.D.3.d.

Major modifications that have the potential to significantly affect overall fidelity, reliability, and repeatability shall require additional documentation. Major modifications include operating system upgrades, platform migrations, and major model upgrades. For example, a full neutronics upgrade would require documentation on core performance testing, startup testing, power operations, and some of the Operability Tests. The exact composition of this "additional documentation" shall be determined by the SME.

Example:

The plant has modified a tank level switch setpoint from 35% to 47%. The SSG software engineer has completed the work and it is ready for the SSG test operator. The operator reviews the SAR (including the information from the software engineer),

investigates the function of the level switch, and determines the testing scope. This switch is used to open an alternate drain valve on a tank (generates an alarm). It also has an associated malfunction. There is no power requirement on the switch. There is no input to the plant computer systems. The test operator verifies the normal operation of the switch when level reaches 47% including the alarm and the effects of the change in the flow path. The malfunction is also verified. This testing information is recorded on the SAR. This type of testing is "Validation Testing".

Validation Tests may include real-time tests, normal evolutions, simulation limit tests, stimulated software/hardware tests, repeatability tests, core performance tests, startup tests, and surveillance tests. Performance Tests (Section III.C.3) may also be conducted as part of the validation of modifications.

3. Performance Testing NRC Regulatory Guide 1.149 Revision 2 (Section 1.5) describes the span of Performance Testing required to satisfy 10 CFR 55.45 when using the methodology in ANSI/ANS-3.5-1993. The below testing approach is consistent with this direction. The span covers 1) the Operability Tests and 2) the applicable Functional Tests.

7

a. Performance Test Database The Performance Test Database (PTD) contains the simulation operations and functions that are required to be tested on a periodic basis in order to confirm the maintenance of fidelity. It is initialized with the required operations (Operability Tests, Section III.C.3.b) and from the functions used by the bank of maintained scenario guides which support the training and examination of operators per the INPO Accredited Operator Training Programs (Functional Tests, Section III.C.3.c). These scenario guides contain simulation functions such as malfunctions, component failures, and local operator actions and are used to determine which functions are "applicable" to the training programs. The database excludes individual entries for 1) control panel hardware such as switches and potentiometers, 2) control panel input/output overrides which impact the signal immediately before or after the hardware component, 3) simulation interface functions such as RUN, FREEZE, RESET, SNAPSHOT, and BACKTRACK which have no corresponding function in the reference plant, and 4) stimulated software/hardware such as plant computers. Elements of these groups are inherent parts of any performance test and are also checked as part of Verification and Validation Testing following modifications.

After the initialization of the PTD, functions may be added as new scenario guides are generated and approved. Other functions may be added to support simulation testing and to enhance flexibility. New functions are tested as part of the Verification and Validation Testing prior to being released for usage. A function can be removed from the PTD if all the scenario guides using the function are deleted from the bank or the function is not used for forty-eight months. Another option is to mark the function as "inactive" and take it out of the Certification Program. If a function is removed from the PTD or marked as "inactive" and is subsequently rescheduled for usage, it shall be tested prior to the actual usage (Functional Testing).

Each PTD entry includes fields for an abbreviation, a description, status codes (ACTIVE-INACTIVE/SATISFACTORY-RESTRICTED-UNSATISFACTORY), a test group, a test frequency, a last test date (successful test), simulation revision information (for test), a test cross-reference (usually a SAR), a last usage date, scenario cross-references, SAR cross-references, and comments. Each entry is assigned to a specific "test group" which best matches its possible usage during the two year requalification schedule. A test group will usually be associated with one of the major plant systems.

These groups are used to provide a structure to the testing over the certification cycle.

The "status codes" will be initialized as "ACTIVE/SATISFACTORY" unless there is an open SAR reported against it. If the SAR impact is minor, then the code will be "ACTIVE/RESTRICTED". If the impact is major, then the problem would be equivalent to a performance test failure and the code will be "ACTIVE/UNSATISFACTORY". The "last test date" will be initialized to the date on the NRC Form 474 with "test cross-reference" set to "NRC Form 474". The "last usage date" will be initialized blank. Appendix F shows some sample PTD entries.

The PTD entries with status codes "ACTIVE/SATISFACTORY" are available for usage without restriction. "ACTIVE/RESTRICTED" entries will be available for usage with restrictions (see "comment" field of entry). "ACTIVE/UNSATISFACTORY" entries are not available for usage. "INACTIVE" entries are not available for usage.

8

When a scenario guide is used in training or examination, the customer shall verify that the associated simulation functions are either "ACTIVE/SATISFACTORY" or "ACTIVE/RESTRICTED". If the usage is restricted, then the customer shall review the restrictions and compensate as required. If the scenario guide is part of the maintained bank, then it is likely that all the associated functions are ready for use. However, if it is a new scenario guide, then some of the functions may not be ready. In this case, they shall be tested prior to usage. If the new scenario guide is a "one-time" scenario and is not being added to the bank, then it is optional whether the associated simulation functions are tracked in the PTD as "ACTIVE" and thus subject to periodic testing. The other option is to track them as "INACTIVE" so that simulation customers are still aware that the capability exists.
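The status-code availability rules above reduce to a small amount of logic. A minimal sketch, assuming a hypothetical record type (the real PTD fields are shown in Appendix F; the class and method names here are invented):

```python
from dataclasses import dataclass

@dataclass
class PTDEntry:
    # Hypothetical subset of a PTD entry, for illustration only.
    abbreviation: str
    description: str
    activity: str = "ACTIVE"          # ACTIVE or INACTIVE
    condition: str = "SATISFACTORY"   # SATISFACTORY, RESTRICTED, or UNSATISFACTORY
    comments: str = ""

    def available_for_usage(self) -> bool:
        # INACTIVE and ACTIVE/UNSATISFACTORY entries are not available.
        return self.activity == "ACTIVE" and self.condition != "UNSATISFACTORY"

    def restricted(self) -> bool:
        # ACTIVE/RESTRICTED entries are usable only per the comment field.
        return self.activity == "ACTIVE" and self.condition == "RESTRICTED"
```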

b. Operability Testing Simulation operations are tested every certification year. These Operability Tests consist of both Steady State and Transient Tests. Dedicated test plans shall be used to perform the tests. A SAR shall be assigned to each test plan to allow for easy retrieval after completion. Each of the tests shall have a single entry in the PTD. Appendix B lists the current Operability Tests. It encompasses the list contained in ANSI/ANS-3.5-1993 (Appendix B) with the exceptions listed in Section IV.A.

i. Steady State Testing The selection of the power levels may vary from year to year based on the availability of reference plant data. Depending on the plant operating history, there may be different power plateaus and equipment configurations.

ii. Transient Testing The selection of transients may vary from year to year based on the availability of reference plant data and the characteristics of the INPO Accredited Operator Training Programs. If the reference plant experiences a significant transient and the required data is recorded, then that transient may be added to the list of Transient Tests. A transient may be removed from the list if the benchmark is no longer valid or if the transient has little value to the training programs. For example, the benchmark for the runback may have been when the plant had an analog feedwater control system. When the control system is replaced with a digital one, the benchmark may no longer be valid and it is possible that the transient is removed from the list.

The details of individual transients may also vary from year to year. For example, the "Runback" may change from a "Full Runback" to an "Intermediate Runback" to reflect changes in the training programs, to use newly available reference plant data, or to check the simulation behavior under different conditions.

c. Functional Testing "ACTIVE" simulation functions are tested every certification cycle at approximately 25% of the total per certification year with all "ACTIVE" functions being tested each cycle. These Functional Tests can be conducted through either Usage-Based or Cycle-Based Testing.
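The pacing requirement (roughly 25% of the active functions per certification year, with every active function tested each four-year cycle) amounts to partitioning the active list into four yearly groups. A sketch under that reading, with invented names:

```python
def plan_cycle_testing(active_functions: list[str]) -> list[list[str]]:
    # Partition the ACTIVE functions into at most four yearly groups,
    # approximately 25% of the total per certification year.
    if not active_functions:
        return []
    per_year = -(-len(active_functions) // 4)  # ceiling division
    return [active_functions[i:i + per_year]
            for i in range(0, len(active_functions), per_year)]
```

In practice Usage-Based Testing supplies much of this credit, with Cycle-Based Testing covering the remainder (Section III.C.3.c.ii).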

9

Each malfunction, component failure, local operator action, or other simulation function used in the scenario guides per Section III.C.3.a will normally have a separate entry in the PTD. If multiple functions cause the same effect and are implemented in the identical manner, then a single entry can be used and testing credit shared. A valid example of this treatment may be control rod mechanical binding since it is usually performed in a loop for all control rods. An invalid example may be main steam line breaks since the leaks may go into different containment nodes or the modeling may not be in a loop.

i. Usage-Based Testing During simulation usage, test credit may be taken for simulation functions when the results are sufficient to meet the Testing Requirements per Section III.C.5. A SAR would be submitted with the optional Test Credit Form completed by the SME. The SAR has a box that, when clicked on, brings up the form (see Appendix G for example). This form will normally be completed after scenario guide validation but it can also be completed after other simulation activities such as Emergency Planning scenario development, SAR resolution, and operator requests. Once the form is completed, the SAR will follow the normal process (Section III.D.3). For those recommendations that are accepted by the SSG, the PTD would be updated accordingly. No additional documentation shall be required.

Example:

The plant has modified a tank level switch setpoint from 35% to 47%. The SSG software engineer has completed the work (Verification Testing) and the SSG test operator has checked the modification (Validation Testing). As part of the Validation Testing, the test operator confirmed that the response of the malfunction on the level switch met the testing criteria (Section III.C.6.c). Test credit can be taken on the SAR for this function. This type of testing is "Usage-Based Testing".

Example:

A customer checks out a scenario guide prior to using it with a class of initial license candidates. The scenario includes a level transmitter failure, a pump trip, and a reactor vessel head leak. The customer confirms that the simulation response met the testing criteria for the transmitter failure and the leak (Section III.C.6.c). The pump trip was too close to the start of the leak and the customer could not verify the flow response. Test credit can be taken on the SAR for the transmitter failure and the leak. This type of testing is "Usage-Based Testing".

The Cause and Effect documents are critical to meeting the Testing Requirements.

Appendix H provides an example of a typical entry for a malfunction.

SMEs shall be trained on the acceptance criteria before being qualified to recommend test credit.

NRC Regulatory Guide 1.149 Revision 2 (Section 1.5) allows the integration of the Certification Program with the training programs if the testing acceptance criteria are met.

10

ii. Cycle-Based Testing Each active PTD simulation function shall be tested during the certification cycle and the goal is continuous testing of the Simulation Facility. Usage-Based Testing may not result in either 1) the testing of 25% of the total active functions per certification year or 2) the complete testing of all active functions during the cycle.

Cycle-Based Testing is the required Functional Testing for which Usage-Based Testing cannot provide the credit. Cycle-Based Testing uses the same documentation process as Usage-Based Testing.

Example:

The SSG test operator is reviewing the PTD to confirm that all the required testing is completed for the fourth year of the certification cycle. The operator finds that the reactor vessel head leak malfunction has not been used in the previous four years. Operations confirms that they still want this malfunction.

The operator runs a test on this malfunction and confirms that it meets the testing criteria (Section III.C.6.c). Test credit can be taken on a SAR for this function. This type of testing is "Cycle-Based Testing".

4. Testing Benchmarks The test results shall be compared to either 1) actual reference plant data, 2) best-estimate engineering data, 3) similar plant data, or 4) SME consensus (in order of preference).

When the Simulation Facility is using a best-estimate engineering code in its simulation (such as RELAP5) and no reference plant data is available, then the pedigree of the best-estimate code is sufficient when coupled with SME consensus.

5. Testing Requirements Each test must examine sufficient data to determine whether the simulation response meets the acceptance criteria. These criteria may be 1) explicitly contained within a test plan, 2) contained within a Cause and Effect document (detailed response) with a reference to generic acceptance criteria, 3) described in a SAR (detailed response) with a reference to generic acceptance criteria, 4) described in a reference plant procedure or other plant document, or 5) some other reasonable approach that couples data and acceptance criteria consistent with Section III.C.6.
6. Testing Acceptance Criteria
a. Background There are basically two types of acceptance criteria. The first type is more numerical and is used for tests where the parameters are changing slowly (steady state) and it is possible to gather precise data from both the benchmark and the simulation. The second type is more directional and is used for tests where the parameters are changing quickly (transient) and it may not be possible to gather precise data. These two types are designated as Steady State and Transient Criteria.

11

b. Steady State Criteria Appendix C lists three groups of parameters. Group 1 parameters must be within 1% of the benchmark. Group 2 parameters must be within 2% of the benchmark. Group 3 parameters must be within 10% of the benchmark. Guidance on the definitions of the percentages is provided in ANSI/ANS-3.5-1993 (Appendix C). Group 1 and 2 parameters shall be documented explicitly. Group 3 parameters may be documented implicitly by an SME review.

Steady State Criteria are used for:

1) Steady State Tests (Section III.C.3.b.i).

Groups 1 and 2 encompass the lists and requirements in ANSI/ANS-3.5-1993 (Section 4.3.1 and Appendix B). In the standard, the Group 3 parameters are not individually listed but are represented by any parameter not in Groups 1 or 2.
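The three tolerance bands reduce to a single comparison. The sketch below assumes the percentage is taken relative to the benchmark value; ANSI/ANS-3.5-1993 (Appendix C) gives the controlling definitions of the percentages, and the names here are illustrative only:

```python
# Hypothetical mapping keyed by the Appendix C parameter group number.
TOLERANCES = {1: 0.01, 2: 0.02, 3: 0.10}

def steady_state_acceptable(simulated: float, benchmark: float, group: int) -> bool:
    # Within 1%, 2%, or 10% of the benchmark, depending on parameter group.
    return abs(simulated - benchmark) <= TOLERANCES[group] * abs(benchmark)
```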

c. Transient Criteria The simulation must 1) allow the use of the applicable reference plant procedures, 2) drive parameters in the expected direction, 3) cause the expected alarms and automatic actions, and 4) not cause any unexpected alarms or automatic actions.

Transient Criteria are used for:

1) Verification Tests (Section III.C.1),
2) Validation Tests (Section III.C.2),
3) Transient Tests (Section III.C.3.b.ii),
4) Usage-Based Tests (Section III.C.3.c.i), and
5) Cycle-Based Tests (Section III.C.3.c.ii).

Verification and Validation Tests shall monitor parameters relevant to the modification.

Transient Tests shall monitor a group of common parameters (Appendix D). Usage-Based and Cycle-Based Tests shall monitor parameters relevant to the Cause and Effects. These groups and criteria encompass the lists and requirements in ANSI/ANS-3.5-1993 (Section 4.1.4 and Appendix B).

D. Configuration Management

1. Simulation Training Database The Simulation Training Database is the baseline for the Simulation Facility revision currently being used for the training and examination of operators. It consists of the data sources that define the current reference plant ("As Built") minus 1) differences between the reference plant and the Simulation Facility within the Scope of Simulation, 2) operable reference plant modifications that are not yet installed in the simulation, and 3) discrepancies reported against the current Simulation Facility revision. The documentation for these "exclusions" appears as details in the Simulation Difference Database and open SARs. Item 1 represents the "maintained" delta between the reference plant and the Simulation Facility. Items 2 and 3 represent the "temporary" delta due to work in progress.

The Simulation Training Database also includes any additional data sources such as similar plant, best-estimate engineering, and SME data.

12

The original certification initialized both the "maintained" and "temporary" deltas based on design data such as plant drawings (P&IDs, electrical), setpoint databases, instrumentation characteristics, heat balance, and control circuit details. The deltas have evolved through 1) changes in the Scope of Simulation, 2) review and incorporation of reference plant modifications and operating data, 3) collection and resolution of simulation discrepancies, and 4) review of other data sources such as industry events. This evolution is documented through the associated SAR packages.

2. Simulation Modification Database The Simulation Modification Database consists of the Simulation Training Database plus 1) planned changes to the Scope of Simulation, 2) operable reference plant modifications that are not yet installed in the simulation, 3) planned reference plant modifications that are not yet installed in the reference plant or the simulation, and 4) the resolution of any reported discrepancies. These "inclusions" appear as open SARs.
3. Simulation Action Request Database Each SAR is entered into the database as a separate entry with an automatically assigned number. The database fields follow the work control process from beginning to end. The major phases include Initiation, Customer Review, Resolution, Testing, Customer Acceptance, Delivery, and Closure (see Appendix I for the actual SAR form).
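The phase sequence of the work control process can be modeled as a simple ordered progression. The code below is an illustration of that ordering, not part of the actual SAR database; the names are invented:

```python
# The SAR work control phases, in order (Section III.D.3).
SAR_PHASES = ["Initiation", "Customer Review", "Resolution", "Testing",
              "Customer Acceptance", "Delivery", "Closure"]

def advance(phase: str) -> str:
    # Move a SAR to the next phase; Closure is terminal.
    if phase == "Closure":
        raise ValueError("SAR is already closed")
    return SAR_PHASES[SAR_PHASES.index(phase) + 1]
```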
a. Initiation The first group of fields contains information from the originator (name, date, extension, background information, request). The background information includes applicable items such as the plant, the starting Initial Condition, the Snapshot if the problem was snapped, and the simulation revision.

An optional form can be filled out if the originator is recommending test credit for specific simulation functions (Section III.C.3.c).

b. Customer Review After the SAR is initiated, it is normally reviewed by a customer representative. The SSG Supervisor or designee may also perform this function if appropriate. This review allows the simulation customers to be quickly aware of any simulation issues. It also allows the customer to provide required information such as the work priority (see Appendix J) and the system designator. The customer considers the potential impact of the SAR on the INPO Accredited Training Programs, the upcoming training schedules, the priority definitions, and other issues when assigning priorities.

As discussed in Section III.C, the maintained Scope of Simulation includes areas not currently applicable to the training programs. SARs in these areas may be assigned a lower priority since they have a lower potential for impacting the customer, but they still must be resolved.

The review is not a validation of the request. Submitted SARs will normally not be validated until 1) requested by the customer or 2) selected for resolution by SSG personnel. Validation activities are a major portion of resolution, and it is less efficient to separate the validation from the correction.

13

c. Resolution Once it is reviewed, the SAR is available for resolution based on the assigned priority.

The responsible person investigates, resolves, and tests any modifications. The testing at this stage is Verification Testing (Section III.C.1).

The documentation on the SAR shall include sufficient information so that another knowledgeable person can understand the resolution.

If the SAR is recommending test credit for specific simulation functions, then this step represents the review and possible concurrence with the recommendation. It is acceptable for the same person to submit the recommendation and concur with it since the SAR will still pass through "Customer Acceptance".

d. Testing Using both the initiator's request and the resolution, an SME checks the necessary testing envelope. The testing at this stage is Validation Testing (Section III.C.2).

The documentation on the SAR shall include sufficient information on the testing process to allow the customer to decide on the adequacy of the overall resolution of the SAR.

Depending on the SAR, it is possible that no testing is required. Examples include closing duplicate SARs and resolving recommendations for test credit.

e. Customer Acceptance The customer normally reviews the SAR and decides whether it has been satisfactorily completed. The SSG Supervisor or designee may also perform this function if appropriate. If the SAR is not acceptable, then the comments must be addressed before the SAR moves to the next step.
f. Delivery Based on coordination with the customer, a new revision is designed and delivered (Section III.D.7). A new revision could span the range from one SAR to many SARs.

This new revision would include the work associated with the accepted SARs. This step represents an update to the Simulation Training Database (Section III.D.1).

Simple hardware SARs such as recorder repair are closed after customer acceptance since the "delivery" is immediate and the component is simply being restored to normal operation. Other SARs that are not dependent on revision deliveries such as invalid and duplicate SARs are also closed after customer acceptance.

g. Closure The SAR is closed when all the required actions are satisfactorily completed and the necessary documentation updated. If the SAR involved test credit, then the PTD would be updated.

14

Inoperable control panel equipment shall be tagged and recorded in the "out-of-service" logbooks as well as submitted as hardware SARs. When these SARs are closed, the tagging shall be removed and the logbooks shall be updated.
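The SAR work control phases described above (Initiation through Closure) can be sketched as a simple state progression. This is an illustrative model only; the class name, the auto-numbering scheme, and the strictly linear transitions are assumptions, and the sketch omits the early-closure path used for hardware, invalid, and duplicate SARs.

```python
# Hypothetical sketch of the SAR lifecycle described in Section III.D.3.
# Phase names come from the text; everything else is illustrative.
PHASES = ["Initiation", "Customer Review", "Resolution", "Testing",
          "Customer Acceptance", "Delivery", "Closure"]

class SAR:
    _next_number = 1  # the database assigns numbers automatically

    def __init__(self, originator: str, request: str):
        self.number = SAR._next_number
        SAR._next_number += 1
        self.originator = originator
        self.request = request
        self.phase = PHASES[0]

    def advance(self) -> str:
        """Move the SAR to the next phase of the work control process."""
        i = PHASES.index(self.phase)
        if i == len(PHASES) - 1:
            raise ValueError(f"SAR {self.number} is already closed")
        self.phase = PHASES[i + 1]
        return self.phase
```

In this sketch a newly initiated SAR starts in "Initiation" and walks forward one phase at a time until "Closure".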

4. Plant Modification Database Each reference plant modification is reviewed for simulation impact. The Plant Modification Database (PMD) contains the information for each of the reviews. For those modifications without simulation impact, the database entry is closed after the review. If there is simulation impact, then a SAR is generated and the modification status is monitored to determine the date that it becomes operable in the plant (status code 4BA, physical changes complete with data sent back to Engineering for document update). The work shall be completed and in the Simulation Training Database within twenty-four months of the operable date. Appendix L shows some sample PMD entries.
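The twenty-four month requirement above amounts to a date computation from the 4BA operable date. The following sketch shows one way to check it; the function names and the month-arithmetic approach are illustrative assumptions, not part of the PMD itself.

```python
from datetime import date

# Hypothetical sketch: a modification reaches status 4BA ("operable") on some
# date, and the corresponding simulation work must be in the Simulation
# Training Database within twenty-four months of that date.

_DAYS_IN_MONTH = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def installation_deadline(operable_date: date) -> date:
    """Return the date 24 months after the plant-operable (4BA) date."""
    month = operable_date.month - 1 + 24
    year = operable_date.year + month // 12
    month = month % 12 + 1
    # Clamp the day for short months (e.g., a Feb 29 operable date in a
    # leap year maps to Feb 28 two years later).
    days = _DAYS_IN_MONTH[month - 1]
    if month == 2 and year % 4 == 0 and (year % 100 != 0 or year % 400 == 0):
        days = 29
    return date(year, month, min(operable_date.day, days))

def is_overdue(operable_date: date, today: date) -> bool:
    """True if the simulation update has missed the 24-month requirement."""
    return today > installation_deadline(operable_date)
```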

Most plant modifications will be installed in the simulation after they become operable in the plant. It is sometimes beneficial for the simulation to precede the plant. Since the SAR Database contains the pending modifications, it is easy for the customers to identify which ones they would like installed in the simulation before plant operability.

Plant modification packages that are "paper change only" shall not be reviewed as part of this process. Since the plant and the simulation were modified to match the original paperwork, then the fact that the plant does not need further physical changes to match the newer paperwork means that the simulation does not need further changes.

5. Work Control All simulation work shall be controlled to ensure fidelity with the reference plant within the required Scope of Simulation (Section III.A), Simulation Training Capabilities (Section III.B), and Simulation Testing (Section III.C). The threshold on SAR submittal, especially on possible discrepancies, is "zero" and all possible barriers should be removed to encourage maximum feedback. Even if the simulation is correct or within the requirements, the SAR resolution provides an opportunity to provide valuable feedback to the customer or to increase fidelity past the requirements. Enhancement suggestions are strongly encouraged since an evolving simulation is one of the best ways to stimulate training.

Each SAR priority has a Closure Goal and a Closure Window (see Appendix J). The Closure Goal is the desired period for SAR resolution. The Closure Window is the period after which an open SAR should be reexamined for impact. The possible results of such a review include closure without further action, a request for additional resources, or a priority change. If a SAR priority has a Requirement, then it has a strict limit based on NRC Certification (reference plant modifications).
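The Closure Window review above can be sketched as a simple date comparison. The priority-to-window mapping below is a placeholder; the actual periods are defined in Appendix J, which is not reproduced here.

```python
from datetime import date, timedelta

# Illustrative Closure Window check. The window lengths are hypothetical
# placeholders, not the values from Appendix J.
CLOSURE_WINDOWS = {1: 30, 2: 90, 3: 180}   # days by priority, assumed

def needs_reexamination(priority: int, opened: date, today: date) -> bool:
    """An open SAR past its Closure Window should be reexamined for impact
    (closure, additional resources, or a priority change)."""
    return today - opened > timedelta(days=CLOSURE_WINDOWS[priority])
```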

15

6. Simulation Difference Database Differences between the reference plant and the Simulation Facility within the Scope of Simulation shall be evaluated for training impact. Those differences that have the potential for impacting operator actions shall be documented and reviewed as part of the continual assessment of the Operator Training Programs. Examples could include background noise, non-simulated control panel equipment, and partially-simulated systems. Minor differences that do not have the potential for impacting operator actions shall not be individually documented. Examples could include small physical differences in engraving letter sizes or switch locations.

The above differences are maintained in the Simulation Difference Database (SDD).

A system-by-system description of which components are fully, partially, or not simulated is not maintained. Appendix K shows some sample SDD entries.

7. Simulation Revision Control Simulation revisions shall be controlled so that each revision is coherent, accurate, and reliable and that the simulation customers are aware of any changes from the previous revision as well as any possible problems with the new one. Revisions shall be numbered as "Z-X.Y" where Z is either S (Salem) or H (Hope Creek), X is the major revision, and Y is the minor revision. For example, H-4.3 is the third Hope Creek minor revision based on the major revision 4.0.
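The "Z-X.Y" numbering convention above is regular enough to validate mechanically. A minimal sketch, assuming revision labels are handled as plain strings:

```python
import re

# Sketch of the "Z-X.Y" revision label convention: Z is S (Salem) or
# H (Hope Creek), X is the major revision, Y is the minor revision.
REVISION_RE = re.compile(r"^(?P<plant>[SH])-(?P<major>\d+)\.(?P<minor>\d+)$")

def parse_revision(label: str):
    """Return (plant, major, minor) for a label such as 'H-4.3'."""
    m = REVISION_RE.match(label)
    if m is None:
        raise ValueError(f"not a valid revision label: {label!r}")
    plant = {"S": "Salem", "H": "Hope Creek"}[m.group("plant")]
    return plant, int(m.group("major")), int(m.group("minor"))
```

For example, parse_revision("H-4.3") yields the Hope Creek major revision 4, minor revision 3.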

Hardware SARs that fix inoperable equipment are not part of the simulation revision process since they are immediately part of the simulation and are not delayed until the "delivery" of a new revision.

8. Customer Oversight Customer oversight is available through 1) day-to-day contact, 2) access to the SAR Database, 3) informational releases, and 4) periodic meetings to discuss the Operator Training Programs. Simulation Facility issues are an integral part of the continual assessment of the Operator Training Programs.
9. Records Certification records consist of the Performance Test Database (PTD), the Simulation Difference Database (SDD), the Plant Modification Database (PMD), the Simulation Action Request Database (SARD), SAR attachments, the Cause and Effect documents, the revision summaries, and selected simulation software files. They shall be maintained for at least two complete certification cycles. Whenever possible, the information shall be stored in electronic form.

16

SIMULATION RECORDS

ELECTRONIC: PTD, SDD, PMD, SARD, SAR Attachments, Cause and Effects, Revision Summaries, Selected Simulation Files

HARD COPY: SAR Attachments

At the release of each simulation revision (Section III.D.7), a separate electronic directory shall be created to store the revision information.

Simulation Revision Directory
    Hope Creek Subdirectory: H-4.0, H-4.1
    Salem Subdirectory: S-3.2, S-4.0

17

10. Reporting Per 10 CFR 55.45, a report shall be submitted to the NRC every four years on the anniversary of the certification (date on the latest NRC Form 474). This report shall include a new NRC Form 474. The report shall include 1) the Exceptions (Section IV), 2) the Completed Performance Testing (Section V), and 3) the Planned Performance Testing (Section VI, Appendix E).

If the Certification Program is changed significantly, then a new NRC Form 474 shall be submitted at the time of the change. It shall include the material relevant to the change.

Examples of "not significant" and "significant" changes are described below.

Not Significant A change to the testing schedule due to minor adjustments of the two year operator requalification schedule (Appendix E).

A change to the Exceptions dealing with test parameter selection (Appendices C and D) or the Operability Test selection (Appendix B).

An expansion in the Scope of Simulation (Section III.A, Appendix A).

Significant Replacement of the testing schedule due to a major change in the two year operator requalification schedule (Appendix E).

Inability to meet the approximately 25% per year or the 100% per cycle testing requirements (Appendix E).

A change to the Exceptions dealing with a reduction in the Scope of Simulation (Section III.A, Appendix A).

Additions or deletions to the Simulation Facility such as part-task simulations (Section I).

11. Audits As described in Sections III.D.1 and III.D.2, the Simulation Training and Modification Databases consist of the entire set of reference plant data (as maintained in the plant) and simulation subtractions or additions. Only the subtractions and additions are maintained by the SSG. In addition to auditing these SSG sources, sampling of reference plant modification reviews, sampling of relevant reference plant data for incorporation, reviews of closed SARs, reviews of SSG databases, interviews with simulation customers, monitoring of simulation usage, or reviews of the internal SSG processes can be used to assess the status of configuration management.

Since the SAR Database is the collection of all work requests and the entries are not normally validated as they enter the system (Section III.D.3), care must be taken when using the database to determine the "health" of the Simulation Facility.

E. Salem Simulation Salem Unit 2 is the reference plant for the Salem Simulation. Unit 1 and Unit 2 are basically identical units. The Full-Scope Salem Simulator control panels are based on Unit 2. The differences between Unit 1, Unit 2, and the Salem Simulation (as described below) are not so significant that they impact the ability of the Simulation Facility to meet the requirements of ANSI/ANS-3.5-1993 as endorsed by NRC Regulatory Guide 1.149 Revision 2. Differences that are transitory, such as "in progress" plant modifications, are not described.

18

1. Facility Design Unit 1 has a Westinghouse main generator. Unit 2 and the Salem Simulation have a General Electric main generator. The main generator differences do not have a significant impact on training or evaluation.

Unit 1 is scheduled to have its steam generators replaced with a different model than Unit 2.

The steam generator differences are being evaluated.

2. Technical Specifications There are no significant differences with respect to Technical Specifications.
3. Procedures There are no significant differences with respect to Procedures.
4. Control Room Design Unit 1 control board 1RP6 contains the controls for the gas turbine (Unit 3). Unit 2 and the Salem Simulation do not contain this panel. There is no significant impact on training or examination.

Unit 1 has controls for its Westinghouse main generator while Unit 2 and the Salem Simulation have controls for a General Electric main generator. There is no significant impact on training or examination.

Units 1 and 2 have a Generator Monitoring System on control panels 1RP7 and 2RP7. This system does not function on the Salem Simulation. There is no significant impact on training or examination.

The Salem Simulation does not have a functional Reactivity Computer on control panel 2RP4. There is no significant impact on training or examination.

The Salem Simulation does not contain all the Radiation Monitoring equipment on control panels 1RP1 (Unit 1) and 2RP1 (Unit 2). It most closely resembles the Unit 2 configuration.

The Unit 1 and Unit 2 Radiation Monitoring Systems perform identical functions. The difference lies in the processing and display of information. Unit 2 is a more advanced computerized system where the operators obtain information via a computer terminal and from Class 1E instrumentation in the Control Room. Unit 1 information is displayed on meters in the Control Room and strip chart recorders at the protection racks. There is no significant impact on training or examination.

Unit 1 has manual change-over for recirculation mode. Unit 2 and the Salem Simulation have semi-automatic change-over. The corresponding lights on 1RP4 and 2RP4 are different. There is no significant impact on training or examination.

19

5. Operational Characteristics The replacement of the Unit 1 steam generators may result in a change in Operational Characteristics that needs to be modeled in the Salem Simulation. The steam generator differences are being evaluated. It is possible that separate Salem Unit 1 and Unit 2 simulations may be maintained, with the majority of the software common and unit-specific modules where needed.

F. Hope Creek Simulation Hope Creek Unit 1 is the reference plant for the Hope Creek Simulation. Unit 2 was never built.

The differences between Unit 1 and the Hope Creek Simulation (as described below) are not so significant that they impact the ability of the Simulation Facility to meet the requirements of ANSI/ANS-3.5-1993 as endorsed by NRC Regulatory Guide 1.149 Revision 2. Differences that are transitory, such as "in progress" plant modifications, are not described.

1. Facility Design There are no significant differences with respect to the Facility Design.
2. Technical Specifications There are no significant differences with respect to Technical Specifications.
3. Procedures There are no significant differences with respect to Procedures.
4. Control Room Design Since the Hope Creek Simulation does not model the 1) Fire Protection System, 2) the Shift Supervisor's Console/Desk, 3) the Radiation Monitoring Cabinets, and 4) the Hydrogen Recombiner "B", the associated equipment is not in the Full-Scope Simulator. There is no significant impact on training or examination.
5. Operational Characteristics There are no significant differences with respect to Operational Characteristics.

IV. EXCEPTIONS The following exceptions are taken to ANSI/ANS-3.5-1993 as endorsed by NRC Regulatory Guide 1.149 Revision 2.

A. Common ANSI/ANS-3.5-1993 (Section 4.1.3.1) specifies that parameters not individually listed shall be within 10% of the benchmark for Steady State Tests. Since there are thousands of such parameters, a list of the most important ones is being used to focus the SSG resources. The selection of these parameters considers the data available on the reference plant heat balances plus the list of parameters in Appendix B of ANSI/ANS-3.5-1993.
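The 10% comparison above reduces to a simple fractional check. The sketch below applies the tolerance to the benchmark value for simplicity; whether the standard's tolerance is based on the parameter value or on instrument span should be confirmed against ANSI/ANS-3.5-1993 before relying on this.

```python
# Hedged sketch of a steady-state tolerance comparison. The tolerance is
# applied as a fraction of the benchmark value (an assumption; the standard's
# actual basis should be verified).

def within_tolerance(simulated: float, benchmark: float,
                     tolerance: float = 0.10) -> bool:
    """True if the simulated parameter is within the fractional tolerance
    of the benchmark (default 10%, per Section 4.1.3.1)."""
    if benchmark == 0.0:
        return simulated == 0.0
    return abs(simulated - benchmark) <= tolerance * abs(benchmark)
```

The same check with tolerance=0.01 would cover parameters assigned the 1% steady-state tolerance.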

20

ANSI/ANS-3.5-1993 (Section 4.1.3, Appendix B) specifies parameters for Steady State and Transient Tests. Some of the required parameters are "Total" flows. In some situations, "Individual" flows shall be substituted because they are indicated on the control panels or the plant computer systems. Individual flows also show more information since many transients are not symmetric on the various loops.

ANSI/ANS-3.5-1993 (Section 4.4.2) states that Operability Tests shall be conducted each calendar year. The Simulation Facility Operability Testing shall be based on the certification year. Although each "year" is twelve months, it is preferable that all time periods be based on one date, the date on the NRC Form 474.

ANSI/ANS-3.5-1993 (Appendix B) lists the "Stability Test" as a required yearly test. This test has yielded minimal value to the Simulation Facility over the past years and shall not be conducted on a set periodicity. It will be used as necessary to support Validation Testing.

B. Salem ANSI/ANS-3.5-1993 (Section 4.1.3.1.1) lists "T-hot" and "T-cold" as required parameters for the 1% steady state tolerance (PWRs). These temperatures are wide range instrumentation and do not have the accuracy of narrow range instrumentation. The "Individual Loop RCS Delta-Temperatures" have been substituted since they perform a better check of the steady state conditions. "T-hot" and "T-cold" are assigned a 10% tolerance.

ANSI/ANS-3.5-1993 (Appendix B) lists "Trip of any single reactor coolant pump from approximately 100% power" as a required yearly test for PWRs. Since this transient results in an immediate reactor trip, it has yielded minimal value to the Simulation Facility over the past years. It shall be run instead from "the maximum power level which does not result in immediate reactor trip". The reactor coolant pump trip without a reactor trip provides more valuable data since the loop asymmetries are more important at power.

ANSI/ANS-3.5-1993 (Appendix B) lists "Maximum rate power ramp from 100% down to approximately 75% and back to 100%" as a required yearly test for PWRs. This test has yielded minimal value to the Simulation Facility over the past years and shall not be conducted on a set periodicity. It will be used as necessary to support Validation Testing.

ANSI/ANS-3.5-1993 (Appendix B) lists both "Pressurizer Pressure" and "Narrow Range Pressurizer Pressure" as required parameters for PWR Transient Tests. There is only one pressurizer pressure range, "Pressurizer Pressure", so the narrow range one shall not be used.

C. Hope Creek ANSI/ANS-3.5-1993 (Section 4.1.3.1.3, Appendix B) lists both "Reactor Pressure" and "Reactor Wide Range Pressure" as required test parameters for Steady State Tests. Based on the related set of parameters for Transient Tests, it is assumed that the first parameter refers to narrow range pressure. Only "Reactor Pressure (Wide Range)" shall be used since narrow range pressure does not provide additional information. It is only used on a recorder.

ANSI/ANS-3.5-1993 (Appendix B) lists both "Narrow Range Reactor Pressure" and "Wide Range Reactor Pressure" as required test parameters for Transient Tests. Only "Reactor Pressure (Wide Range)" shall be used since narrow range pressure does not provide additional information. It is only used on a recorder.

21

ANSI/ANS-3.5-1993 (Section 4.1.3.1.4, Appendix B) lists "Turbine Steam Flow" as a required test parameter for BWRs (Steady State and Transient Tests). This parameter is not indicated on either the control panels or the plant computer systems. "First Stage Pressure" shall be substituted since it is a good indicator of steam flow.

ANSI/ANS-3.5-1993 (Appendix B) lists "Maximum rate power ramp (master recirculation flow controller in 'manual') down to approximately 75% and back to 100%" as a required yearly test for BWRs. This test has yielded minimal value to the Simulation Facility over the past years and shall not be conducted on a set periodicity. It will be used as necessary to support Validation Testing.

V. COMPLETED PERFORMANCE TESTING

A. Salem Simulation

1. Closure An exception was taken in the last submittal against the requirement for "Loss of Instrument Air" simulation functions in ANSI/ANS-3.5-1985 (Section 3.1.2). These functions have been added to the Simulation Facility and are used in the training programs.

An exception was taken in the last submittal against the requirement for "125V DC Bus Failures" simulation functions in ANSI/ANS-3.5-1985 (Section 3.1.2). These functions have been added to the Simulation Facility and are undergoing final testing. Once the testing is completed, these functions will be used in the training programs.

The last submittal provided a schedule for fixing "Uncorrected Test Failures". Except for the three "125V DC Bus Failures", the problems were resolved within the schedule. The 125V DC Bus simulation was replaced from scratch and involved a greater than expected resource allocation. It is undergoing final testing which will be completed in December 1996.

2. Status The previous certification date was December 1993. The Salem Simulation is in Year 3 of this certification cycle. This new submittal will interrupt this cycle and start a new one.

Year 3 of the previous certification and Year 1 of the new certification overlap.

The planned Performance Testing (Operability and Functional) for Years 1 and 2 was completed with minor deviations from the original schedule due to upgrades, simulation availability, and other issues. The modified testing scope for Year 3 includes the Operability Tests and limited Functional Tests. It will be completed in December 1996. The full, original schedule of Functional Tests will not be completed due to upgrade activities and the overlapping certification cycles. Except for the 125V DC Bus Failures (Section V.A.1), there are no uncorrected Performance Tests.

22

B. Hope Creek Simulation

1. Closure There were no open issues in the previous submittal.
2. Status The previous certification date was December 1992. The Hope Creek Simulation is in Year 4 of this certification cycle. This new submittal will interrupt this cycle and start a new one.

Year 4 of the previous certification and Year 1 of the new certification overlap.

The planned Performance Testing (Operability and Functional) for Years 1, 2, and 3 was completed with minor deviations from the original schedule due to upgrades, simulation availability, and other issues. The modified testing scope for Year 4 includes the Operability Tests and limited Functional Tests. It will be completed in December 1996. The full, original schedule of Functional Tests will not be completed due to upgrade activities and the overlapping certification cycles. There are no uncorrected Performance Tests.

VI. PLANNED PERFORMANCE TESTING A. Schedules The Simulation Facility will have one certification cycle for both the Salem and Hope Creek Simulations. The table below shows the cycle if the NRC Form 474 date is November 1996:

Year 1: November 1996 to November 1997
Year 2: November 1997 to November 1998
Year 3: November 1998 to November 1999
Year 4: November 1999 to November 2000

The Operability Tests listed in Section III.C.3.b shall be conducted each certification year. The Functional Tests described in Section III.C.3.c shall be conducted so that the combination of Usage-Based and Cycle-Based Testing yields approximately 25% of the active PTD entries for each simulation being tested per certification year and every active PTD entry being tested within the certification cycle.

The schedules for the Functional Tests are based on the two year operator requalification schedules. The testing schedules assign each different test group to the certification years in which the topic is scheduled in requalification training. For example, if the Reactor Coolant System was scheduled for Cycle 1 in the first year of the Salem requalification schedule and the cycle start date was January 13, 1997, then the test group "RC" would appear in Year 1 of the certification cycle (assuming a November 1996 NRC Form 474 date). If the next requalification schedule is assumed to be the same, then "RC" would also appear in Year 3. Depending on the simulation functions within the scenario guides, some of the functions will be available for test credit in Year 1 and some in Year 3 (Usage-Based Testing). Cycle-Based Testing will be used to compensate for those functions which 1) are not used in the chosen scenario guides, 2) are not adequately tested in the guides, or 3) are needed to meet the 25% per certification year requirement.

It is possible for Usage-Based Testing to complete over 25% of the required tests in one certification year. In this case, Cycle-Based Testing from the same test groups may be postponed into other years to balance the testing.
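The 25%-per-year and 100%-per-cycle bookkeeping described above can be sketched with set operations over PTD entry identifiers. The data layout (sets of entry IDs per certification year) is an assumption for illustration, not a description of the actual PTD.

```python
# Illustrative coverage bookkeeping: the combination of Usage-Based and
# Cycle-Based Testing should test roughly 25% of the active PTD entries each
# certification year, and every active entry within the four-year cycle.

def yearly_coverage(tested: set, active: set) -> float:
    """Fraction of active PTD entries tested this certification year."""
    return len(tested & active) / len(active) if active else 0.0

def untested_in_cycle(tested_by_year: list, active: set) -> set:
    """Active PTD entries not yet tested in any year of the cycle."""
    tested = set().union(*tested_by_year) if tested_by_year else set()
    return active - tested
```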

23

It is possible for a simulation function to be tested more than once in a certification cycle due to the characteristics of the two year requalification schedules and the modification process. The date in the PTD shall reflect the most recent successful test.

Appendix E contains the Performance Test Schedules based on estimates of the requalification schedules for Salem and Hope Creek. Since the requalification schedules are subject to change and the mix of Usage-Based and Cycle-Based Testing within each test group is uncertain until the actual requalification cycles, these testing schedules are approximate. A current status plus a short term projection will be available from the SSG upon request.

B. Comments Due to the major upgrade activities scheduled for completion in 1997, it is expected that Year 1 will show Functional Testing in excess of the 25% per certification year objective. Some of these upgrades were originally planned for completion in 1996, and the associated testing would have completed the original schedule of Functional Tests (Sections V.A.2 and V.B.2).

VII. REFERENCES

10 CFR 55.45
ANSI/ANS-3.5-1993
NRC Regulatory Guide 1.149 Revision 2

24

APPENDIX A: NORMAL EVOLUTIONS

Salem
S2.OP-IO.ZZ-0002 S2.OP-IO.ZZ-0003 S2.OP-IO.ZZ-0004 S2.OP-IO.ZZ-0005 S2.OP-IO.ZZ-0006 S2.OP-IO.ZZ-0008 S2.OP-ST-SSP-0001 S2.OP-ST-TRB-0001 S2.OP-ST-500-0001 S2.OP-ST-CVC-0008 S2.OP-ST-CVC-0009 S2.OP-ST-CVC-0010 S2.OP-ST-INST-0001 S2.OP-ST-NIS-0001 S2.OP-ST-NIS-0002 S2.OP-ST-RC-0004 S2.OP-ST-RC-0005 S2.OP-ST-RC-0007 S2.OP-ST-RC-0008

Hope Creek
HC.OP-IO.ZZ-0002 HC.OP-IO.ZZ-0003 HC.OP-IO.ZZ-0004 HC.OP-IO.ZZ-0006 HC.OP-IO.ZZ-0007 HC.OP-IO.ZZ-0008 HC.OP-ST.AC-0002 HC.OP-ST.BB-0001 HC.OP-ST.BF-0001 HC.OP-ST.BJ-0002 HC.OP-ST.GS-0003 HC.OP-ST.GS-0004 HC.OP-ST.GU-0001 HC.OP-ST.KJ-0001 HC.OP-ST.KJ-0002 HC.OP-ST.KJ-0003

COLD SHUTDOWN TO HOT STANDBY HOT STANDBY TO MINIMUM LOAD POWER OPERATION MINIMUM LOAD TO HOT STANDBY HOT STANDBY TO COLD SHUTDOWN MAINTAINING HOT STANDBY MANUAL SAFETY INJECTION - SSPS MAIN TURBINE VALVE TESTING ELECTRICAL POWER SYSTEMS AC SOURCES ALIGNMENT REACTIVITY CONTROL SYSTEMS - BORATION REACTIVITY CONTROL SYSTEMS - BORATION BORATED WATER SOURCES INSTRUMENT ACCIDENT MONITORING POWER DISTRIBUTION - AXIAL FLUX DIFFERENCE POWER DISTRIBUTION - QUADRANT POWER TILT RATIO REACTOR COOLANT PUMP STATUS REACTOR COOLANT SYSTEM RCP/RHR LOOP STATUS MODES 3-4 SEAL INJECTION FLOW REACTOR COOLANT SYSTEM WATER INVENTORY BALANCE PREPARATION FOR PLANT STARTUP STARTUP FROM COLD SHUTDOWN TO RATED POWER SHUTDOWN FROM RATED POWER TO COLD SHUTDOWN POWER CHANGES DURING OPERATION OPERATIONS FROM HOT STANDBY SHUTDOWN FROM OUTSIDE CONTROL ROOM TURBINE VALVE TESTING - MONTHLY RECIRCULATION JET PUMP OPERABILITY - DAILY CONTROL ROD DRIVE EXERCISE WEEKLY HPCI SYSTEM FUNCTIONAL TEST (LOW PRESSURE) - 18 MONTHS REACTOR BLDG/SUPP CHAMBER VAC BREAKER OP TEST -

MONTHLY SUPP CHAMBER/DRYWELL VACUUM BREAKER OPERABILITY TEST FRVS OPERABILITY TEST MONTHLY EMERGENCY DIESEL GENERATOR AG400 OP TEST MONTHLY EMERGENCY DIESEL GENERATOR BG400 OP TEST MONTHLY EMERGENCY DIESEL GENERATOR CG400 OP TEST MONTHLY 25

HC.OP-ST.KJ-0004 HC.OP-ST.SF-0002 HC.OP-ST.SM-0001 HC.OP-ST.ZZ-0001 HC.OP-IS.AB-0101 HC.OP-IS.BC-0001 HC.OP-IS.BC-0002 HC.OP-IS.BC-0003 HC.OP-IS.BC-0004 HC.OP-IS.BD-0001 HC.OP-IS.BE-0001 HC.OP-IS.BE-0002 HC.OP-IS.BJ-0001 HC.OP-FT.AC-0001 HC.OP-FT.AC-0002 HC.OP-FT.AC-0005 EMERGENCY DIESEL GENERATOR DG400 OP TEST MONTHLY RSCS OPERABILITY PCIS/NSSSS ISOLATION FUNCTIONAL TEST 18 MONTHS POWER DISTRIBUTION LINEUP - WEEKLY MAIN STEAM SYSTEM VALVES - INSERVICE TEST AP202, A RESIDUAL HEAT REMOVAL PUMP - INSERVICE TEST CP202, C RESIDUAL HEAT REMOVAL PUMP - INSERVICE TEST BP202, B RESIDUAL HEAT REMOVAL PUMP - INSERVICE TEST DP202, D RESIDUAL HEAT REMOVAL PUMP - INSERVICE TEST REACTOR CORE ISOLATION COOLING PUMP OP203 -

INSERVICE TEST A&C CORE SPRAY PUMPS - AP206&CP206 - INSERVICE TEST B&D CORE SPRAY PUMPS - BP206&DP206 - INSERVICE TEST HPCI MAIN & BOOSTER PUMP SET - OP204 & OP217 -

INSERVICE TEST MAIN TURBINE FUNCTIONAL TEST - WEEKLY MAIN TURBINE FUNCTIONAL TEST - MONTHLY TURBINE OVERSPEED PROTECTION SYSTEM OPERABILITY TEST - WEEKLY 26

APPENDIX B: OPERABILITY TESTS

Salem

Steady State Tests
Three distinct power levels for which heat balance data is available

Transient Tests
The following set of tests shall be run from 100% power, steady-state poisons and decay heat unless otherwise noted in the test description. When the benchmark contains known "follow up" operator actions, the test shall use the same actions.

Manual reactor trip
Simultaneous trip of all feedwater pumps
Simultaneous closure of all main steam isolation valves
Simultaneous trip of all reactor coolant pumps
Trip of any single reactor coolant pump from maximum power which does not result in immediate reactor trip
Main turbine trip from maximum power which does not result in immediate reactor trip
Runback
Maximum size reactor coolant system rupture
Maximum size reactor coolant system rupture combined with loss of all off-site power
Maximum size unisolable main steam line rupture

Slow primary system depressurization to saturated condition
Slow primary system depressurization to saturated condition using pressurizer relief or safety valve stuck open (inhibit activation of high pressure Emergency Core Cooling System)

Load Rejection
Steam Generator Tube Rupture
ATWT
ATWT with simultaneous loss of all feedwater

Hope Creek

Steady State Tests
Three distinct power levels for which heat balance data is available

Transient Tests
The following set of tests shall be run from 100% power, steady-state poisons and decay heat unless otherwise noted in the test description. When the benchmark contains known "follow up" operator actions, the test shall use the same actions.

Manual reactor scram
Simultaneous trip of all feedwater pumps
Simultaneous closure of all main steam isolation valves
Simultaneous trip of all recirculation pumps
Single recirculation pump trip

27

Main turbine trip from maximum power which does not result in immediate reactor scram
Runback
Maximum size reactor coolant system rupture
Maximum size reactor coolant system rupture combined with loss of all off-site power
Maximum size unisolable main steam line rupture
Simultaneous closure of all main steam isolation valves combined with single stuck open safety or relief valve (inhibit activation of high pressure Emergency Core Cooling Systems)

ATWS

28

APPENDIX C: STEADY STATE PARAMETERS

Salem

Group 1
Individual Loop RCS Average Temperatures
Individual Loop RCS Delta-Temperatures
Core MWt
MWe
Individual Power Range Readings
Pressurizer Pressure
Pressurizer Water Level (Hot Calibrated)

Individual Steam Generator Pressures

Group 2
Individual Loop RCS Flows
Individual Steam Generator Water Levels (Narrow Range)

Individual Steam Generator Feedwater Flows
Individual Steam Generator Feedwater Temperatures
Individual Steam Generator Steam Flows
Individual Steam Generator Blowdown Flows
Letdown Flow
Charging Flow
Main Turbine First Stage Pressure

Group 3
Individual Loop Hot Leg Temperatures
Individual Loop Cold Leg Temperatures
Pressurizer Steam-Space Temperature
Pressurizer Water-Space Temperature
Pressurizer Surge Line Temperature
Pressurizer Water Level (Cold Calibrated)

Individual Steam Generator Water Levels (Wide Range)

Individual Feedwater Heater Water-Side Outlet Temperatures
Individual Feedwater Heater Steam-Side Pressures
Individual Intermediate Range Readings
Group Control Rod Positions
RCS Average Boron Concentration
Pressurizer Average Boron Concentration
Containment Pressure (Wide Range)

Average Containment Temperature

Hope Creek

Group 1
Core MWt
MWe
Reactor Pressure (Wide Range)

Total Core Flow

Group 2
Individual APRM Readings
Individual Reactor Vessel Feedwater Temperatures
Individual Recirculation Loop Flows
Total Steam Flow
Total Feedwater Flow
Main Steam Pressure
Main Turbine First Stage Pressure
Main Condenser Vacuum
Individual Calibrated Jet Pump Flows
Reactor Water Level (Narrow Range)

Group 3
Individual Reactor Recirculation Suction Temperatures
Reactor Water Level (Wide Range)

RWCU Flow
RWCU Inlet Temperature
RWCU Outlet Temperature
Individual Moisture Separator Pressures
Individual Main Condenser Hotwell Temperatures
Individual Feedwater Heater Water-Side Outlet Temperatures
Individual Feedwater Heater Steam-Side Pressures
Individual LPRM Readings
Individual Recirculation Pump Speeds
Individual Control Rod Positions
Control Rod Drive Hydraulic System Flow
Control Rod Drive Hydraulic System Temperature
Drywell Pressure (Wide Range)

Average Drywell Temperature
Suppression Pool Pressure (Wide Range)

Suppression Pool Water-Space Temperature
Suppression Pool Air-Space Temperature
Suppression Pool Water Level


APPENDIX D: TRANSIENT PARAMETERS

Salem

Individual Power Range Readings
Individual Intermediate Range Readings
Individual Source Range Readings
MWt
MWe
Reactor Vessel Level (Wide Range)

RCS Pressure (Wide Range)

RCS Subcooling Margin
Individual Loop RCS Loop Flows
Individual Loop RCS Average Temperatures
Individual Loop RCS Delta-Temperatures
Individual Loop Hot Leg Temperatures
Individual Loop Cold Leg Temperatures
Pressurizer Pressure
Pressurizer Water Level (Hot Calibrated)

Pressurizer Water Level (Cold Calibrated)

Pressurizer Steam-Space Temperature
Pressurizer Water-Space Temperature
Pressurizer Surge Line Temperature
Individual Steam Generator Feedwater Flows
Individual Steam Generator Feedwater Temperatures
Individual Steam Generator Steam Flows
Individual Steam Generator Blowdown Flows
Individual Steam Generator Pressures
Individual Steam Generator Water Levels (Narrow Range)

Individual Steam Generator Water Levels (Wide Range)

Containment Pressure (Wide Range)

Average Containment Temperature
Containment Sump Water Levels
Total Low Pressure Injection Flow
Total High Pressure Injection Flow
Total Containment Spray Flow
Total Relief Flow
Total Leak Flow


Hope Creek

Individual APRM Readings
Individual LPRM Readings
Individual IRM Readings
Individual SRM Readings
MWt
MWe
Reactor Water Level (Narrow Range)

Reactor Water Level (Wide Range)

Reactor Water Level (Fuel Zone)

Reactor Pressure (Wide Range)

Individual Recirculation Loop Flows
Individual Calibrated Jet Pump Flows
Total Core Flow
Total Steam Flow
Main Steam Pressure
Main Turbine First Stage Pressure
Total Feedwater Flow
Drywell Pressure
Drywell Temperature
Suppression Pool Pressure
Suppression Pool Water-Space Temperature
Suppression Pool Air-Space Temperature
Suppression Pool Water Level
Total Low Pressure Injection Flow
Total High Pressure Injection Flow
Total Low Pressure Core Spray Flow
Total Relief Flow
Total Leak Flow

APPENDIX E: PERFORMANCE TEST SCHEDULES

Salem

Operability Tests (Section III.D.3.b)

Every certification year
OP  Operability Tests

Functional Tests (Section III.D.3.c)

Year 1
CC  CCW
SW  SSW
CV  CVCS
RD  Control Rods
RP  Protective System Setpoints/Logic
EL  Electrical Distribution
TU  Main Turbine
GE  Main Generator
NI  Nuclear Instrumentation
RC  RCS
PR  Pressurizer
MS  Main Steam
RM  Radiation Monitoring
PC  Plant Computer Systems

Year 2
CW  Circulating Water
CN  Condensate
HD  Heater Drains
CA  Instrument Air / Service Air
TA  Turbine Auxiliaries
SG  Steam Generators
VC  Containment Systems
DG  Diesel Generators
AF  Auxiliary Feedwater
FP  Fire Protection
BF  Main Feedwater
PI  Protection/Instrument Power
RH  Residual Heat Removal

Year 3  Same as Year 1
Year 4  Same as Year 2


Hope Creek

Operability Tests (Section III.D.3.b)

Every certification year
OP  Operability Tests

Functional Tests (Section III.D.3.c)

Year 1
AD  Auto Depressurization
CC  Computer Systems
CR  Reactor Core
CU  Reactor Water Cleanup (RWCU)
CX  Condensate Storage/Transfer
EC  Emergency Core Cooling Systems (ECCS)

EG  Main Generator
FW  Condensate and Feedwater
HR  Hydrogen Recombiner
IA  Instrument Air
MS  Main Steam
OG  Off Gas
RC  Reactor Core Isolation Cooling (RCIC)

RM  Radiation Monitoring
RR  Reactor Recirculation
RW  Radwaste
SL  Standby Liquid Control (SBLC)

TU  Main Turbine

Year 2
AN  Annunciators
CD  Control Rod Drive Hydraulic (CRD)
CS  Core Spray System
CW  Cooling Water Systems
DG  Diesel Generators

ED/ER  Electrical Distribution
EP     Emergency Operating Procedures
HP     High Pressure Coolant Injection (HPCI)

HV  Heating & Ventilation
MC  Main Condenser
NM  Neutron Monitoring
PC  Primary Containment
RH  Residual Heat Removal (RHR)

RP  Reactor Protection System
RS  Rod Sequence & Control (Including RWM)
RZ  Redundant Reactivity Control System (RSCS)
TC  Turbine Control

Year 3  Same as Year 1
Year 4  Same as Year 2

APPENDIX F: PTD SAMPLE ENTRIES

Simulation  Description                  Status          Test Group  Test Frequency  Last Test Date  Test Cross Reference  Comments
H           Loss of Condenser Vacuum     ACTIVE/SAT      CE          Cycle           08/16/95        H-95-098
H           Manual Reactor Trip          ACTIVE/SAT      OP          Year            07/12/96        H-96-124
S           RCP Seal Filter Clogged      ACTIVE/UNSAT    CV          Year            08/01/95        S-95-076              Leakoff flow does not change
S           Dropped Control Rod          ACTIVE/RES      RD          Cycle           11/12/95        S-95-201              Edge rods are not worth enough
S           Feedwater Bypass Line Leak   INACTIVE/SAT    FW                          01/23/92        S-92-005              Inactive 03/12/93 (SAT at time)
S           Pressurizer Surge Line Leak  INACTIVE/UNSAT  RC                                                                New malfunction, not yet tested

Other PTD columns and the values shown in the sample:
Last Usage Date: 07/01/96, 05/04/95, 12/04/95, 05/21/92
Scenario Cross References: H-SG-2310, H-ESG-2311, S-SG-1318, S-SG-4501, S-ESG-4601, S-ESG-3410, S-SG-1456
SAR Cross References: S-96-090 (a), S-96-123 (a), S-96-450 (b)
Simulation Revision: S-5.0, H-8.1, S-4.3, S-4.0, S-2.0

a  SAR generated to fix the problem
b  SAR generated to add the new malfunction

Key:
S         Salem Simulation (e.g., S-4.3 is Salem Revision 4.3; 4.0 is the major delivery and 4.3 represents the third minor change from this revision)
H         Hope Creek Simulation
ACTIVE    Part of the current Operator Training Program
INACTIVE  Not part of the current Operator Training Program
SAT       Provides satisfactory response
RES       Available for use (satisfactory response) with some restrictions
UNSAT     Does not provide satisfactory response
Year      Operability Test (Certification Year)
Cycle     Functional Test (Certification Cycle)
(blank)   Not ACTIVE in PTD; no testing requirements until activated
Last Test Date        MM/DD/YY  Date of last successful test
Test Cross Reference  Various   Reference for the last successful test

APPENDIX G: TEST CREDIT EXAMPLE

The paragraphs below provide an example of a completed form.

============================================================

SIMULATION FUNCTIONAL TEST CREDIT

Testing credit may be taken when an SME confirms that the simulation accurately follows the associated Cause and Effect document and that the response satisfies the acceptance criteria as listed below.

1) The applicable plant procedures can be used as needed to support the activity.
2) The parameters (pressures, flows, temperatures, etc.) go in the correct directions.
3) The expected alarms and automatic actions occur.
4) No unexpected alarms or automatic actions occur.

If a simulation function does not meet all of these requirements, then the SME should not recommend credit and should fill out a SAR describing the problem. If it meets the requirements but there are still improvement items, then the SME should recommend credit and should fill out a SAR describing the improvement items.
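The credit decision described above can be expressed as a short rule. This is a hedged sketch for illustration; the function and parameter names are invented and do not appear in the program:

```python
# Sketch of the Appendix G credit decision (names are hypothetical).
# Credit requires all four acceptance criteria; a SAR is filed either to
# describe the problem (no credit) or any remaining improvement items.

def recommend_credit(procedures_usable, correct_directions,
                     expected_alarms_occur, no_unexpected_alarms,
                     improvement_items=()):
    """Return (credit_recommended, sar_needed)."""
    meets_all = (procedures_usable and correct_directions and
                 expected_alarms_occur and no_unexpected_alarms)
    if not meets_all:
        return False, True                    # SAR describes the problem
    return True, bool(improvement_items)      # SAR only for improvements
```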

Simulation Functions Recommended for Credit (filled out by SME)

Malfunction NM-20      LPRM Failure (100%)
Malfunction NM-21      APRM Failure (0%)
Malfunction RP-05      Loss of Selected RPS Scram Channel
Malfunction RP-02      Loss of RPS MG Set
Malfunction RR-19      Recirc Flow Transmitter Failure (0%)
Remote Function NM-01  LPRM Bypass

Usage Cross-Reference (if any): SG-14
Simulation Revision: H-1.2
SME Initials: DK
Date: 09/03/96
Comments: Feedback from operator indicated that the simulation inertia may be too small (spins down too fast).


APPENDIX H: CAUSE AND EFFECT EXAMPLE

The paragraphs below provide an example of a malfunction Cause and Effect entry.

==========================================================

EHC FLUID LEAK

Abbreviation:  TC01
Range:         0-100 gpm
Component:     EHC Line Before Main Stop Valve #2

Selected Automatic Actions:

1. EHC header pressure <= 1300 psig, standby EHC pump auto starts
2. EHC header pressure <= 1100 psig, Main Turbine trips
3. On Main Turbine trip, reactor scrams if reactor power > 30%

Selected Alarms:

1. Computer point "Low EHC Hdr Press" (El-F5)
2. Overhead Annunciator "Turbine Hydr Reservoir Trouble" (D3-E5)

Major Effects:

EHC pump current will initially increase due to the increased flow. The Main Stop Valve #2 will close due to insufficient EHC pressure (leak location). The standby EHC pump will auto start when the header pressure is less than 1300 psig. At 1100 psig, the Main Turbine will trip, causing a reactor scram if power is greater than 30%. If operation continues, then all EHC fluid will empty, resulting in EHC pump cavitation and seizure. The remainder of the transient is essentially the same as a Main Turbine trip.

User Notes:

1. The total EHC fluid volume is about 650 gallons and the leaking fluid is lost from the EHC system.
2. This malfunction is non-recoverable and can only be "removed" with an IC reset.
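As a rough illustration, the setpoint-driven automatic actions in this entry could be modeled as below. This is a sketch only; the function and action names are invented, and the actual simulator models are far more detailed. The setpoints (1300 psig, 1100 psig, 30% power) come from the Cause and Effect entry:

```python
# Hedged sketch of the EHC header-pressure automatic actions listed in this
# Cause and Effect entry; identifiers are hypothetical.

STANDBY_PUMP_START_PSIG = 1300.0
TURBINE_TRIP_PSIG = 1100.0
SCRAM_POWER_PCT = 30.0

def ehc_auto_actions(header_pressure_psig, reactor_power_pct):
    """Return the automatic actions asserted at the given conditions."""
    actions = set()
    if header_pressure_psig <= STANDBY_PUMP_START_PSIG:
        actions.add("start_standby_ehc_pump")
    if header_pressure_psig <= TURBINE_TRIP_PSIG:
        actions.add("main_turbine_trip")
        if reactor_power_pct > SCRAM_POWER_PCT:
            actions.add("reactor_scram")
    return actions
```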

Selected References:

LER 94-020 (07/20/94)

Last Amendment: 08/01/94 (H-1.0)
Last Test: 09/03/96 (H-4.2)

APPENDIX I: SAR FORM

Initiation:
    Originator: ________    Extension: ________    Date: ________
    Request: ____________________________________________
    Starting Initial Condition: ________    Problem Snapshot: ________
    Revision: ________    Plant: ________
    References: ________    Test Credit Form: ________

Customer Review:
    Priority: ________    Initials: ________    Date: ________
    System: ________

Resolution:
    Initials: ________    Date: ________    Revision: ________

Testing:
    Initials: ________    Date: ________

Customer Acceptance:
    Initials: ________    Date: ________    Revision: ________

Closure:
    Initials: ________    Date: ________

APPENDIX J: SAR PRIORITIES These definitions provide the basic structure for SAR priorities. Individual SARs may have assigned priorities outside of these guidelines based on impact assessments by the client or the Simulation Support Group (SSG) Supervisor. The priorities provide the normal classification for work activity from highest (Priority 1) to lowest (Priority 11). The SSG Supervisor has the flexibility to assign work outside of the priority structure.

"Closure Goal" is the desired period for SAR resolution. "Closure Window" is the period after which an open SAR should be reexamined for impact. The possible results of such a review include closure without further action, a request for additional resources, or a priority change. "Requirement" is any strict limit based on NRC Certification.

Priority 1

A problem that makes the Simulator unavailable for a scheduled activity. This type of problem can not be avoided by the client and is generally fixed immediately. Safety concerns fall under this grouping. Personnel call-outs and overtime will be used if necessary.

Closure Goal:    Within 1 hour
Closure Window:  Within 4 hours
Examples:

Loss of power to control boards, software failure during exam scenario, broken meter glass (safety concern).

2 An area within the scope of simulation that can not be used or causes the crew to halt further action (requires instructor prompting to continue). Clients avoid these areas while using the Simulator. Major operator feedback will usually be in this group. These problems will be worked within normal working hours unless the area is required for a scheduled activity.

Closure Goal:    Within two requalification segments or scheduled usage
Closure Window:  Within four requalification segments or past scheduled usage
Example:

LBLOCA failure when coincident with loss of all AC power.

3 An area within the scope of simulation that can be used but which requires the instructor to explain to the student the Simulator versus plant differences. For problems in this group, the crew would continue without instructor prompting. Minor operator feedback will usually be in this group or Priority 6.

Closure Goal:    Within four requalification segments
Closure Window:  Within six requalification segments
Examples:

Pump vibration alarm should require manual reset, discharge pressure should be 100 psig (simulator shows 65).

4 A modification/addition to the scope of simulation based on a review of reference plant modifications, industry events, or regulatory guidance.

Closure Goal:    For plant modifications, within 12 months of operable in plant; others within 12 months of initiation
Closure Window:  For plant modifications, within 18 months of operable in plant; others within 18 months of initiation
Requirement:     For plant modifications, within 24 months of operable in plant
Example:         Change power supply for pressure transmitter per plant modification.


5 A modification/addition to the scope of simulation needed to support scheduled activities. This grouping includes Initial Condition modifications.

Closure Goal:    Based on negotiation with client; Initial Condition modifications within one requalification segment
Closure Window:  Same
Example:

Add a new malfunction for a pump sheared shaft.

6 An area within the scope of simulation that can be improved but which does not require the instructor to explain to the student the Simulator versus plant differences. Minor operator feedback will usually be in this group or Priority 3.

Closure Goal:    Within 12 months of initiation
Closure Window:  Within 24 months of initiation
Example:

Recirculation valve stroke time appears a little slow.

7 Activities that support Certification requirements but are not covered by other priorities. This grouping includes SARs that recommend test credit for simulation functions.

Closure Goal:    Within certification requirements
Closure Window:  Same
Examples:

Perform annual steady state 100% test; take test credit for "Loss of Condenser Vacuum".

8 A hardware problem with minimal impact on Simulator operation.

Closure Goal:    Within 12 months of initiation
Closure Window:  Within 24 months of initiation
Example:

Engraving abbreviation is different from the plant.

9 A modification/addition that would improve the ability of the instructor to operate the Simulator but is largely transparent to the student.

Closure Goal:    Within 12 months of initiation
Closure Window:  Within 24 months of initiation
Example:

Provide indication to the instructor of the actual LOCA flow rate.

10 An area that should be added to the scope of simulation to improve the overall Simulator capabilities (no firm schedule for usage).

Closure Goal:    Within 24 months of initiation
Closure Window:  Within 48 months of initiation
Example:

Upgrade electrical distribution to include full modeling of the grid.

11 An activity that contributes to the maintenance of the Simulator but which is largely transparent to the student and the instructor.

Closure Goal:    As determined by SSG Supervisor
Closure Window:  Same
Example:

Upgrade operating system.
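For the priorities with fixed schedules measured from initiation (Priorities 6, 8, 9, and 10 above), the closure goal and window dates can be approximated with a simple lookup. This is an illustrative sketch, not part of the program; a month is approximated as 30 days:

```python
# Illustrative closure-date lookup for the fixed-schedule SAR priorities
# above, keyed by priority: (goal months, window months). Other priorities
# are event-based (requalification segments, negotiation, etc.).
import datetime

CLOSURE_MONTHS = {6: (12, 24), 8: (12, 24), 9: (12, 24), 10: (24, 48)}

def closure_dates(priority, initiated):
    """Approximate (closure goal, closure window) dates from initiation."""
    goal_m, window_m = CLOSURE_MONTHS[priority]
    month = datetime.timedelta(days=30)  # rough month length
    return initiated + goal_m * month, initiated + window_m * month
```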


APPENDIX K: SD SAMPLE ENTRIES

H  Fire Protection Terminal is not simulated.
   Evaluation: The usage of cue cards is sufficient to meet training needs. The estimated cost of $97K from Foxboro to install the Fire Protection system is not worth the benefit.
   Cross-Reference: Ops TRG 08/96

S  Reactivity Computer is not simulated (2RP3).
   Evaluation: This computer is used infrequently by Operations and has minimal training impact.

"Ops TRG" is the abbreviation for the Operations Training Review Group. It is a periodic meeting between Operations and Training to discuss common issues.

APPENDIX L: PMD SAMPLE ENTRIES

Simulation  Identification  Impact  Description                                   Plant Status  Operable Date (4BA)  SAR #
H           3EE-362         YB      Install new recorder at SW intake structure.  3A  04/25/96                       H-96-231
S           2EC-342         YS      Add synch check relays to the DG breakers.    4BA 09/20/96  09/20/96             S-96-103

Plant Status is tracked as a code along with the date of the code. For example, "4BA" is the first code where the modification is considered operable in the plant for simulation purposes.

Plant Status Code Examples:
1    Initiated
2A   Design Complete
3A   Issued to Station
3CP  Partially Installed
3F   Station Operability Review and Sign-Off Complete
4BA  Sent to Engineering for Document Update
5    Closed

Impact Key:
YS   Software Impact Only
YH   Hardware Impact Only
YB   Software and Hardware Impact
Y?   Impact to be Determined
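The PMD "operable for simulation purposes" rule described in Appendix L (the first status code at or beyond 4BA) can be sketched as follows. The code ordering is inferred from the status code examples and the helper name is invented:

```python
# Sketch of the PMD operable-date rule: the operable date is the date of the
# first plant status code at or beyond "4BA" (ordering inferred from the
# Plant Status Code Examples; illustrative only).

STATUS_ORDER = ["1", "2A", "3A", "3CP", "3F", "4BA", "5"]

def operable_date(status_history):
    """status_history: chronological list of (code, 'MM/DD/YY') pairs."""
    threshold = STATUS_ORDER.index("4BA")
    for code, date in status_history:
        if code in STATUS_ORDER and STATUS_ORDER.index(code) >= threshold:
            return date
    return None  # modification not yet operable in the plant
```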