ML20245K870

Simulator Certification Submittal, Vol 2: "Nuclear Simulator Engineering Manual"

Site: Millstone, Hatch
Issue date: 06/29/1989
From: Northeast Nuclear Energy Co.
Shared Package: ML20245K850
References: NUDOCS 8907050288



VOLUME 2

NUCLEAR SIMULATOR ENGINEERING MANUAL


NUCLEAR SIMULATOR ENGINEERING MANUAL

TABLE OF CONTENTS

No.        Rev.  Issue Date  Procedure Title
NSEM-1.01  3     5/11/89     Control of the Nuclear Simulator Engineering Manual
NSEM-1.02  1     5/11/89     Simulator Certification Program Overview
NSEM-2.01  0     2/9/88      Defining Training Requirements
NSEM-2.02  0     2/9/88      Defining the "Certified" Trainer
NSEM-2.03  0     5/30/88     Software Design Verification
NSEM-3.01  3     1/12/89     Definition and Control of the Simulator Design Data Base
NSEM-3.02  2     11/9/88     Control of Simulator Design Documentation
NSEM-4.01  1     5/11/89     Verifying Simulator Capabilities via System Tests


NSEM-4.02  0     1/12/89     Initial Conditions
NSEM-4.03  0     5/4/88      Certified Remote Functions
NSEM-4.04  0     6/29/88     Major Malfunction Testing
NSEM-4.05  1     5/11/89     Malfunction Testing
NSEM-4.07  0     3/23/89     Master Testing Schedule
NSEM-4.08  0     3/24/88     Simulator Operating Limits
NSEM-4.09  0     5/4/88      Simulator Operability Testing
NSEM-4.10  0     6/29/88     Normal Operations Verification
NSEM-4.11  0     8/17/88     Instructor Station
NSEM-4.12  0     4/13/89     Simulator Physical Fidelity / Human Factors Evaluation
NSEM-4.13  1     5/11/89     Real Time Simulation Verification

Rev.: 17
Date: 6/27/89
Page: 1 of 2


NUCLEAR SIMULATOR ENGINEERING MANUAL

TABLE OF CONTENTS (continued)

No.        Rev.  Issue Date  Procedure Title
NSEM-5.01  3     5/26/88     Simulator Modification Control Procedure
NSEM-5.02  0     4/13/89     Retest Guidelines
NSEM-6.01  0     1/12/89     Student Feedback
NSEM-6.02  0     1/12/89     Development of New Simulator Guides
NSEM-6.03  1     6/27/89     Collection of Plant Performance Data
NSEM-6.04  0     1/12/89     Major Plant Modifications

Rev.: 17
Date: 6/27/89
Page: 2 of 2

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-1.01
CONTROL OF THE NUCLEAR SIMULATOR ENGINEERING MANUAL

"::::::al: 4/LSinlulator g

MIPM TetMnical ra. Support Approved:

Dir Nuclear Training Revision: 3 Date: 5/11/89 SCCC Meeting No: 89-005 i

1.0 PURPOSE

The purpose of this procedure is to provide requirements for the layout, format, review, distribution and maintenance of the Nuclear Simulator Engineering Manual (NSEM).

2.0 APPLICABILITY

This procedure applies to all individuals within the Nuclear Training Department who are involved with the Simulator Maintenance, Training and Certification Programs.

3.0 REFERENCES

3.1 ANSI/ANS-3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5-1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.


3.4 INPO Good Practices TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG-1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.7 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

4.0 DEFINITIONS

4.1 Nuclear Simulator Engineering Manual - A controlled manual which contains the required procedures for development and implementation of the Simulator Certification Program.

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 1 of 10


4.2 Simulator Configuration Control Committee (SCCC) - The committee responsible for overall simulator design control and management of NTD resources involved in the simulator development, maintenance and certification effort. The committee shall include as permanent members the Director of NTD, the Managers of OTB and STSB, and the Operations Consultants.

The Director, NTD shall chair the committee and the Manager, STSB shall function as the secretary.

The minimum permanent members required to constitute a quorum are two management representatives (Director/Managers) and one Operations Representative (Operations Consultants). The Manager, STSB shall act as chairman in the Director's absence.

4.3 OTB - Operator Training Branch of the Nuclear Training Department.

4.4 STSB - Simulator Technical Support Branch of the Nuclear Training Department.

4.5 ASRMS - Administrative Services and Records Management Section.

4.6 PDCR - A Plant Design Change Record which contains all necessary information and forms to accomplish in an orderly manner the modification of a plant system, structure or component.

4.7 LER - Licensee Event Report required by NRC 10CFR 50.73, which describes those events which shall be reported within 30 days after discovery of the event.

4.8 SOER - Significant Operating Event Report is generated by INPO and distributed to industry members. It includes recommendations concerning the event which must be addressed by concerned facilities.

4.9 NRC Form 474 - The form submitted to the NRC by the facility licensee for the certification, recertification, and for any change to a simulation facility performance testing plan after the initial submittal of such a plan.

4.10 EOPs - Emergency Operating Procedures address the required response by operations personnel to emergency conditions.

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 2 of 10

5.0 RESPONSIBILITIES

5.1 Director, Nuclear Training

5.1.1 Responsible for direction of the NTD Configuration Management Program and the establishment of goals.

5.1.2 Responsible for the assignment of personnel necessary to achieve and maintain simulator configuration.

5.1.3 Responsible for chairing the SCCC.

5.1.4 Responsible for final approval of all NSEM procedures.

5.2 Manager, Simulator Technical Support

5.2.1 Responsible for review and approval of those NSEM procedures developed by STSB.

5.2.2 Responsible for assigning personnel to develop STSB procedures required for the Configuration Management Program and the simulator certification effort.

5.2.3 Responsible for coordinating the various NTD branches and sections to ensure that the simulator certification effort is accomplished according to the timetable.

5.2.4 Responsible for chairing the SCCC in the absence of the Director, Nuclear Training.

5.2.5 Responsible for coordinating the review and distribution of all NSEM procedures.

5.3 Manager, Operator Training

5.3.1 Responsible for review and approval of those NSEM procedures developed by OTB.

5.3.2 Responsible for assigning personnel to develop the OTB procedures required for the Configuration Management Program and the simulator certification effort.


Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 3 of 10

5.4 Supervisor, ASRMS

5.4.1 Responsible for assigning personnel to perform records management and other clerical activities associated with configuration management and the simulator certification effort.

5.4.2 Responsible for maintaining the documentation necessary to support the Configuration Management Program and simulator certification activities.

5.5 Simulator Configuration Control Committee (SCCC)

5.5.1 Responsible for reviewing and commenting on all procedures which make up the Nuclear Simulator Engineering Manual.

5.5.2 Responsible for ensuring continuity across all four certification programs.

5.6 Assistant Supervisor, Operator Training (ASOT)

5.6.1 Responsible for review and approval of plant specific certification tests or plant specific documents.

6.0 INSTRUCTIONS

6.1 Layout of the Nuclear Simulator Engineering Manual (NSEM)

6.1.1 Section 1 of the NSEM shall contain those procedures which provide administrative guidelines, including:

. Branch Responsibilities

. Maintenance of the NSEM

6.1.2 Section 2 of the NSEM shall contain those procedures dealing with defining the scope of simulation for certification, including:

. Scope of Hardware Certification

. Scope of Software Certification

. Scope of Functions (Instructor directed, malfunctions, etc.) to be certified

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 4 of 10

6.1.3 Section 3 of the NSEM shall contain those procedures dealing with the Simulator Design Database and Documentation, including:

. Documentation Standards

. Generic Database Content

. Unit-Specific Data Indexes

. Designated Locations For Data/Documents

6.1.4 Section 4 of the NSEM shall contain those procedures dealing with simulator performance testing and verification, including:

. Annual Operability Testing
. Initial Conditions Controls
. Simulator Operating Limits
. Real-time Simulation Testing
. System Testing
. Malfunction Testing
. Remote Function Testing
. Normal Operation Testing
. Instructor Station Interface Testing
. Plant-referenced Physical Fidelity

6.1.5 Section 5 of the NSEM shall contain those procedures dealing with Simulator Configuration Management, including:

. Simulator Modification Control
. Design Data Base Update
. DR Retest Requirements
. Performance Test Update

6.1.6 Section 6 of the NSEM shall contain those procedures dealing with outside events which affect the Training Program and/or Trainer Configuration, including:

. Collection and Review of New Reference Plant Performance Data
. Student Feedback
. LERs, SOERs, PDCRs, etc.
. Curriculum Testing
. Development of New Simulator Training Guides
. Major Plant Design Changes



Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 5 of 10

6.2 Format Of NSEM Procedures

6.2.1 Cover Sheet

Each procedure in the NSEM shall have a cover sheet delineating the procedure number, title, SCCC meeting number, approval sign-off, revision number and date.

The proper Branch Manager shall indicate approval of an NSEM procedure by signing the cover page, i.e., Responsible Individual sign-off.

The Director, Nuclear Training shall indicate approval of an NSEM procedure by signing the cover page.

6.2.2 Content

Procedures in the NSEM shall be written using the format specified below. Where specific headings are not needed, "None" shall be inserted in the section, thereby retaining consistency of section numbering.

6.2.2.1 Purpose (Numbered 1.0)

This section shall state the concise objective of the procedure.

6.2.2.2 Applicability (Numbered 2.0)

This section shall clearly define the organizational boundaries and exceptions for application of the procedure.

6.2.2.3 References (Numbered 3.0)

This section shall list the governing documents which form the basis for the procedure, and other documents which are referenced within the procedure.

6.2.2.4 Definitions (Numbered 4.0)

This section shall define terms which are unique or specific to understanding the procedure.

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 6 of 10

6.2.2.5 Responsibilities (Numbered 5.0)

This section shall identify by title the individuals responsible for implementing the procedure, and briefly identify their key responsibilities.

6.2.2.6 Instructions (Numbered 6.0)

This section shall provide instructions which fulfill the stated purpose of the procedure. This section shall also include appropriate quantitative and qualitative acceptance criteria for determining and documenting that the activities have been satisfactorily accomplished.

6.2.2.7 Figures (Numbered 7.0)

This section shall include figures, forms and tables, which shall be labeled as "Figure 7.1", "Figure 7.2", etc. A descriptive title and the figure number shall be located at the top center of the page.

6.2.2.8 Attachments (Numbered 8.0)

This section shall include additional or amplifying information such as guidelines and examples, or may contain plant specific tests or plant specific documents. Attachments shall be labeled as "Attachment 8.1", "Attachment 8.2", etc. The attachment number and a descriptive title shall be located at the top center of the page.

6.2.2.9 Appendices (Numbered Alphabetically)

This section shall contain amplifying information, plant specific tests or plant specific documents.

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 7 of 10

6.2.3 Page Identification

The abbreviation NSEM followed by the appropriate procedure number shall be located at the bottom center of each page. In addition, the revision number, date and page number of total pages shall be included in the lower right corner of all pages following the cover sheet.

Original procedures shall be designated as "Rev. 0". Page number of total pages for figures and attachments shall follow the figure or attachment number, e.g., "7.1-1 of 1" for figures and "8.1-1 of 1" for attachments.
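The page-identification convention above is mechanical enough to sketch in code. A minimal formatter, assuming the footer layout visible on this manual's own pages; the function and parameter names are invented for illustration, not part of the NSEM:

```python
def page_footer(procedure: str, rev: int, date: str, page: int, total: int,
                figure=None) -> str:
    """Build the footer text for an NSEM page per Section 6.2.3.

    For figure/attachment pages, the page count follows the figure or
    attachment number (e.g. "7.1-1 of 1"); regular pages use "page of total".
    """
    if figure is not None:
        page_label = f"{figure}-{page} of {total}"
    else:
        page_label = f"{page} of {total}"
    return f"Rev.: {rev} Date: {date} {procedure} Page: {page_label}"
```

Calling `page_footer("NSEM-1.01", 3, "5/11/89", 1, 1, figure="7.1")` reproduces the footer of the Figure 7.1 page.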

6.2.4 Acronyms

When an acronym or abbreviation first appears in a procedure, the complete title shall be used, followed by the acronym or abbreviation in parentheses.

6.2.5 Text Revision Identification

Revisions shall be indicated by the use of a solid vertical line in the right-hand margin opposite the section of the text which has been revised. A sheet providing details of those changes should be included as an attachment in Section 8.0 (Marginal Note Directory).

6.3 Preparation Of NSEM Procedures

6.3.1 Nuclear Training Department personnel shall prepare and submit new procedures and procedure revisions through their line management to the Manager, STSB.

6.3.2 Plant specific tests or plant specific documents that exist as attachments or appendices to approved NSEM procedures are exempt from the review, approval and distribution requirements of this procedure. Some examples are: System Tests, Malfunction Tests, Normal Operating Tests, Malfunction Cause & Effects Documents and Simulator Operability Tests.

6.3.3 Plant specific tests or plant specific documents that are covered under Section 6.3.2 and are therefore not reviewed and approved by the SCCC shall be subject to the following:

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 8 of 10

6.3.3.1 Plant specific tests or plant specific documents that are contained in attachments or appendices to approved NSEM procedures will meet the labeling requirements called out in Sections 6.2.2.8, 6.2.2.9 and 6.2.3.

6.3.3.2 Revision numbers and dates will be assigned as a result of approval at the ASOT level. Revision of a unit-specific document (e.g., attachment or appendix) does not constitute revision of the overall procedure; therefore, SCCC approval is not required.

6.3.3.3 A signature showing ASOT level approval, revision and date will be on the cover sheet of each plant specific attachment or appendix.

6.3.3.4 Distribution of plant specific procedures or documents covered by Section 6.3.3 will be to the originating unit only (i.e., MP1, MP2, MP3 or CY).

6.4 Review Of NSEM Procedures

6.4.1 The Manager, STSB shall distribute copies of the procedure to the personnel listed below, 7 working days prior to its presentation to the SCCC for approval.

. Permanent members of the SCCC.

. The Supervisors of Operator Training for the 4 units.

. The Supervisor of Simulation Computer Engineering (SCE) and the Supervisor of Hardware Maintenance.

6.4.2 Those individuals identified in paragraph 6.4.1 shall review the proposed procedures for intent, clarity, compliance with governing documents and consistency with the entire Simulator Certification Program. Written comments shall be submitted to the Manager, STSB within 5 working days.
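The 7-working-day distribution window and 5-working-day comment period above amount to simple weekday arithmetic. A sketch, assuming "working days" means Monday through Friday (the procedure does not address holidays); the dates in the usage note are illustrative:

```python
from datetime import date, timedelta

def add_working_days(start: date, days: int) -> date:
    """Step forward (or back, if days < 0) one weekday at a time,
    skipping Saturdays and Sundays."""
    step = timedelta(days=1 if days >= 0 else -1)
    remaining = abs(days)
    d = start
    while remaining:
        d += step
        if d.weekday() < 5:  # Monday..Friday
            remaining -= 1
    return d
```

For an SCCC meeting on Thursday 5/11/89, `add_working_days(date(1989, 5, 11), -7)` gives the latest distribution date, and adding 5 working days to that gives the comment due date.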

6.5 Approval Of NSEM Procedures

6.5.1 All new procedures and/or procedure revisions shall be brought before the SCCC for discussion and disposition of all review comments.


Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 9 of 10

6.5.2 The Director, Nuclear Training shall be responsible for approving all procedures and/or procedure revisions.

6.6 Distribution Of NSEM Procedures

6.6.1 The Manager, STSB shall ensure that copies of NSEM procedures and a current table of contents are distributed to all NSEM holders (Figure 7.2), that issuance and receipt of copies are controlled, and that a file of NSEM procedure revisions is maintained.

6.6.2 Receipt of controlled copies of NSEM procedures by NSEM holders shall be indicated by returning the NSEM Procedure Transmittal Form (Figure 7.1). A record of the return of those forms shall be maintained (Figure 7.2).

6.6.3 NSEM Holders

6.6.3.1 The Director, Nuclear Training
6.6.3.2 The Manager, STSB
6.6.3.3 The Manager, OTB
6.6.3.4 The Four Individual Supervisors of Operator Training (SOTs)
6.6.3.5 The Supervisor, ASRMS
6.6.3.6 The Supervisor of Simulation Computer Engineering (SCE)
6.6.3.7 The Supervisor of Hardware Maintenance
6.6.3.8 The Four Unit Software Coordinators
6.6.3.9 The Four Unit Operations Consultants

6.7 Deletion Of NSEM Procedures

6.7.1 When an NSEM procedure is deemed not applicable, the SCCC shall conduct a review to determine if there is sufficient cause for deleting the procedure.

6.7.2 If the SCCC rules to delete the procedure, the Manager, STSB shall notify NSEM holders by use of the NSEM Procedure Disposition Form (Figure 7.3). A revised table of contents shall be issued.


7.0 FIGURES

7.1 NSEM Procedure Transmittal Form
7.2 NSEM Procedure Receipt Acknowledgment Record
7.3 NSEM Procedure Disposition Form

8.0 ATTACHMENTS

8.1 Marginal Note Directory

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 10 of 10


Figure 7.1
NSEM PROCEDURE TRANSMITTAL FORM

PLEASE COMPLETE AND RETURN THIS SHEET TO:
ATTN: Manager, STSB

This acknowledges receipt of the following Nuclear Simulator Engineering Manual Procedure information:

TRANSMITTAL NO. ________, dated ________.

A table of contents, Revision ________, is also attached to reflect current status of the NSEM procedures. Insert the new table of contents and remove the superseded table of contents.

________________________________    ________    __________
Signature of Manual Holder          Date        Manual No.

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 7.1-1 of 1

Figure 7.2
NSEM PROCEDURE RECEIPT ACKNOWLEDGMENT RECORD

TRANSMITTAL NO. ________  DATE ________  DESCRIPTION: ________

CONTROLLED COPY HOLDERS

Manual No.  Holder                                               Ack.
1           Director, Nuclear Training
2           Manager, STSB
3           Manager, OTB
4           Supervisor of Operator Training, MP3
5           Supervisor of Operator Training, MP2
6           Supervisor of Operator Training, MP1
7           Supervisor of Operator Training, CY
8           Supervisor, ASRMS
9           Supervisor of Simulation Computer Engineering (SCE)
10          Supervisor of Hardware Maintenance
11          Operations Consultant (MP1)
12          Operations Consultant (MP2)
13          Operations Consultant (MP3)
14          Operations Consultant (CY)
15          Software Coordinator (MP1)
16          Software Coordinator (MP2)
17          Software Coordinator (MP3)
18          Software Coordinator (CY)

Rev: 3 Date: 5/11/89 NSEM-1.01 Page: 7.2-1 of 1

Figure 7.3
NSEM PROCEDURE DISPOSITION FORM

TO: ________          Date ________          Transmittal No. ________

FROM: Manager, STSB

SUBJECT: Disposition of NSEM Procedures

Procedure No. (New/Old)    Rev.    Date    Location    Disposition

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 7.3-1 of 1


Attachment 8.1
MARGINAL NOTE DIRECTORY

1. Deleted extraneous procedure references.
2. Specified process for revising unit-specific attachments and/or appendices.

Rev.: 3 Date: 5/11/89 NSEM-1.01 Page: 8.1-1 of 1



NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-1.02
SIMULATOR CERTIFICATION PROGRAM OVERVIEW

Responsible Individual: [signature], Simulator Technical Support

Approved: [signature], Director, Nuclear Training

Revision: 1    Date: 5/11/89    SCCC Meeting No: 89-005



1.0 PURPOSE

1.1 To provide an overview of Northeast Utilities' Certification Program for its four nuclear plant simulators.

1.2 To assign responsibilities within the Nuclear Training Department for specific components of the program.

1.3 To identify the goals of the program to the Nuclear Training Department staff.

2.0 APPLICABILITY

This procedure applies to those Branches/Sections within the Nuclear Training Department which are part of the Simulator Certification Program, i.e., Simulator Technical Support Branch, Operator Training Branch, and Administrative Services and Records Management Section.

3.0 REFERENCES

3.1 ANSI/ANS-3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5-1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.


3.4 INPO Good Practices TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG-1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.7 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

3.8 EPRI Project RP2054-2, "Development of a Simulator Qualification Methodology".

Rev.: 1 Date: 5/11/89 Page: 1 of 9 NSEM-1.02

4.0 DEFINITIONS

4.1 Design Data Base - The reference plant data which is the basis for the current simulator hardware configuration and software models.

4.2 NSEM - The Nuclear Simulator Engineering Manual contains all of the procedures necessary for the development and implementation of the certification program. It is a controlled document and its purpose is to ensure consistent application of the certification process.

4.3 STSB - Simulator Technical Support Branch of the Nuclear Training Department.

4.4 OTB - Operator Training Branch of the Nuclear Training Department.

4.5 ASRMS - Administrative Services and Records Management Section of the Nuclear Training Department.

4.6 Deficiency - An identified difference in a simulator quality or element (hardware and/or software) that requires review and resolution.

4.7 PDCR - A Plant Design Change Record which contains all necessary information and forms to accomplish in an orderly manner the modification of a plant system, structure or component.

4.8 LER - Licensee Event Report required by NRC 10CFR 50.73, which describes those events which shall be reported within 30 days after discovery of the event.

4.9 SOER - Significant Operating Event Report is generated by INPO and distributed to industry members. It includes recommendations concerning the event which must be addressed by concerned facilities.

5.0 RESPONSIBILITIES

5.1 Director, Nuclear Training

Responsible for the overall direction of NU's Certification Program and the assignment of those NTD resources necessary for achieving and maintaining simulator certification in a manner compliant with 10CFR 55.45(b) and responsive to NU's corporate goal of cost containment.

Rev.: 1 Date: 5/11/89 Page: 2 of 9 NSEM-1.02

5.2 Manager, Simulator Technical Support Branch

5.2.1 Responsible for coordinating the program among the various NTD branches and sections to ensure that simulator certification is accomplished according to the timetable established by the Director, Nuclear Training.

5.2.2 Responsible for all Configuration Management activities, including:

o Simulator Design Data Bases
o Simulator Documentation
o Modification Control

5.2.3 Responsible for the development and maintenance of the Nuclear Simulator Engineering Manual (NSEM).

5.2.4 Responsible for the resolution of identified deficiencies between the simulator and the reference plant in a timely manner, responsive to the training needs, regulatory requirements and NU's goal of cost containment.

5.2.5 Responsible for the development, implementation and maintenance of specific NSEM procedures identified in Attachment 8.1.

5.3 Manager, Operator Training Branch

5.3.1 Responsible for the development, implementation and maintenance of specific NSEM procedures identified in Attachment 8.1.

5.3.2 Responsible for monitoring simulator/reference plant fidelity, identifying differences and prioritizing resolution to ensure that each simulator can support the training program with which it is used.

5.4 Supervisor, ASRMS

5.4.1 Responsible for the assignment of personnel required for the performance of records management and other clerical activities associated with Configuration Management and the simulator certification efforts.

5.4.2 Responsible for the development, implementation and maintenance of specific NSEM procedures identified in Attachment 8.1.

Rev.: 1 Date: 5/11/89 Page: 3 of 9 NSEM-1.02

6.0 INSTRUCTIONS

The goal of NU's Certification Program is an action plan which shall:

o Ensure that each simulator possesses the capability to support the training program with which it is used.

o Provide for certification in a timely, cost-effective manner, addressing the specific requirements of Regulatory Guide 1.149, NRC 10CFR 55.45(b) and NUREG-1258 while remaining responsive to NU's corporate goal of cost containment.

o Ensure on-going certification compliance with the requirements set forth in ANSI/ANS-3.5-1985.

The Certification Program consists of three main components: Definition of the Scope of Simulation, Validation of the Scope of Simulation, and Configuration Management. Figure 7.1 provides an overview of the relationships and interactions between the major functions of the program.

6.1 Definition of the Scope of Simulation

A determination of the scope of simulation required to support the training curriculum shall be made for each simulator. This determination shall be based upon the NU Simulator Training Guides which encompass:

o All events specified in ANSI/ANS-3.5-1985 and Regulatory Guide 1.149, 1987.

o The training requirements as specified in the various plant start-up and operating procedures.

o Outside events (e.g., LERs, reference plant design changes, etc.) that affect the training programs and/or trainer configuration.

See Figure 7.2 for an overview of this process.

6.2 Trainer Validation

The scope of simulation defined in Step 6.1 shall be validated by performance testing and verification. Figures 7.3 and 7.4 provide an overview of this process.

Rev.: 1 Date: 5/11/89 Page: 4 of 9 NSEM-1.02

6.2.1 Simulator Performance Testing

A specific performance test shall be developed for each simulator which will fulfill the testing requirements of ANSI/ANS-3.5-1985, INPO 86-026, Guideline for Simulator Training, and NUREG-1258.

6.2.1.1 The Performance Test is outlined in Figure 7.3 and shall include the following sub-tests:

o Instructor Interface Capabilities
o Steady-State Stability
o "Certified" Remote Functions
o System Operability
o Real-Time Simulation
o Normal Operations Capability
o "Certified" Malfunctions
o Transients
o Plant Process Computer Capabilities

6.2.1.2 The Acceptance Test Procedure (ATP) developed and implemented as part of simulator procurement shall form the basis for these tests. The Simulator Performance Test shall be a dynamic document and shall be updated to reflect modifications made to the simulator and/or new plant performance data.

6.2.1.3 The Simulator Performance Test shall be used to establish baseline fidelity for the initial certification submittal and shall serve as the vehicle for continuing fidelity verification.

6.2.2 Verification

This section contains activities which are requirements of certification but do not fit within the context of performance testing.

6.2.2.1 Defined Simulator Operating Limits

This shall be accomplished by determining model limitations, identifying key parameters and their boundaries, and providing a method for alerting instructors when an operating limit is surpassed.
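The alerting method described above reduces to comparing key parameters against their validated boundaries. A minimal sketch; the parameter names and limit values below are hypothetical, since the NSEM requires only that some alerting method exist:

```python
# Hypothetical key parameters and validated model boundaries (illustrative only).
OPERATING_LIMITS = {
    "rcs_pressure_psia": (0.0, 2500.0),
    "core_power_pct": (0.0, 120.0),
}

def check_limits(sample: dict) -> list:
    """Return an alert message for every sampled parameter that has
    surpassed its defined operating limit."""
    alerts = []
    for name, value in sample.items():
        low, high = OPERATING_LIMITS[name]
        if not (low <= value <= high):
            alerts.append(
                f"OPERATING LIMIT SURPASSED: {name}={value} (valid {low}..{high})")
    return alerts
```

An instructor-station loop would call `check_limits` each model frame and display any returned messages.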

Rev.: 1 Date: 5/11/89 Page: 5 of 9 NSEM-1.02

6.2.2.2 Plant-Referenced Physical Fidelity

Physical fidelity shall be verified by conducting periodic comparisons of the trainer to the reference plant in the areas of panel simulation, instrument and control configuration and ambient operating environment. Any identified discrepancies shall be evaluated to determine the consequence to the simulator's ability to be used as an effective training tool.

6.2.2.3 Software Design Verification

The Simulator System Documentation Manuals shall be reviewed against the System Simulation Diagrams to verify that all instrumentation, malfunctions, remote functions and plant components identified in the scope of simulation are modeled. This shall only be done at initial certification.

6.2.2.4 Initial Conditions

Administrative controls shall be established to maintain the fixed set of initial conditions required by the Operator Training Program.

6.3 CONFIGURATION MANAGEMENT

The Simulator Configuration Management System (CMS) shall provide control over the simulator configurations and design data bases to ensure that each simulator can effectively support the training curriculum, that regulatory commitments are satisfied and that NU's corporate commitment to cost containment is followed.

6.3.1 Simulator Design Data Base

The specific reference plant data which forms the basis for the current Simulator Hardware Configuration and Software Models shall be identified, collected and validated. The collected data shall be stored in specific locations, dependent upon its type, and responsibility for its maintenance and access shall be assigned to specific individuals. Existing NU documentation programs shall be used wherever possible to interface with the data. (See Figure 7.5 for an overview of this process.)

Rev.: 1 Date: 5/11/89 Page: 6 of 9 NSEM-1.02

6.3.2 Simulator Documentation

Simulator-specific documentation is needed for certification and maintenance. This documentation shall be controlled and updated, but shall not be considered to be part of the Simulator Design Data Base.

6.3.2.1 Simulator System Documentation Manuals

These manuals shall provide design materials for each simulated system model and shall be maintained through vendor support software in accordance with the Simulator System Documentation Standard.

6.3.2.2 Software

The simulation models shall be considered to be documentation and shall be updated and detailed in accordance with the Software Documentation Standards.

6.3.2.3 Simulator Test Results

This category shall include the completed construction ATP and all in-service performance and operability tests.

6.3.2.4 Closed Simulator Design Changes (SDCs)

Following update of the Simulator Design Data Base, closed SDCs shall be maintained in a historical file.

6.3.3 Modification Control

A process shall be employed which controls Simulator Configuration and complies with NRC regulations and industry standards. (See Figure 7.6 for an overview of this process.)

6.3.3.1 Procedures

A set of procedures shall be implemented to establish control over the coordination, resolution and documentation of identified differences between the simulator and the reference plant, and to maintain the integrity of simulator software, hardware and design data base.

Rev.: 1 Date: 5/11/89 Page: 7 of 9 NSEM-1.02


6.3.3.2 Simulator Configuration Control Committee (SCCC)

A committee shall be established which is responsible for overall simulator design control and the management of all NTD resources involved in the simulator modification effort. The SCCC shall comprise representatives from both the Operator Training Branch and the Simulator Technical Support Branch and shall be chaired by the Director, Nuclear Training.

6.3.3.3 Simulator Design Change Tracking

A computer-based data retrieval program shall be used to track the status of all identified simulator discrepancies, and a report shall be generated on a weekly basis. This report shall contain updated, pertinent information useful to OTB and STSB in the conduct of simulator modification activities.
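The weekly report described above is essentially a filtered, sorted view of the discrepancy data. The following is a minimal sketch of that idea; the field names, branch assignments and DR numbers are invented for illustration and are not taken from the actual NU tracking program:

```python
from datetime import date

# Hypothetical discrepancy records; real fields and DR numbers would
# come from the tracking database.
discrepancies = [
    {"dr": "DR-89-014", "status": "open",   "branch": "STSB", "opened": date(1989, 3, 2)},
    {"dr": "DR-89-021", "status": "closed", "branch": "OTB",  "opened": date(1989, 4, 1)},
    {"dr": "DR-89-030", "status": "open",   "branch": "OTB",  "opened": date(1989, 5, 9)},
]

def weekly_report(items):
    """Return open discrepancies, oldest first, as a weekly report might."""
    open_items = [d for d in items if d["status"] == "open"]
    return sorted(open_items, key=lambda d: d["opened"])

for d in weekly_report(discrepancies):
    print(d["dr"], d["branch"], d["opened"].isoformat())
```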

6.3.4 Expansion of the Scope of Simulation

Outside events which have the capability for affecting the training programs and/or trainer configuration shall be periodically monitored. (Refer to Figure 7.7 for details.)

6.4 IMPLEMENTATION

The Certification Program shall be implemented first on the MP2 trainer. Procedures and techniques will be developed and refined there, and then the process shall continue on the other trainers. These procedures shall be structured such that they are applicable to all four simulators; only true plant-specific or simulator-specific functions and data should be unique.

This approach will limit the effort lost on incorrect or inappropriate methodologies, enforce conformity and consistency as the procedures and methodology are applied from one trainer to the next, and allow for final definition of regulatory interpretations.


7.0 FIGURES

7.1 Certification Program Overview
7.2 Scope of Simulation
7.3 Trainer Validation
7.4 Simulator Performance Testing
7.5 Simulator Design Data Base
7.6 Modification Control
7.7 Expansion of Scope of Simulation

8.0 ATTACHMENTS

8.1 Nuclear Simulator Engineering Manual Procedures
8.2 Marginal Note Directory


ATTACHMENT 8.1

NUCLEAR SIMULATOR ENGINEERING MANUAL PROCEDURES

SECTION 1
STSB NSEM-1.01, CONTROL OF THE NUCLEAR SIMULATOR ENGINEERING MANUAL
STSB NSEM-1.02, OVERVIEW OF NU'S CERTIFICATION PROGRAM

SECTION 2
OTB NSEM-2.01, DEFINING TRAINING REQUIREMENTS
OTB NSEM-2.02, DEFINING THE "CERTIFIED" TRAINER
STSB NSEM-2.03, COMPARISON OF SIMULATION DIAGRAMS TO SIMULATION MODELS

SECTION 3
STSB NSEM-3.01, DEFINITION AND CONTROL OF THE SIMULATOR DATA BASE
STSB NSEM-3.02, CONTROL OF SIMULATOR DOCUMENTATION


SECTION 4
OTB NSEM-4.01, SYSTEM TESTS
OTB NSEM-4.02, CERTIFIED INITIAL CONDITIONS
OTB NSEM-4.03, CERTIFIED REMOTE FUNCTIONS
OTB NSEM-4.04, MAJOR MALFUNCTION TESTING
OTB NSEM-4.05, MINOR MALFUNCTION TESTING
OTB/STSB NSEM-4.07, MASTER TESTING SCHEDULE
OTB/STSB NSEM-4.08, SIMULATOR OPERATING LIMITS
OTB NSEM-4.09, SIMULATOR OPERABILITY TESTING
OTB NSEM-4.10, NORMAL OPERATIONS
OTB NSEM-4.11, INSTRUCTOR STATION INTERFACE
STSB NSEM-4.12, PHYSICAL FIDELITY VERIFICATION
STSB NSEM-4.13, REAL-TIME SIMULATION VERIFICATION


SECTION 5
STSB NSEM-5.01, SIMULATOR MODIFICATION CONTROL PROCEDURE
OTB NSEM-5.02, RETEST GUIDELINES

SECTION 6
OTB NSEM-6.01, STUDENT FEEDBACK
OTB NSEM-6.02, DEVELOPMENT OF NEW SIMULATOR GUIDES
OTB NSEM-6.03, COLLECTION OF PLANT DATA
OTB NSEM-6.04, MAJOR PLANT MODIFICATIONS


ATTACHMENT 8.2 MARGINAL NOTE DIRECTORY

1. Incorporated latest references.
2. Added definitions for PDCR, LER and SOER.
3. Changed wording to reflect progression of program out of development phase.
4. Added figures and attachment sections inadvertently omitted from previous revision.
5. Deleted extraneous procedure references.


[Figure 7.1 - Certification Program Overview: flowchart; graphic not recoverable from the scanned original.]

[Figure 7.2 - Scope of Simulation: flowchart; graphic not recoverable from the scanned original.]

[Figure 7.3 - Trainer Validation / Verification: flowchart keyed to NSEM-4.01 through NSEM-4.13 and ANSI/ANS-3.5-1985 test criteria; graphic not recoverable from the scanned original.]

[Figure 7.4 - Simulator Performance Testing: flowchart; graphic not recoverable from the scanned original.]
l ll! f(I

FIGURE 7.5 SIMULATOR DESIGN DATA BASE

((')

O OPROCEDURES l

,7 7 OPEN PLANT l I

' $0C'S DRAWINCS l

STS ENDOR DEMNE GENERIC M l

I TECH I DESIGN DATASAGE l O

CONTENT ANSVANS-3.5,1988 (SECTION S.1 AND APPENotX A2)

HP0 ccoo PRACnCE SIWULATOR C0 GURADON

,,A ,

APPEN0lX jf $75

/

PPC N

\ WANUALS ATA y A$RWS COLLECT UNIT-SPECIMC DATA STS 1f vAuDATE DATA STSO/ASRWS gggg.3.02 UNIT-SPECinC DESIGN CONTROL UPDATE BACKLOG DATAaASE m AND OF COWPLETED -

" WMNTENANCE WOOlnCADONS ANSVANS-3.S.1985 (UPDATE)

(APPCNDIX A2) h Rev.;

/'

siuuLAn0N macRAus

soc PACxACEs Date; 1

5/11/89

"' Page; 7.5-1 of I I l

{

V NSEM - 1.02 i

I

+ saastates errrum poCcuanTmoR mancALs L-__-__-_-. - _ _

l L

ricUns 7.a

@M l

DEnCIENCY DATAf  ;- REPORTS C SUPPORTING DUE Daft j (DR'S)

) MSEM-5.01 p ... . .... . . . . .

I I STATUS l I EPORTC _

3 I

I I g

I U l NAR0eARE ONLv uoos 3gygag l I I (SOC'S)

I g NAR0eARE I WApffENANCE I p is l

DtFT.

ESnWA _

50 dE As cg)e ,

I l

~

I M ORMR pyg ag4gyggg ORDER l , .

i I

ROAEW r0R I I

COMPLETENESS AND ADO WISSING DATA, SCCC l

' I

[ LONC-RANGE PLANNING I '

I l (BUDGETARY RESOURCE CONrUCTS CONSIDERAn0NS l I

I g I STATUS -

g REPORTS ' g STSe I I STSe it

\- l [ NARDWARE D M NTORY

}

SlWULATOR UPDATE REQUIREMDdTS I

j>

ANSVANS-3.5.1985 I l

( ) (SECn0NS S.2 AND 5.3)

I SDC I g

SCHEDUUNG I I STSg orb I

I CURRENT y 3 SOFnWARE TRAINING I

CNCINEUtNC NEEDS l CONSIOCRAn0NS I I UAms -

I l I

I REPORTS i l l

I i ,....._...........I. 4 I NSEW-5.02

' I s: pERr0RWANCE i TRAINER g

E ' REQUIREMENTS l UO0tnCAT10NS I AN ANS-3.5.1983 g 1 I", TION 5.4.1)

~

I, n0VRE 4 l

I I I I I I 3

STATUS - I REPORTS g

' l I ....1 6 .. . ... .. .. ..

STSE/ASRWS STS8/ASRW&'OTB U

UPDATE S1WULATOR pER OR STS DESGN DATABASE

/,

(

ANSVANS-3.5.1985 (SECn0N S.1) k (SECn0N 5.4)ANSVANS-3.5.1985 )

% nGURE 4 Rev.: 1 SiWULATOR UpoATE DESioN oATAe4SE NUREc t2Se EvAtuAnow PRoccouRE FOR SiWuunos rActunES Date: 5/11/89 Fage: 7.6-1 Of 1 ,

ANSVmS-3.5.1985 (SECn0NS 5.2 MD 5.3.

APPEN04X A4). {

NSEM - 1.02

l

}l '

9 I 2 81

/f7 0 1

1 O 1 _

1

/ . _

57 4 .ee M 0 vt g E 6 eaa S SC , RDP N

- ). W

  • R PN T

C P.E E .

S B 7 L N D PR M T DCE O M U

II8IlIII gIII lg8II, RRA L G 2 _ _ OCL UN 0 .P CT I 6 _

J

_ A Mi eE R S I

R RTE M_

E _

U C

S_ D' N_ R EF S OIDI I

  • _ T UOF A

L G R R .

T

_ U M N NS. C T

E

_ B T

S

' S E

STOOP C MS.

I I _ O R _

A ET G R S _

_ E -C

  • _ ON NO SAL AHAE N

T R M "Y SI T _

_ AR NT DO .PCRC PF AS

_ S

_ I

  • EN S.E PO UINDN L _

A

_ R E*

LRU R OC

_ CDCT 7 WDER EPCN E S

N _

_ M E

O R

T O

I R P T

A I

U _

M I

_ S LY OT S

^'

_ R A C R E S I

S _

NR E NI T E _

f l

f Y

CS

_ ER I

A I NTRS) _

F CO R U

_ E

_ S C T B l

O f l C O I

R' _

)I - O A P

_ EET WP _.

I F

I E (O _

I ER

_ RA E A UR V C I

_ D

'}% E

_ E S E

_ NP R P

7

_ EO _

e O _

- m r

u C _

YD D _

i g S _

DIE AF T F E

I _

ET I OI T _

F _

F _

RR LE N ER _

O _

A C C _

N _

O I

_ S KT _

S _ . D S L CS _

_ S NE EU N

_ FS A D H _

R* D A _ ICE O C' S . RM .E A DL _

P _

E SWMEL N FDE RO _

X _

I F

E MRA T S UQ N I

R T _ 1 0

E _ D H S EY R O C 6

C K 5 8

9)

_

  • A _

_ 3 M B 1 2 _

- 0 E . _

_ W A u.

_ 6 E T T S

H

5. 5 _

_ I NA t.

t 3N _

V>D

_ - t

-O

_ M E I

_ E RlPEC T ST

_ _ S N E

N A E C

4 t

D N E EN A D / S I

_ PS - A Nu U S(

E T N S O - T E R S A D CR EE FO

_ T EAU t

T L F Rut G R I

_ - L E E _

WuG _ OR P t

C TI B S NI T N -

( O EWIA -

NR T -

,,I8II 8 IIII IIIIl _

j-flg

(~ .

.\

j ,

NORTHEAST UTILITIES NUCLEAR SIMULATOR ENGINEERING MANUAL NSEM - 2.01

\.

DEFINING TRAINING REQUIREMENTS

. I Approved:

D @ , Nuclear Training Revision: 0 Date: 2/9/88 l

SCCC Meeting No: 88-002 O

m._ .1

l i

~

C)/ .

1.0 PURPOSE' ..

The purpose of this procedure is to define the scope of simulation required to support performance-based nuclear power plant operator training on the Northeast Utilities ,

simulators for Millstone 1, 2, 3 and Connecticut Yankee.

The scope of simulation will be defined based on training requirements. The training requirements used to define the j scope of simulation will be a result of a systematic .i' approach to training process. The output of this procedure will be a list of hardware (panels, components & instru- I ments) and software (a corrected'and updated set of Simulator System Diagrams) which comprise the required scope of simulation to support the training requirements.

2.0 APPLICABILITY ,

This procedure applies to the Nuclear Training Department .

(NTD), including the Operator Training Branch (OTB), the Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions ,,

in support of the NU Simulator Certification Program. -

/~

i

3.0 REFERENCES

3.1 ANSI /ANS3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149-Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification 1 by endorsing ANSI /ANS-3.5, 1985 with some additional

- requirements.

3.3 NUREG 1258 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.4 Electric Power Research Institute (EPRI),

Qualification Plan for the Millstone II Simulator (Rev. 0 12/85), prepared by General Physics Corporation.

3.5 NU Simulator Certification Program (draft) 3.6 INPO 86-026 Guidelines for Simulator Training, October 1986

/3 Rev.: 0 Date: 2/9/08 Page: 1 of 14 NSEM-2.01 l

l -- - _ .____ -

O

\,,/ 3.7 10CFR 55.45, Operating Tests 3.8' NTDD-l[,-Simulator Certification and Configuration

-Management Control 4.0 DEFINITIONS 4.1 Deficiency Report (DR) -- a form.(STS-BI-FIA) used by l the Operator Training Branch (OTB) and the Simulator Technical Support Group (STS) to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Simulator Instructor Guide (SIG) - training document

.providing guidance and instructions for the conduct of. simulator training.

4.3 Simulation System Diagram (SSD) -

functional -

representation of the simulator modeling for a given .

system.

4.4 Task Selection / Training Planning Report (TS/TPR) -

  • j matrix representation of the-tasks for a given job ..

specifying the' training environment and lesson plan numbers for providing training on each.

k. 1.5 4 Simulated Hardware List - a list of all controls, indications, and annunciators physically represented on the simulator, not including the instructor's station.

4.6 Simulated Systems /Flowpaths List - a compilation of the-simulator's Simulation System Diagrams with systems /flowpaths identified in tabular form attached to each. (Flowpaths may be indicated in hi-liter on the SSD for clarity)

5.0 RESPONSIBILITIES 5.1 Assistant-Supervisor simulator Training (ASST) l 5.1.1 Responsible for assigning Simulator Instructors to conduct performance-based review / analysis.

5.1.2 Responsible for assigning Simulator Instructors to perform independent resolution of discrepancies.

O Rev.: 0 Date: 2/9/88 Page: 2 of 14 NSEM-2.01

- _ _ - _ _ _ _ - _ _ - - - _ _ _ _ _ _ _ . - _-_ - _ --_ _ __ ~

./^

i -

5.1.3 Responsible for reviewing and approving the eutputs of this procedure, defining the scope

, of simulation for hardware and software systems required for simulator certification.

5.2 Simulator Instructors 5.2.1 Responsible for conducting performance-based review / analysis activities.

5.2.2 Responsible for writing Deficiency Reports (DR) as required.

6.0 INSTRUCTIONS 6.1 validating the simulator Instructor Guides (SIG)

NOTE: The purpose of this section is to ensure that the -

Simulator Guides to be used encompass all the

  • training requirements specified by the TS/TPR.

~

I 6.1.1 Assigned simulator instructors shall validate --

the TS/TPR to the SIG's. ,

r- 6.1.1.1 Obtain copies of the unit's TS/TPR

( ,}f for Reactor Operator (RO) and Senior Reactor Operator (SRO).

6.1.1.2 Ensure tasks selected for simulator training are referenced to at least one SIG. l 6.1.1.3 Ensure tasks not selected for simulator training contain "NTR" (No Training Required) in the simulator column.

6.1.1.4 Document the review by signing in the appropriate location on NSEM-2.01, Form 7.1, Section 1.

6.1.2 Assigned simulator instructors shall validate the SIG's against ANSI /ANS-3.5.

4 C/ Rev.: 0 l Date: 2/9/88 Page: 3 of 14 i NSEM-2.01 I

~

Obtain copies of all of the unit's

_f.1.2.1 SIG's in' existence as of the date of this review, Licensed Operator Initial Training (LOIT), Licensed Operator Upgrade Training (LOUT),

Licensed Operator Requalification Training (LORT), and any others).

6.1.2.2 On NSEM-2.01, Form 7.1, list all of the SIG's to be reviewed under Section 2.

6.1.2.3 Review each SIG against the list of ANS-3.5 evolutions and transients listed on NSEM-2.01, Form 7.1, Section 3.

6.1.2.4 For each ANS-3.5 requirement .

satisfied by a SIG, list the guide ,

number in the corresponding location .

on NSEM-2.01, Form 7.1, Section 3.

NOTE: Some SIG's may satisfy more than one _ _ ,

ANS-3.5 requirement. The SIG should -

be listed for each.

6.1.2.5 After all SIG's have been reviewed, check NSEM-2.01, Form 7.1, Section 3 to ensure each ANS-3.5 requirement is referenced to at least one SIG.

6.1.2.6 ror any ANS-3.5 requirement not referenced to a SIG, list the plant operating procedure or other document used to accomplish the evolution.

6.1.2.7 Sign and date the completed NSEM-2.01, Form 7.1.

6.2 Defining the scope of simulation for Hardware Systems ,

NOTE: The purpose of this section is to compile a list of hardware necessary to support training requirements.

6.2.1 Compile a list of all hardware items physically represented on the simulator.

NOTE: Up-to-date existing hardware lists or panel layout drawings can be used as acceptable alternatives.

Rev.: 0 Date: 2/9/88 '

Page: 4 of 14 NSEM-2.01

f l' *

. f% ' .

\- 6.2.1.1 For ease of use, group hardware items

-- by panel and by system. Component listings should be by noun name or equipment number.

NOTE: When listing ha'ndswitches and controllers, the associated indicating lights, positien indicating meters, and input / output demand meters are automatically incluJed and need not be identified.

Lamicoid nameplates are automatically included with all hardware items and need not be identified. Do not specify switch positions or .

functions. Proper operation will be verified in NSEM-4.01.

6.2.1.2 where multiple similar components -

exist, only a single entry is required. Example: RCP 'A' (B,C,D) 6.2.1.3 Annunciator layout drawings or the Unit's Control Room Annunciator Book --

(CRAB) should be used for listing rs annunciators.

b 6.2.2 Assign unique code numbers to the list of hardware items compiled in 6.2.1.

6.2.2.1 Annunciator windows should be coded using the grid location system.

Example: C01-A4 (panel C01, 1st row from top, 4th column from left) 6.2.2.2 Hardware should be coded by panel, system, and order of appearance.

Example: H1-1-1 (panel C01, 1st system listed, 1st component listed) 6.2.3 Review the SIG"s and procedures Fpecified on NSEM-2.01, Form 7.1, Section 2 to determine hardware simulation required to support

+

training.

6;2.3.1 Initiate an NSEM-2.01, Form 7.2 for each SIG and procedure to be reviewed.

l O Rev.: 0 Date: 2/9/88 Page: 5 of 14 NSEM-2.01

t 0-l E

(^l .

L

~

6.2.3.2 When determining hardware items, L identify all associated equipment and

. indication required for proper operation. Example: ammeters, level, pressure, temperature, and flow instruments, and required dampers and valves.

6.2.3.3 When an item from the Simulated Hardware List is identified for the first time, place a checkmark in the

" Required" column on the list of hardware or mark off the annunciator window grid location, respectively.

Also, list tho hardware item on the NSEM-2.01, Form 7.2 for the document being reviewed.

6.2.3.4 If an annunciator window or hardware ,

item is identified which does not ,

appear on the Simulated Hardware List:

a. Verify that the item was not -

inadvertently omitted. If so, add and code the item, or

(

b. Initiate an NSEM-2.01, Form 7.3 identifying the item by noun name art equipment number and the SIG and/or procedure requiring the item.

'5.2.4 After reviewing all SIG's and procedures specified on NSEM-2.01, Form 7.1, Section 2, ,

evaluate the remaining hardware items and l I

annunciators not checked off during the performance of 6.2.3.

6.2.4.1 Equipment which could be used by trainees in performing diagnosis or which might be used or operated by control room operators should be marked off with a single asterisk in the " Required" column of the hardware list or annunciator location grid.

/

(

Rev.: 0 Date: 2/9/88 Page: 6 of 14 NSEM-2.01 ,

       "'                                  Equipment and annunciators whose

_6.2.4.2 presence is required solely for

                              -            support of physical fidelity should be checked off with'a double asterisk.

6.2.4.3 Any remaining hardware items and/or annunciators should be listed by equipment name and number and hardware code number or annunciator grid location on NSEM-2.01, Form 7.4. 6.2.5 Perform an independent review to resolve non-modeled hardware items listed on NSEM-2.01, Form'7.3. 6.2.5.1 Two independent instructors should review the SIG's and/or procedures . specified to determine if acceptable , alternate equipment exists. . i6.2.s.2 Items so resolved should be checked - off and the justification specified. ,, 6.2.5.3 Non-modeled items determined to be t required shall be DR'ed and the DR number.should be entered on the NSEM-2.01, Form 7.3. 6.2.5.4 A line item on the close-out for DR's so initiated shall be to update the NSEM-2.01, Form 7.3 and the Simulated Hardware Litt. 6.2.6 Perform an independent review to resolve simulator hardware items listed on NSEM-2.01, Form 7.4. 6.2.6.1 Two independent instructors should , evaluate the hardware items listed to determine if their presence could have a negative impact on training. 6.2.6.2 Those items determined to have a negative impact on training should be DR'ed for removal. The DR number should be entered on NSEM-2.01, Form < 7.4.  ! Rev.: 0 Date: 2/9/88 Page: 7 of 14 NSEM-2.01 L- L __ _-_ - _ _ _ _ _ _ e

( -

                                                                              ~6.2.6.3  A line item en the close-out for DR's so initiated shall be to update the
                                                                            -           NSEM-2.01, Form 7.4 and the simulated Hardware List.

6.2.6.4 Remaining items determined not to have a negative impact on training i should be checked off on the NSEM-2.01, Form 7.4 and marked with a triple asterisk on the Simulated Hardware List. 6.3 validating the simulation System Diagrams (SSD's,) NOTE: The purpose of this section is to ensure that existing Simulator System Drawings contain no errors i relative to the more complete plant Piping and Instrumentation Drawings (P&ID's). . I 6.3.1 Assigned simulator instructors shall validate . the SSD's to the plant P and ID's and l electrical distribution drawings. -

                                                                                                                                    ~~

6.3.1.1 Obtain a set of the unit's systems - P and ID's and electrical () distribution drawings. 6.3.1.2 Obtain a complete set of the simulator's SSD's. 6.3.1.3 All electrical distribution and fluid system SSD's should be compared with their respective plant drawing (s) for the following:

a. Functional similarity of flowpaths.
b. Functional similarity of components and component .

locations in the system.

c. Functional similarity of system interfaces.
d. Functional similarity of instrumentation with control room indication, alarm function, or control function.

i Rev.: 0 Date: 2/9/88 Page: 8 of 14 NSEM-2.01

L

  /7)
                                              ~

e. Functional similarity of computer points.

NOTE: SSD's depicting logic and modeling for electronic and electric control systems do not compare readily with plant schematics and wiring diagrams. The discernible effects of functional similarity will be verified during system testing.

6.3.1.4 Any identified discrepancies should be noted on an NSEM-2.01, Form 7.5 and maintained with the SSD during the software review.

6.4 Defining the Scope of Simulation for Software Systems

NOTE: The purpose of this section is to compile a list of corrected and updated Simulator System Drawings necessary to support training requirements.

                                                                                                                       ~

6.4.1 On the electrical distribution and fluid fN system SSD's, identify the flowpaths that would be used during normal, abnormal, (' ') transient, or emergency plant operating conditions (flowpaths may be traced in hi-liter for clarity). 6.4.1.1 Compile a list of identified flowpaths for each SSD. Code flowpaths by system designator and numerical order. Example: RB-3, RBCCW, third flowpath listed. NOTE: Flowpaths which involve more than one system or SSD should be listed for one system only and referenced on the SSD lists with which it interfaces. 6.4.1.2 Flowpath-lists should be maintained with their respective SSD for use during the SIG review. 6.4.2 SSD's depicting modeling for electric and electronic control systems should be listed by system designator for use during performance of 6.4.3, e.g.: RPS, ESAS, RRS. Rev.: 0 Date: 2/9/88 Page: 9 of 14 NSEM-2.01

(

         \                              NOTE:        - SSD's depicting simulator modeling concepts
                                                       .s'hould not be considered for this review, e.g.:   core nodalization, thermodynamic relationships, model hierarchies.

6.4.3 Assigned instructors shall review all of the SIG's and procedures specified on NSEM-2.01, Form 7.1, Section 2 to identify systems and flowpaths required to support training. NOTE: Systems and flowpaths in use at an initial condition need not be identified. The status of IC's will be certified in NSEM-4.02. 6.4.3.1 Initiate an NSEM-2.01, Form 7.6 for each SIG and procedure to be reviewed. 6.4.3.2 Review each SIG and procedure to - identify actions which will change - existing flowpaths, crecte new . flowpaths, or initiate flowpaths in previously idle systems. NOTE: When identifying flowpaths, consider rs those which would be exercised by ( correct and reasonably incorrect trainee actions. Absurd, but I physically possible, flow paths { should not be considered. 6.4.3.3 Review the respective SSD's and flowpath lists to ensure the flowpath and all required components are represented. t 6.4.3.4 On the simulated Systems /rlowpaths List, check off each flowpath/ system when identified for the first time. Also, by code, list the flowpath/ system on the NSEM-2.01, Form 7.6 for

                        '-                                 -       the SIG or procedure being reviewed.

6.4.3.5 If a flowpath or component is identi- I I fled which does not appear on the flowpath list or the respective SSD, perform the following: )

a. Re-evaluate the SIG or procedure to verify the flowpath determination is correct.

l Rev.: 0  ! l Date: 2/9/88 Page: 10 of 14 NSEM-2.01

                                                                                                                      )

[f '~

                                                   .          b. Re-evaluate the SSD to ensure the
                                                ~

flowpath was not omitted. If omitted, identify, code and proceed.

c. Evaluate plant drawings to ensure the flowpath exists.
d. On a flowpath diserapancy form, NSEM-2.01, Form 7.7, list the SIG or procedure number, describe the flowpath or component identified, and specify the determination of the validity per step c.

6.4.3.6 If an SSD has an NSEM-2.01, Form 7.5 attached, review the functional dissimilarities specified for adverse impact on flowpaths identified. ,"

a. If no adverse impact, proceed.
b. If an adverse impact is --

determined, list the SIG or . procedure number in the appropriate column on the

      -['                                                           NSEM-2.01, Form 7.5.

6.4.4 After all SIG's and procedures have been reviewed, list all unchecked systems and flow-paths on NSEM-2.01, Form 7.8. 6.4.5 Perform an independent review to resolve any functional dissimilarities identified on the NSEM-2.01, Form 7.5 for each SSD. 6.4.5.1 Two independent instructors should review the SSD and respective plant drawings to verify that the functional dissimilarity exists. If not, close out the line item. 6.4.5.2 If an adverse impact is listed against a verified item, evaluate the SIG or procedure to verify the adverse impact. 6.4.5.3 Evaluate functional dissimilarities to determine if they produce any incorrect effects which would be discernible to a trainee. l O Rev.: 0 Date: 2/9/88 Page: 11 of 14 NSEM-2.01

l' .

               ~

I k- 6.4.5.4 If a functional dissimilarity has no. adverse impact and does not produce

                .           any incorrect discernible effects, the line item may be closed out.

6.4.5.5 A DR should be issued against any functional dissimilarity not closed out under 6.4.4.4. A line item on the DR closecut should be to update the SSD and close out the line item on the respective NSEM-2.01, Form 7.5. 6.4.6 Perform an independent review to resolve flowpath discrepancies listed on NSEM-2.01, Form 7.7. 6.4.6.1 Two instructors should reverify the flowpath discrepancy per step - 6.4.3.5. , 6.4.6.2 If the SIG or procedure requires an actual flowpath which is not modeled, . submit a DR to include the flowpath/ component. A line item on the DR , ' closecut should be to update the SSD,

  /~                          the NSEM-2.01, Form 7.7 and the Simulated Systems /Flowpaths List.

( )) 6.4.6.3 If a SIG requires a non-existent flowpath, an NTM 2.06 Form 7.2 should be submitted to revise the SIG and update the NSEM-2.01, Form 7.7. 6.4.7 Perform an independent review to disposition systems and flowpaths listed on NSEM-2.01, Form 7.8. 6.4.7.1 Two instructors should reevaluate each item using the criteria of Steps 6.4.1 and 6.4.3.2. 6.4.7.2 Items satisfying any of the specified criteria should be checked off on the Systems /Flowpaths List and the resolution specified on NSEM-2.01, Form 7.8. O Rev.: 0 Date: 2/9/88 Page: 12 of 14 NSEM-2.01

6.4.7.3 Items failing to meet any of the specified criteria should be marked CNR (certification not required) on the Systems/Flowpaths List; applicable portions of SSD's should be circled in red and annotated CNR, and the resolution specified on NSEM-2.01, Form 7.8.

6.5 Disposition of Forms Generated

6.5.1 Forward completed originals of the following to the Assistant Supervisor - Simulator Training (ASST) for review and approval:

6.5.1.1 NSEM-2.01, Form 7.1
6.5.1.2 Simulated Hardware List
6.5.1.3 Copies of all SIG's and procedures specified on NSEM-2.01, Form 7.1, with their respective NSEM-2.01, Form 7.2 and Form 7.6 attached.
6.5.1.4 NSEM-2.01, Form 7.3
6.5.1.5 NSEM-2.01, Form 7.4
6.5.1.6 Simulated Systems/Flowpaths List with the applicable copies of NSEM-2.01, Form 7.5 attached.
6.5.1.7 NSEM-2.01, Form 7.7

6.5.2 The ASST will forward the approved originals specified in 6.5.1 to Controlled Document Storage for retention with simulator certification records.

6.5.3 Copies of the approved Simulated Hardware List and Simulated Systems/Flowpaths List shall be provided as inputs to NSEM-4.01.

6.5.4 Copies of the approved Simulator Hardware List and Simulator System Diagrams (with corrections) shall be transmitted via formal memo to STSB for verification of hardware physical fidelity and software model verification.

Rev.: 0  Date: 2/9/88  Page: 13 of 14  NSEM-2.01

7.0 FORMS

7.1 SIG Validation Form
7.2 SIG/Procedure Required Hardware/Annunciator Form
7.3 Hardware Discrepancy Form
7.4 Modeled Hardware Not Required for Training Form
7.5 SSD Functional Dissimilarity Form
7.6 SIG/Procedure Required System/Flowpath Form
7.7 Software Discrepancy Form
7.8 Systems/Flowpaths Not Required for Training Form

8.0 ATTACHMENTS

NONE

Rev.: 0  Date: 2/9/88  Page: 14 of 14  NSEM-2.01

Form 7.1
SIG VALIDATION

Section 1

I have reviewed the RO and SRO TS/TPR for the training programs. All tasks selected for simulator training are referenced to at least one SIG. All tasks not selected for simulator training have "NTR" (No Training Required) entered in the simulator column.

Reviewer's Signature                         Date

Section 2

The below listed SIG's for the LOIT, LOUT, and LORT programs are to be reviewed per NSEM-2.01:

Rev: 0  Date: 2/9/88  Page: 7.1-1 of 10  NSEM-2.01

Section 2 (Continuation)

Rev: 0  Date: 2/9/88  Page: 7.1-2 of 10  NSEM-2.01

Section 3

ANS 3.5 Normal Plant Evolutions:

1. Plant Startup: Cold Shutdown to Hot Standby:

2. Nuclear Startup: Hot Standby (to 100% Power):

3. Turbine Startup and Generator Synchronization:

4. Reactor Trip followed by Recovery to 100% Power:

Rev: 0  Date: 2/9/88  Page: 7.1-3 of 10  NSEM-2.01

5. Operations at Hot Standby:

6. Load Changes:

7. Startup, Shutdown, and Power Operations with Less than Full Reactor Coolant Flow (If allowed by Technical Specifications):

8. Plant Shutdown from 100% Power to Hot Standby and Cooldown to Cold Shutdown:

Rev: 0  Date: 2/9/88  Page: 7.1-4 of 10  NSEM-2.01

9. Core Performance Testing (calorimetric, SDM determination, reactivity coefficient measurements, rod worth testing):

10. Surveillance Testing on Engineering Safeguards Facility Equipment:

ANS 3.5 Malfunctions:

1. Loss of Coolant: Steam Generator Tube Rupture; inside containment, outside containment; large break, small break; saturated Reactor Coolant System; failure of safeties and Power Operated Relief Valves:

2. Loss of Instrument Air:

Rev: 0  Date: 2/9/88  Page: 7.1-5 of 10  NSEM-2.01

3. Loss of/Degraded Electrical Power:

4. Loss of Reactor Coolant System Flow:

5. Loss of Vacuum:

6. Loss of Service Water:

7. Loss of Shutdown Cooling, Residual Heat Removal:

Rev: 0  Date: 2/9/88  Page: 7.1-6 of 10  NSEM-2.01

8. Loss of Closed Cooling:

9. Loss of Normal Feedwater:

10. Loss of All Feedwater:

11. Loss of Reactor Protection System Channel:

12. Control Rod Failures:

Rev: 0  Date: 2/9/88  Page: 7.1-7 of 10  NSEM-2.01

13. Inability to Drive Control Rods:

14. Fuel Clad Failures:

15. Turbine Trip:

16. Generator Trip:

17. Failure in Automatic Control Systems affecting Reactivity and Core Heat Removal:

Rev: 0  Date: 2/9/88  Page: 7.1-8 of 10  NSEM-2.01

18. Failure of Reactor Coolant System Pressure and Volume Control Systems:

19. Reactor Trip:

20. Main Steam/Feed Line Break (in/out of containment):

21. Nuclear Instrumentation Failures:

22. Process Instrumentation Alarms and Failures:

Rev: 0  Date: 2/9/88  Page: 7.1-9 of 10  NSEM-2.01

23. Passive Failures in Engineered Safety Features or Emergency Feedwater:

24. Anticipated Transient Without Scram:

25. Reactor Pressure Control System Failure:

All of the SIG's listed in Section 2 have been reviewed against ANS 3.5 evolutions and transients per NSEM-2.01. All ANS 3.5 requirements are referenced to at least one SIG, or appropriate plant procedures or other documents have been identified and added to the SIG list in Section 2.

Reviewer's Signature                         Date

Approved by ASST                             Date

Rev: 0  Date: 2/9/88  Page: 7.1-10 of 10  NSEM-2.01

Form 7.2
SIG/PROCEDURE REQUIRED HARDWARE/ANNUNCIATORS

The following hardware and annunciators are required to support the accomplishment of SIG/Procedure ____________ on the Simulator:

Rev.: 0  Date: 2/9/88  Page: 7.2-1 of 1  NSEM-2.01

Form 7.3
HARDWARE DISCREPANCY

Equip. Name    Equip. No.    SIG/Proc. No.    Resolution

Approved by ASST                             Date

Rev.: 0  Date: 2/9/88  Page: 7.3-1 of 1  NSEM-2.01

Form 7.4
MODELED HARDWARE NOT REQUIRED FOR TRAINING

Code No.    Equip. Name    Equip. No.    Resolution

Approved by ASST                             Date

Rev.: 0  Date: 2/9/88  Page: 7.4-1 of 1  NSEM-2.01

Form 7.5
SSD FUNCTIONAL DISSIMILARITY

SSD No.    Code No.    SIG/Proc. No.    Description    Resolution

Approved by ASST                             Date

Rev.: 0  Date: 2/9/88  Page: 7.5-1 of 1  NSEM-2.01

Form 7.6
SIG/PROCEDURE REQUIRED SYSTEM/FLOWPATH

The following systems/flowpaths are required to support the accomplishment of SIG/Procedure ____________ on the simulator:

Rev.: 0  Date: 2/9/88  Page: 7.6-1 of 1  NSEM-2.01

Form 7.7
SOFTWARE DISCREPANCY

Sys./Flowpath    Description    SIG/Proc. No.    Valid (Y/N)    Resolution

Approved by ASST                             Date

Rev.: 0  Date: 2/9/88  Page: 7.7-1 of 1  NSEM-2.01

Form 7.8
SYSTEMS/FLOWPATHS NOT REQUIRED FOR TRAINING

System/Flowpath Designation    Code    SSD No. (If Applicable)    Resolution

Approved by ASST                             Date

Rev.: 0  Date: 2/9/88  Page: 7.8-1 of 1  NSEM-2.01

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 2.02
DEFINING THE "CERTIFIED" TRAINER

Approved: Director, Nuclear Training

Revision: 0
Date: 2/9/88
SCCC Meeting No: 88-002


1.0 PURPOSE

The purpose of this procedure is to identify initial conditions (IC), remote functions (REM), and malfunctions (MALF) required to support performance-based nuclear power plant operator training on the Northeast Utilities simulators for Millstone 1, 2, 3 and Connecticut Yankee.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB) and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS-3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 NUREG 1258 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.4 Electric Power Research Institute (EPRI), Qualification Plan for the Millstone II Simulator (Rev. 0, 12/85), prepared by General Physics Corporation.

3.5 NU Simulator Certification Program (draft)

3.6 INPO Guidelines for Simulator Training (INPO 86-026), October 1986.

3.7 10CFR 55.45, Operating Tests

3.8 NTDD-17, Simulator Certification and Configuration Management Control

Rev.: 0  Date: 2/9/88  Page: 1 of 9  NSEM-2.02

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - a form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Group (STS) to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Simulator Instructor Guide (SIG) - training document providing guidance and instructions for the conduct of simulator training.

4.3 Simulation System Diagram (SSD) - functional representation of the simulator modeling for a given system.

4.4 Initial Condition (IC) - an operational status at which the simulator can be initialized. Included are time in core life, xenon, decay heat, power level, and system and component operational status.

4.5 Remote Function (REM) - an instructor-initiated input to the simulator model which will provide the same discernible effects as the corresponding manual operation in the reference plant.

4.6 Malfunction (MALF) - an instructor-initiated input to the simulator model which will provide the trainees with similar discernible effects (initial indications and response to corrective actions) as those of a corresponding equipment malfunction in the reference plant.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Simulator Training (ASST)

5.1.1 Responsible for assigning Simulator Instructors to conduct performance-based review/analysis.

5.1.2 Responsible for assigning Simulator Instructors to perform independent resolution of discrepancies.

5.1.3 Responsible for reviewing and approving the outputs of this procedure identifying the IC's, REM's, and MALF's to be certified.

Rev.: 0  Date: 2/9/88  Page: 2 of 9  NSEM-2.02

5.2 Simulator Instructors

5.2.1 Responsible for conducting performance-based review/analysis activities.

5.2.2 Responsible for writing Deficiency Reports (DR) as required.

6.0 INSTRUCTIONS

6.1 Identifying Required IC's

6.1.1 Assigned simulator instructors should review all SIG's, plant operating procedures, and other documents listed on NSEM-2.01, Form 7.1, Section 2 to determine initialization conditions required to support their performance.

6.1.1.1 On NSEM-2.02, Form 7.1, specify the following initialization requirements for each document reviewed:

a. Time in core life.
b. Operational mode/power level.
c. Xenon trend (stable, increase, decrease).
d. Time after startup/shutdown (if applicable).
e. Abnormal system/component alignments or status.
f. Other specified conditions (e.g., control rod position, turbine speed, main generator status, etc.)

6.1.2 Consolidate initialization requirements to develop a list of "basic" IC's.

6.1.2.1 Group initialization requirements by time in core life.

Rev.: 0  Date: 2/9/88  Page: 3 of 9  NSEM-2.02

6.1.2.2 Further subdivide these groups by operational mode, i.e.: startup/power, hot standby/hot shutdown, cold shutdown/shutdown cooling mode.

6.1.2.3 Further subdivide each group based on the following requirements:

a. Relative power level (Modes 1 and 2) or RCS temperature (Modes 3, 4, and 5).
b. Xenon levels and trend (Modes 1, 2, & 3).
c. Specified conditions requiring lengthy or excessive modifications.

NOTE: An initialization with control rods inserted may be grouped with a zero power condition with control rods partially withdrawn, since a manual reactor trip can be used to quickly establish the requirement. However, main steam in and out of the turbine building should be separated, since the time to effect either from the other is lengthy.

6.1.2.4 On NSEM-2.02, Form 7.2, fill in the required information for each of the final initialization groupings.

6.1.2.5 On NSEM-2.02, Form 7.2, add any initialization conditions which could not be readily developed from one of the above and might be required to support training and/or simulator testing.

6.2 Identify Required REM's

6.2.1 Obtain an up-to-date list or hardcopy printout of all REM's.

Rev.: 0  Date: 2/9/88  Page: 4 of 9  NSEM-2.02
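The consolidation in Steps 6.1.2.1 through 6.1.2.3 amounts to grouping per-document initialization requirements under a compound key (core life, mode, power/temperature band, xenon trend). A minimal illustrative sketch follows; the field names and SIG identifiers are hypothetical stand-ins for the columns of NSEM-2.02, Form 7.1, not values from the procedure:

```python
from collections import defaultdict

def consolidate_ics(requirements):
    """Group initialization requirements (6.1.2) into candidate
    "basic" IC's keyed by core life, operational mode, power band,
    and xenon trend. Each group becomes one row on Form 7.2."""
    groups = defaultdict(list)
    for req in requirements:
        key = (req["core_life"],     # 6.1.2.1: time in core life
               req["mode"],          # 6.1.2.2: operational mode
               req["power_band"],    # 6.1.2.3.a: power / RCS temperature
               req["xenon_trend"])   # 6.1.2.3.b: xenon level and trend
        groups[key].append(req["doc_no"])
    return dict(groups)

# Hypothetical review output: two documents share one IC, one stands alone.
reqs = [
    {"doc_no": "SIG-101", "core_life": "BOL", "mode": "startup/power",
     "power_band": "100%", "xenon_trend": "stable"},
    {"doc_no": "SIG-102", "core_life": "BOL", "mode": "startup/power",
     "power_band": "100%", "xenon_trend": "stable"},
    {"doc_no": "SIG-205", "core_life": "EOL", "mode": "hot standby",
     "power_band": "0%", "xenon_trend": "decrease"},
]
basic_ics = consolidate_ics(reqs)
```

The special-case judgments in the NOTE (e.g., grouping rods-inserted with rods-partially-withdrawn conditions) remain manual and are not captured by a mechanical key.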

6.2.2 Assigned simulator instructors should review all SIG's, plant operating procedures, and other documents listed on NSEM-2.01, Form 7.1, Section 2 to identify those REM's required to support their performance.

6.2.3 As each REM is identified, it should be checked off on the REM list.

NOTE: Variable and tri-state REM's should be fully checked off regardless of the value or status specified.

6.2.4 If performance of an evolution, specified in a 6.2.2 document, requires a remote action which does not appear on the REM list, it should be entered in the appropriate columns on NSEM-2.02, Form 7.3 for later resolution.

6.2.5 After all documents specified in 6.2.2 have been reviewed, evaluate the unchecked REM's as follows:

6.2.5.1 Check off REM's for simulator components similar to those identified in 6.2.3, e.g.: the B-C cross tie if the corresponding A-B cross tie was previously checked.

6.2.5.2 Check off and annotate REM's which might be requested by trainees.

6.2.5.3 Check off and annotate REM's which might be used in future SIG's.

6.2.5.4 Check off and annotate REM's which might be required to support simulator testing.

6.2.6 List remaining unchecked REM's on NSEM-2.02, Form 7.4 for independent evaluation.

6.3 Identifying Required MALF's

NOTE: Sections 6.2 and 6.3 may be performed concurrently.

6.3.1 Obtain an up-to-date list or hardcopy printout of all the MALF's.

Rev.: 0  Date: 2/9/88  Page: 5 of 9  NSEM-2.02
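The check-off pass in 6.2.3 through 6.2.6 is, in effect, a set difference: everything on the master REM list that no reviewed document (or 6.2.5 criterion) touched goes to Form 7.4. A minimal sketch, with hypothetical REM identifiers (the real list is a site-specific printout):

```python
def unchecked_rems(master_list, identified):
    """6.2.6: return REM's on the master list never checked off during
    the document review; these are listed on NSEM-2.02, Form 7.4 for
    independent evaluation. Order of the master list is preserved."""
    checked = set(identified)
    return [rem for rem in master_list if rem not in checked]

# Hypothetical master printout and the REM's checked off during review.
master = ["REM-CW-01", "REM-CW-02", "REM-SW-07"]
leftover = unchecked_rems(master, {"REM-CW-01"})
```

The same remainder logic applies to the MALF review in 6.3.7 and 6.3.8, with Form 7.6 as the output.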

6.3.2 Assigned simulator instructors should review all SIG's, plant operating procedures, and other documents listed on NSEM-2.01, Form 7.1, Section 2 to identify those MALF's required to support their performance.

6.3.3 As each MALF is identified, it should be checked off on the MALF list.

6.3.3.1 For variable MALF's, annotate the MALF list with the severity specified by the SIG each time the MALF is used, increased, or decreased.

NOTE: Variable and multiple-component MALF's should be fully checked off regardless of severity or number of individual components specified.

6.3.4 Where composite malfunctions or simultaneous MALF's are used, evaluate the interaction of the individual MALF's to determine if they need to be tested together for certification. The following criteria should be used:

6.3.4.1 One of the MALF's directly affects Reactor Coolant System parameters (inventory, temperatures, pressure, sub-cooled margin).

6.3.4.2 The other MALF(s) significantly increases the magnitude of the effect. Examples: maximum Loss of Coolant Accident with full Loss of Normal Power, loss of all feedwater with loss of instrument air, maximum unisolable steam line break with full Loss of Normal Power.

6.3.5 Simultaneous/composite malfunctions identified in 6.3.4 should be specified on NSEM-2.02, Form 7.7.

6.3.6 If performance of an evolution, specified in a 6.3.2 document, requires a malfunction which does not appear on the MALF list, it should be entered in the appropriate columns on NSEM-2.02, Form 7.5 for later resolution.

Rev.: 0  Date: 2/9/88  Page: 6 of 9  NSEM-2.02
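The two screening criteria in 6.3.4.1 and 6.3.4.2 can be read as a simple conjunction over the malfunctions in a composite. The sketch below is illustrative only; the boolean fields are hypothetical judgments an instructor would record, not data from the MALF list:

```python
def needs_combined_test(malfs):
    """Apply the 6.3.4 criteria to a candidate composite malfunction:
    test the combination together (and log it on Form 7.7) only if one
    MALF directly affects RCS parameters (6.3.4.1) and another MALF
    significantly amplifies that effect (6.3.4.2)."""
    affects_rcs = any(m["affects_rcs"] for m in malfs)
    # An amplifier must be a *different* malfunction than the RCS driver.
    amplifies = any(m["amplifies_effect"]
                    for m in malfs if not m["affects_rcs"])
    return affects_rcs and amplifies

# Hypothetical example mirroring "maximum LOCA with full Loss of Normal Power".
composite = [
    {"name": "Max LOCA", "affects_rcs": True, "amplifies_effect": False},
    {"name": "Full Loss of Normal Power",
     "affects_rcs": False, "amplifies_effect": True},
]
standalone = [
    {"name": "Loss of Instrument Air",
     "affects_rcs": False, "amplifies_effect": False},
]
```

Whether a second malfunction "significantly increases the magnitude of the effect" is an engineering judgment; the function only formalizes how those judgments combine.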

6.3.7 After all documents specified in 6.3.2 have been reviewed, evaluate the unchecked MALF's as follows:

6.3.7.1 Check off and annotate MALF's which might be used in future SIG's.

6.3.7.2 Check off and annotate MALF's which might be required to support simulator testing.

6.3.8 List the remaining unchecked MALF's on NSEM-2.02, Form 7.6 for independent evaluation.

6.4 Independent Evaluation of Remaining REM's and MALF's

6.4.1 A minimum of two independent simulator instructors should review the REM's listed on NSEM-2.02, Form 7.4 using the criteria specified in 6.2.5.

6.4.2 Those REM's deemed to be required under one of the categories should be checked off and so annotated on both the REM list and the NSEM-2.02, Form 7.4.

6.4.3 A minimum of two independent simulator instructors should review the MALF's listed on NSEM-2.02, Form 7.6 using the criteria specified in 6.3.7.

6.4.4 Those MALF's deemed to be required under one of the categories should be checked off and so annotated on both the MALF list and the NSEM-2.02, Form 7.6.

6.5 Resolution of Non-Modeled REM's and MALF's

6.5.1 Assigned simulator instructors should evaluate the REM's listed on NSEM-2.02, Form 7.3, in the context of the evolution for which they are required, to determine if an acceptable alternative exists.

NOTE: I/O override, alternative REM's, MALF's, or verbal instructor response may be considered acceptable alternatives if they do not produce a difference in effects which would be discernible to a trainee.

Rev.: 0  Date: 2/9/88  Page: 7 of 9  NSEM-2.02

6.5.2 Specify the acceptable alternatives on NSEM-2.02, Form 7.3.

6.5.3 If a SIG requires use of an acceptable alternative to accomplish a remote action, ensure that the acceptable alternative is specified.

6.5.4 Initiate a SIG revision for any instance where 6.5.3 is not true.

6.5.5 Submit a DR to incorporate any required REM for which an acceptable alternative does not exist. Record the DR number on the NSEM-2.02, Form 7.3.

6.5.6 Repeat Steps 6.5.1 through 6.5.5 for MALF's listed on NSEM-2.02, Form 7.5.

6.6 Disposition of Forms Generated

6.6.1 Forward completed originals of the following to the Assistant Supervisor - Simulator Training (ASST) for review and approval:

6.6.1.1 NSEM-2.02, Form 7.1
6.6.1.2 NSEM-2.02, Form 7.2
6.6.1.3 REM list with NSEM-2.02, Form 7.4 attached.
6.6.1.4 NSEM-2.02, Form 7.3
6.6.1.5 MALF list with NSEM-2.02, Form 7.6 attached.
6.6.1.6 NSEM-2.02, Form 7.5
6.6.1.7 NSEM-2.02, Form 7.7

6.6.2 The ASST may include REM's and/or MALF's listed on NSEM-2.02, Forms 7.4 and 7.6 in the group to be certified by checking the REM and/or MALF list and annotating the NSEM-2.02, Form 7.4 and/or 7.6.

Rev.: 0  Date: 2/9/88  Page: 8 of 9  NSEM-2.02

6.6.3 The ASST will forward the approved originals specified in 6.6.1 to Controlled Document Storage for retention with Simulator Certification records.

6.6.4 A copy of the approved NSEM-2.02, Form 7.2 shall be provided as an input to procedure NSEM-4.02.

6.6.5 A copy of the approved REM list shall be provided as an input to procedure NSEM-4.03.

6.6.6 Copies of the approved MALF list shall be provided as inputs to procedures NSEM-4.04, 4.05, and 4.06.

7.0 FORMS

7.1 Initialization Requirements
7.2 Basic IC's For Certification
7.3 Required REM's Not Modeled
7.4 REM's Not Required Certified
7.5 Required MALF's Not Modeled
7.6 MALF's Not Required Certified
7.7 Composite MALF's for Certification

8.0 ATTACHMENTS

None

Rev.: 0  Date: 2/9/88  Page: 9 of 9  NSEM-2.02

FORM 7.1
INITIALIZATION REQUIREMENTS

Doc. No.    Core Life    Mode/Pwr    Xe/Trend    Time SD/SU    Abnormal Align    Special Conditions

Rev.: 0  Date: 2/9/88  Page: 7.1-1 of 1  NSEM-2.02

FORM 7.2
IC's FOR CERTIFICATION

Core Life    Mode/Pwr    Xe/Trend    Time SD/SU    RCS P/T    Special Conditions

Rev.: 0  Date: 2/9/88  Page: 7.2-1 of 1  NSEM-2.02

FORM 7.3
REQUIRED REM'S NOT MODELED

SIG/Proc/Doc No.    Remote Action (Describe)    Acceptable Alternative/DR#

Rev.: 0  Date: 2/9/88  Page: 7.3-1 of 1  NSEM-2.02

FORM 7.4
REM'S NOT REQUIRED CERTIFIED

REM #    Evaluation Results

Performed by:                     Date:
Performed by:                     Date:
Approved by:                      Date:
               ASST

Rev.: 0  Date: 2/9/88  Page: 7.4-1 of 1  NSEM-2.02

FORM 7.5
REQUIRED MALF'S NOT MODELED

SIG/Proc/Doc No.    Malfunction (Describe)    Acceptable Alternative/DR#

Rev.: 0  Date: 2/9/88  Page: 7.5-1 of 1  NSEM-2.02

FORM 7.6
MALF'S NOT REQUIRED CERTIFIED

Malf #    Evaluation Results

Performed by:                     Date:
Performed by:                     Date:
Approved by:                      Date:
               ASST

Rev.: 0  Date: 2/9/88  Page: 7.6-1 of 1  NSEM-2.02

FORM 7.7
COMPOSITE MALF'S FOR CERTIFICATION

Initial MALF    Severity    Additional MALF(s)/Failures    Severity

Rev.: 0  Date: 2/9/88  Page: 7.7-1 of 1  NSEM-2.02


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 2.03
SOFTWARE DESIGN VERIFICATION

Responsible Individual: Simulator Technical Support
Approved: Director, Nuclear Training

Revision: 0
Date: May 30, 1988
SCCC Meeting No: 88-007

1.0 PURPOSE

The purpose of this procedure is to verify that the Control Panel Instrumentation, Process Computer Analog Points, Instructor Console Interfaces (Malfunction & Remote Functions), and Plant Components shown on the updated System Simulation Diagrams are modeled in the Northeast Utilities simulators for the Millstone 1, 2, 3 and Connecticut Yankee reference plants.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Simulator Technical Support Branch (STSB), the Operator Training Branch (OTB), and any other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 NSEM-1.02: Simulator Certification Program Overview

3.2 ANSI/ANS-3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.3 NSEM-2.01: Defining Training Requirements. The output of this procedure will be a list of hardware (panels, components and instruments) and software (an updated set of Simulator System Diagrams) which comprise the required scope of simulation to support the training requirements.

3.4 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.5 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

4.0 DEFINITIONS

4.1 Simulator Systems Documentation Manuals - The document that contains the design specifications, scope of computer model (hardware & software) description, capabilities, and assumptions/simplifications for each simulated system model.

Rev.: 0  Date: 5/30/88  Page: 1 of 3  NSEM-2.03

4.2 Simulation System Diagram (SSD) - Functional representation of the simulator model for a given system.

4.3 Design Automated and Auditing Documentation System (DAADS) - The output of DAADS provides the design information of the individual model of the simulated plant system. The output of DAADS also forms a part of a simulator system manual.

4.4 Deficiency Report (DR) - A form (STS-BI-F1A) used by the OTB and the STS to record all identified simulator deficiencies between the simulator and the reference plant.

4.5 Simulator Design Change (SDC) - A documentation package consisting of relevant DR's and all forms indicated on STS-BI-F1E which is designed to track the resolution of DR's and ensure that ANSI/ANS 3.5-1985 and NRC RG 1.149 requirements are satisfied.

5.0 RESPONSIBILITIES

5.1 Supervisor, Simulation Computer Engineering (SCE) - Overall responsibility for coordination of the Simulator Software Design Verification.

5.1.1 Responsible for assigning engineering resources to accomplish software design verification.

5.1.2 Responsible for reviewing and approving the outputs of this procedure.

5.2 SCE Personnel

5.2.1 Responsible for conducting a verification of the system simulation diagrams against Sections 2, 3, 4 and 5 of the Simulator System Documentation Manual.

5.2.2 Responsible for documenting and resolving any deficiencies identified in this procedure.

6.0 INSTRUCTIONS

6.1 Model Verification - Assigned SCE personnel shall verify the Simulator System Documentation Manuals to the Simulation Systems Diagrams:

Rev.: 0  Date: 5/30/88  Page: 2 of 3  NSEM-2.03

6.1.1 Obtain the output of NSEM-2.01 from OTB, a copy of the DAADS printout, and a copy of the Software Data/Documentation Update Requirements Form, STS-BI-F1F1(2,3,4), from all open SDC's.

6.1.2 Update the Simulation System Diagrams on CADS to reflect the output of procedure NSEM-2.01.

6.1.3 Update the Simulation System Diagrams on CADS in accordance with all SDC's Form STS-BI-F1F1(2,3,4), "Software Data/Documentation Update Requirements".

6.1.4 Check off the following items on the Simulation Systems Diagrams and DAADS printout when identified for the first time: Malfunctions, Remote Functions, Analog PPC Points, Meters, Recorders, Controllers, Switches, Annunciators, Air, Solenoid and Motor operated valves, and Pumps/Motors.

6.1.5 If an item is identified and does not appear on either the SSD's or the DAADS, then complete NSEM-2.03, Form 7.1.

6.2 Disposition of Identified Discrepancies

6.2.1 Forward the completed originals of NSEM-2.03, Form 7.1 to the Supervisor, Simulation Computer Engineering (SCE) for review and approval.

6.2.2 The Supervisor, SCE will forward approved originals to ASRMS for retention with the Simulator Certification records.

7.0 FORMS

7.1 DAADS/SSD's Discrepancies

8.0 ATTACHMENTS

NONE

Rev.: 0  Date: 5/30/88  Page: 3 of 3  NSEM-2.03
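The 6.1.5 check is a cross-reference: any item identified during the review that appears on neither the SSD's nor the DAADS printout becomes a Form 7.1 line item. A minimal sketch of that set logic, using hypothetical instrument tags (the real inputs are the CADS diagrams and the DAADS printout):

```python
def form_71_candidates(identified, ssd_items, daads_items):
    """6.1.5: items identified during model verification that appear
    on neither the SSD's nor the DAADS printout; each result would be
    entered on NSEM-2.03, Form 7.1 for resolution."""
    modeled = set(ssd_items) | set(daads_items)
    return sorted(set(identified) - modeled)

# Hypothetical example: one annunciator and one level transmitter are
# documented, but a pressure control valve shows up in neither source.
discrepancies = form_71_candidates(
    identified={"LT-110", "PCV-3001", "ANN-5A"},
    ssd_items={"LT-110"},
    daads_items={"ANN-5A"},
)
```

Items found in exactly one of the two sources are handled by the 6.1.2/6.1.3 diagram updates rather than by Form 7.1, so the sketch deliberately unions the two sets before differencing.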

Form 7.1
DAADS/SSD'S DISCREPANCIES

SIMULATION SYSTEM ID:

SSD NO.    ITEM    DESCRIPTION    RESOLUTION

PERFORMED BY:                     DATE:
APPROVED BY:                      DATE:
               Supervisor, Simulation Computer Engineering

Rev.: 0  Date: 5/30/88  Page: 1 of 1  NSEM-2.03

NORTHEAST UTILITIES

NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 3.01

DEFINITION AND CONTROL OF THE SIMULATOR DESIGN DATA BASE

Responsible Individual: (signature), Simulator Technical Support

Approved: (signature), Director, Nuclear Training

Revision: 3
Date: 1/12/89
SCCC Meeting No: 89-001

1.0 PURPOSE

1.1 To define the specific reference plant(s) data that constitutes the Simulator Design Data Base.

1.2 To define the controls required to adequately manage the Simulator Design Data Base(s).

1.3 To develop unit-specific data indexes and collect the data from which the simulator(s) were designed and/or on which upgrading has been or may be based. This is a phased approach; all units will be in compliance by the end of 1989.

2.0 APPLICABILITY

2.1 This procedure applies to all persons involved in the development and maintenance of the design data bases for the Millstone 1, 2, 3 and Connecticut Yankee simulators.

3.0 REFERENCES

3.1 ANSI /ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG 1258 Draft, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline for Simulator Training, October 1986.

3.7 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July 1987.

Rev.: 3
Date: 1/12/89
Page: 1 of 9
NSEM-3.01

4.0 DEFINITIONS

4.1 Design Data Base - The reference plant data which is the basis for the current simulator hardware configuration and software models.

4.2 NSEM-5.01 - Simulator Modification Control Procedure.

4.3 SDC - A Simulator Design Change package which contains the relevant DR(s) and all forms indicated on STS-BI-F1E. It is designed to provide control over changes, ensure that documentation is complete and satisfy the requirements of ANSI/ANS 3.5-1985.

4.4 DR - The Deficiency Report Form (STS-BI-F1A) used by OTB and STSB to record identified simulator deficiencies between the simulator and the reference plant within the scope of simulation.

4.5 Designated Unit Software Coordinator - Person(s) assigned by the Supervisor, Simulation Computer Engineering (SCE) to oversee activities of the Software Engineering Group.

4.6 OTB - Operator Training Branch of the Nuclear Training Department.

4.7 STSB - Simulator Technical Support Branch of the Nuclear Training Department.

4.8 Plant Maintenance Management System (PMMS) - A process and approach to the planning, management and control of maintenance activities. It contains pertinent data on a specific piece of equipment or instrument.

4.9 PDCR - A Plant Design Change Record which contains all necessary information and forms to accomplish, in an orderly manner, the modification of a plant system, structure or component.

4.10 Reference Plant Data Form (STS-BI-F1H) - A form used to validate supplemental data for simulator systems modification/tuning when conventional data sources are unavailable. Examples: empirical data, assumptions, special tests.

Rev.: 3
Date: 1/12/89
Page: 2 of 9
NSEM-3.01

4.11 Controlled Data - Data which meets all requirements set forth in the control procedure.

4.12 Reference Library - Location where controlled identified reference materials are stored.

5.0 RESPONSIBILITIES

5.1 Manager, Simulator Technical Support Branch (STSB) - Overall responsibility for the development and implementation of the Simulator Design Data Base Control Procedure.

5.2 Nuclear Training Department Supervisors - Responsible for assigning the resources for the coordination and implementation of this procedure.

5.3 Designated Unit Software Coordinator - Responsible for identifying simulator data base update requirements and providing unit documentation control.

5.4 Supervisor, Administrative Services and Records Management (ASRMS) - Responsible for the collection of the specific data which shall be included in the design data bases.

5.4.1 Responsible for providing suitable locations for the storage of design data base material.

5.4.2 Responsible for developing an access control system for the design data bases.

5.4.3 Responsible for maintaining an up-to-date active design data base for each simulator.

5.5 Unit Operations Consultants

5.5.1 Responsible for identifying the specific contents of each design data base.

5.5.2 Responsible for validating data in the design data bases.

Rev.: 3
Date: 1/12/89
Page: 3 of 9
NSEM-3.01

6.0 INSTRUCTIONS

The following list identifies the categories of data which shall be included in each simulator's design reference plant data base. Attachments 8.1, 8.2, 8.3 and 8.4 identify the specific data in each category that shall be maintained in each simulator's design data base.

ASRMS maintains up-to-date, multiple copies of the various types of reference documents (e.g., Technical Specifications, FSAR, Plant Procedures, etc.). Only one location is listed for each of the data types referenced in this procedure. The reader should understand that other controlled copies may exist in other locations.

6.1 Plant Procedures and Forms

6.1.1 The Nuclear Training Department (NTD) shall be on the distribution list for all plant procedures and forms. As new revisions are received, they shall be updated by the Administrative Services and Records Management Section (ASRMS).

6.1.2 A current revised copy of all plant procedures and forms shall be located in the Reference Library.

6.1.3 The control of plant procedures shall be under the jurisdiction of the Administrative Services and Records Management Section (ASRMS).

6.2 Plant Drawings

6.2.1 The Nuclear Training Department (NTD) shall be on the distribution list for all the drawings identified in the design data base indexes. ASRMS shall update the design data bases as new drawings are received.

6.2.2 The current revision of all plant drawings identified in the design data indexes shall be included in each simulator's design data base.

6.2.3 All drawings shall be maintained on aperture cards located in the Reference Library.

6.2.4 The Generation Records Information Tracking System (GRITS), a computerized program for the control of all NUSCo drawings, shall be used as a control mechanism.

Rev.: 3
Date: 1/12/89
Page: 4 of 9
NSEM-3.01

6.2.5 The control of plant drawings shall be under the jurisdiction of the Administrative Services and Records Management Section (ASRMS).

6.3 Technical Manuals

6.3.1 A current copy of all technical manuals identified in the design data indexes shall be included in each simulator's design data base.

6.3.2 A copy of each technical manual shall be maintained in the Reference Library.

6.3.3 The control of technical manuals shall be under the jurisdiction of the Administrative Services and Records Management Section (ASRMS).

6.4 Final Safety Analysis Report (FSAR)

6.4.1 A controlled copy of the FSAR for each simulator shall be located in the Reference Library.

6.4.2 The Nuclear Training Department (NTD) shall be on the distribution list for all revisions to the FSAR's. All copies shall be maintained current by ASRMS.

6.5 Technical Specifications

6.5.1 Controlled copies of Technical Specifications shall be located in the Reference Library.

6.5.2 The Nuclear Training Department (NTD) shall be on the distribution list for all revisions to the Technical Specifications. All copies shall be maintained current by ASRMS.

6.6 Plant Process Computer (PPC) Technical Manuals

6.6.1 A current copy of PPC Manuals identified in the Data Index shall be located in each unit's engineering office.

6.6.2 ASRMS shall maintain these current.

Rev.: 3
Date: 1/12/89
Page: 5 of 9
NSEM-3.01

6.7 Simulator Design Changes (SDC)

6.7.1 All OPEN SDC's shall be considered part of the simulators' design data bases.

6.7.2 The OPEN SDC's shall be filed numerically in the individual software engineering offices.

6.7.3 Open SDC folders shall be color coded by unit, i.e., yellow-MP1, green-MP2, blue-MP3 and red-CY.

6.7.4 The SDC computer tracking system (CMS) shall be used to locate specific SDC folders.

6.7.5 The control of the SDC shall be under the jurisdiction of the applicable Unit Software Coordinator. A simple library card record type system will be used should it be necessary to remove a SDC folder from its storage location.

6.7.6 The Unit Operations Consultants shall review open SDC's for their particular units on a periodic basis.

6.8 Simulator Reference Plant Data Forms (STS-BI-F1H)

6.8.1 This form shall be used by OTB to provide data necessary for simulator modifications when no other data is available, i.e., assumptions and empirical data. (Ref: NSEM-5.01)

6.8.2 One copy of this form shall remain with the applicable SDC folder and the original shall be filed in the design data base Supplemental Data File located in the individual software engineering offices.

6.8.3 The control of Simulator Reference Plant Data Forms shall be under the jurisdiction of the applicable Unit Software Coordinator. A simple library card record type system will be used should it be necessary to remove simulator reference plant data from its storage location.

Rev.: 3
Date: 1/12/89
Page: 6 of 9
NSEM-3.01
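Several paragraphs in this procedure call for a "simple library card record type system" whenever a controlled folder or data package is removed from its storage location. A minimal sketch of such a record is shown below; the class name, field layout, and the SDC folder number in the usage note are hypothetical illustrations, not an actual NU system.

```python
import datetime

# Minimal sketch of a library-card style checkout record for controlled
# folders (e.g., open SDC folders or reference plant data). All names and
# identifiers here are hypothetical.

class CheckoutLog:
    """Track who has removed a controlled folder and when."""

    def __init__(self):
        self._out = {}  # folder id -> (borrower, date checked out)

    def check_out(self, folder_id, borrower, date=None):
        """Record removal of a folder; refuse a double checkout."""
        if folder_id in self._out:
            raise ValueError(f"{folder_id} is already checked out")
        self._out[folder_id] = (borrower, date or datetime.date.today())

    def check_in(self, folder_id):
        """Record return of a folder to its storage location."""
        self._out.pop(folder_id, None)

    def holder(self, folder_id):
        """Return the current borrower, or None if the folder is in storage."""
        rec = self._out.get(folder_id)
        return rec[0] if rec else None
```

The one property the procedure depends on is that, at any time, either the folder is in its storage location or the card identifies exactly one borrower.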

6.9 Plant Photo(s)/Video Tape(s)

6.9.1 A photo record of all reference plant hardware within the scope of simulation shall be included in the design data base for each simulator.

6.9.2 Photographs shall be taken on a periodic basis, per NSEM-4.12 and NSEM-4.07.

6.9.3 A current copy of the photographs shall be maintained as part of the design data base in the individual software engineering offices. A simple library card record type system will be used should it be necessary to remove plant photo(s)/video tape(s) from its storage location.

6.9.4 Video tape(s) and/or still photo(s) shall also be made for any plant modifications that occur between the periodic video tape recordings. These shall be identified with the Deficiency Report number. These supplemental video(s)/still photo(s) shall be filed in each unit's software engineer's office.

6.9.5 The Production Maintenance Management System (PMMS) shall be used to supplement the video tapes/still photos, allowing complete identification of reference plant hardware.

6.9.6 OTB shall review the plant photographs on a periodic basis according to the NSEM.

6.10 Supplemental Data File

6.10.1 This data base file shall be arranged by simulator system and maintained in the individual software engineering offices.

6.10.2 This file shall include such design data as:

. Simulator Vendor/NU Telecons
. Simulator Vendor Data Requests
. Plant Performance Data
. Simulator Reference Plant Data Forms (STS-BI-F1H)
. Other Unit Specific Data

Rev.: 3
Date: 1/12/89
Page: 7 of 9
NSEM-3.01

6.10.3 The control of the supplemental data file shall be under the jurisdiction of the applicable Unit Software Coordinator. A simple library card record type system will be used should it be necessary to remove supplemental data from its storage location.

6.11 Industry and Reference Plant Studies/Reports

6.11.1 A controlled copy of all applicable studies/reports shall be maintained in the Reference Library.

6.11.2 These studies/reports shall be under the jurisdiction of ASRMS.

6.12 Reference Plant Data Book (PDB)

6.12.1 A compilation of reference plant data for specific plant transients/evolutions. The data defines plant response to specific events which have occurred at the reference plant.

6.12.2 Material contained in the PDB may be used for training development or as supporting data for DR submittal.

6.12.3 Due to uncertainties imposed by operator actions, initial conditions, etc., events contained in the PDB shall be validated via an approved test procedure prior to use in developing acceptance criteria for certification testing.

6.12.4 Maintenance of the PDB shall be the responsibility of the applicable Unit Operations Consultant.

6.12.5 The PDB shall be located in the Reference Library under the control of ASRMS.

Rev.: 3
Date: 1/12/89
Page: 8 of 9
NSEM-3.01

7.0 FIGURES

7.1 Data Base Location Chart

8.0 ATTACHMENTS

8.1 Millstone Unit 1 (MP-1) Data Index
8.2 Millstone Unit 2 (MP-2) Data Index
8.3 Millstone Unit 3 (MP-3) Data Index
8.4 Connecticut Yankee (CY) Data Index
8.5 Marginal Note Directory

Rev.: 3
Date: 1/12/89
Page: 9 of 9
NSEM-3.01

Figure 7.1

DATA BASE LOCATION CHART

                                            TRAINING
DATA                          REFERENCE     CONTROL     SW ENG
TYPE                          LIBRARY       ROOM        OFFICE

Plant Procedures/Forms ......... X
Plant Drawings ................. X
Technical Manuals .............. X
FSAR ........................... X
Technical Specifications ....... X
Open SDC's ................................................ X
Supplemental Data File .................................... X
Videos/Still Photos ....................................... X
PPC Technical Manuals ..................................... X
Reference Plant Data Book ...... X
PMMS ........................... NU PMMS (TSO)
Studies/Reports ................ X

Rev.: 3
Date: 1/12/89
Page: 7.1-1 of 1
NSEM-3.01

Attachment 8.5

MARGINAL NOTE DIRECTORY

1. Added reference to NSEM procedures which specifically address the frequency requirements for photographs.
2. Section added to address new data type.
3. Location of new data type added to figure.

Rev.: 3
Date: 1/12/89
Page: 8.5-1 of 1
NSEM-3.01

NORTHEAST UTILITIES

NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 3.02

CONTROL OF SIMULATOR DESIGN DATA DOCUMENTATION

Responsible Individual: (signature), Simulator Technical Support

Approved: (signature), Nuclear Training

Revision: 2
Date: 11/09/88
SCCC Meeting No: 88-004
1.0 PURPOSE

1.1 To define the documents required for certification and maintenance of the Millstone 1, 2, 3 and Connecticut Yankee simulators.

1.2 To define the contents of each document.

2.0 APPLICABILITY

2.1 This procedure applies to all persons involved in the development and maintenance of the simulators.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 INPO Good Practice TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.6 NUREG 1258 December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.7 INPO 86-026, Guideline for Simulator Training, October 1986.

3.8 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July 1987.

Rev.: 2
Date: 11/09/88
Page: 1 of 4
NSEM-3.02

4.0 DEFINITIONS

4.1 Simulator System Documentation Manuals - The document that contains the design specifications, scope of computer model (hardware and software), description, capabilities, and assumptions for each simulated system model.

4.2 S3 - Simulator Software System, a total integrated software system which supports the development, running (execution) and testing of a real-time simulator and trainer environment.

4.3 Computer Program Coding - The computer system software necessary for the simulation operation, i.e., non-modeling software, of the simulators.

4.4 SDC - A Simulator Design Change package which contains the relevant DR(s) and all forms indicated on STS-BI-F1E. It is designed to provide control over changes, ensure that documentation is complete and satisfy the requirements of ANSI/ANS 3.5-1985.

4.5 DR - The Deficiency Report Form (STS-BI-F1A) used by OTB and STSB to record identified simulator deficiencies between the simulator and the reference plant within the scope of simulation.

4.6 OTB - Operator Training Branch of the Nuclear Training Department.

4.7 STSB - Simulator Technical Support Branch of the Nuclear Training Department.

4.8 Design Automated Auditing Documentation System (DAADS) - An automated computer program for information storage and retrieval resident on the simulator computers.

4.9 NSEM-5.01 - Simulator Modification Control Procedure.

4.10 CDS - Controlled Documentation Storage.

5.0 RESPONSIBILITIES

5.1 Manager, Simulator Technical Support (STS) - Overall responsibility for the development and implementation of the Simulator Documentation Procedure.

Rev.: 2
Date: 11/09/88
Page: 2 of 4
NSEM-3.02

5.2 Supervisor, Simulator Computer Engineering (SCE) - Responsible for assigning the necessary resources for the coordination and implementation of this procedure.

5.3 Designated Unit Software Coordinator - Responsible for identifying and updating the simulator documents.

5.4 Supervisor, Administrative Services and Records Management (ASRMS) - Responsible for providing a suitable location for the simulator documents.

6.0 INSTRUCTIONS

The following documents are required for the simulator certification and maintenance process. They shall be controlled and updated, but shall not be considered part of the simulator design data base.

6.1 Simulator System Documentation Manuals

6.1.1 The manuals shall be updated and generated according to NSEM-3.02 using the DAADS program and the Simulator Systems Documentation Standards (Attachments 8.1 and 8.2).

6.1.2 The manuals shall be maintained by software engineers in accordance with the Software Data/Documentation Update Requirements Form.

6.1.3 The manuals shall be located in the individual software engineering offices.

6.2 Computer Program Codings

6.2.1 The Computer Program Coding shall be commented and updated according to NSEM-3.02 using S3 (Attachment 8.3).

6.2.2 The Computer Program Coding shall be the responsibility of the unit software coordinators.

6.2.3 The Computer Program Coding shall reside in the simulator disk packs. It shall be properly maintained and backed up according to the simulator systems documentation standard.

Rev.: 2
Date: 11/09/88
Page: 3 of 4
NSEM-3.02

6.3 Closed SDC's

6.3.1 ASRMS shall be responsible for filing the closed SDC's.

6.3.2 Closed SDC's shall be located in Controlled Document Storage (CDS).

6.4 Simulator Test Results

6.4.1 Simulator Test Results shall be located in Controlled Document Storage (CDS).

6.4.2 This category includes the following:

6.4.2.1 The Acceptance Test (ATP)
6.4.2.2 Performance Tests
6.4.2.3 Annual Operability Tests

7.0 FIGURES

7.1 Simulator Design Documents Location Chart

8.0 ATTACHMENTS

8.1 Simulator System Documentation Standards
8.2 Design Automated Auditing Documentation System (DAADS)
8.3 Computer Program Codings
8.4 Marginal Note Directory

Rev.: 2
Date: 11/09/88
Page: 4 of 4
NSEM-3.02

ATTACHMENT 8.1

SIMULATOR SYSTEM DOCUMENTATION STANDARDS

Rev.: 2
Date: 11/09/88
Page: 8.1-1 of 26
NSEM-3.02

TABLE OF CONTENTS

A. INTRODUCTION
B. CONTENTS OF SIMULATOR DESIGN DOCUMENTATION
C. DESCRIPTION AND EXAMPLES OF SIMULATOR DESIGN DOCUMENTATION

Rev.: 2
Date: 11/09/88
Page: 8.1-2 of 26
NSEM-3.02

A. INTRODUCTION

This document is intended to serve as the standard format for the "Simulator System Manuals" (SSM) developed for each simulated plant system of the nuclear power plant simulators. This standard applies to all documentation related to plant simulation. Examples have been included where appropriate.

B. CONTENTS OF "SIMULATOR SYSTEM MANUALS"

Each Simulator System Manual shall follow the prescribed standard table of contents as follows:

Rev.: 2
Date: 11/09/88
Page: 8.1-3 of 26
NSEM-3.02

Figure 7.1

SIMULATOR DESIGN DOCUMENTS LOCATION CHART

                                CONTROLLED
                                DOCUMENT      SIMULATOR     SW ENG
DOCUMENT                        STORAGE       DISK PACK     OFFICE

Simulator Systems Manual .................................... X
Computer Program Coding ....................... X
Closed SDC's .................... X
Simulator Test Results .......... X

Rev.: 2
Date: 11/09/88
Page: 7.1-1 of 1
NSEM-3.02

TABLE OF CONTENTS

CONTENT                                           PAGE

1.0  SIMULATION SYSTEM OVERVIEW                      6
     1.1  SYSTEM DESCRIPTION                         6
     1.2  SYSTEM ASSUMPTIONS                         7
     1.3  SYSTEM SIMPLIFICATIONS                     8
     1.4  SYSTEM DESIGN REFERENCES                   9
2.0  INSTRUCTOR STATION INTERFACE                   10
     2.1  MALFUNCTIONS                              10
     2.2  REMOTE FUNCTIONS                          12
3.0  PROCESS COMPUTER MONITORED PARAMETERS          13
4.0  CONTROL PANEL INSTRUMENTATION                  14
     4.1  METERS                                    15
     4.2  RECORDERS                                 16
     4.3  CONTROLLERS                               17
     4.4  LIGHTS                                    18
     4.5  SWITCHES                                  19
     4.6  MISCELLANEOUS                             20
     4.7  ANNUNCIATORS                              21
5.0  SYSTEM COMPONENTS                              22
     5.1  AIR-OPERATED VALVES                       22
     5.2  SOLENOID/MOTOR-OPERATED VALVES            23
     5.3  PUMP MOTORS                               24
     5.4  METERS TRANSMITTERS                       25
6.0  SIMULATION DIAGRAMS                            26

Rev.: 2
Date: 11/09/88
Page: 8.1-4 of 26
NSEM-3.02

C. DESCRIPTION AND EXAMPLES OF SIMULATOR DESIGN DOCUMENTATION

This section provides a brief description with examples of the contents of each item as listed in the standard Table of Contents.

Rev.: 2
Date: 11/09/88
Page: 8.1-5 of 26
NSEM-3.02

1.0 SIMULATION SYSTEM OVERVIEW

1.1 System Description

The System Description will include a few paragraphs describing the function of the system. This will be followed by a list of the plant systems and subsystems covered by the simulation system, plus a list of the control modules and segments for that system.

EXAMPLE:

1.1 System Description

The compressed air system consists of the instrument air system, station air system and nitrogen supply system. The instrument air system supplies dry, oil-free air for the pneumatic instruments and controls and the pneumatically operated containment isolation valves. The station air system provides the necessary air requirements for normal plant operation. The nitrogen supply system provides nitrogen to various components in the plant.

The instrument air system consists of two 323 SCFM at 100 psig compressors, each of which is capable of supplying 100% of the system's requirements. Two air lines combine into a common header after passing through aftercoolers, where the pressure is maintained at 100 psig by an air receiver. The instrument header then supplies air to various instrumentation and control components in the turbine building, enclosure building, and auxiliary building.

The station air system consists of a 630 SCFM at 100 psig compressor, aftercooler and air receiver. It provides a backup to instrument air through a cross-tie to the instrument air header. (Additional backup supply is available from Millstone 1.) The instrument air in the containment is supplied by the instrument air header and the backup from the station air header. An air receiver provides a reserve of air for containment air supply.

Control modules and segments for the instrument air system are listed below:

Control Module       Segment    Description
IAC4DL (RTEX98IO)    IAD11B     IA Dynamics
                     IAL12A     IA Logic

Rev.: 2
Date: 11/09/88
Page: 8.1-6 of 26
NSEM-3.02

1.2 System Assumptions

A statement of assumption is required when data to adequately design the system is not available and substitute information is used. The rationale or justification for the assumption should be given when applicable. An example of a system assumption is given as follows:

TABLE 1.2
SYSTEM ASSUMPTIONS

ITEM NO.   ASSUMPTION

IA01       LOAD/UNLOAD TIME FOR INSTRUMENT AIR COMPRESSORS
           3IAS-C1A AND 3IAS-C1B WILL BE ASSUMED TO BE EQUAL
           TO 30 AND 15 SECONDS.

           RATIONALE: BASED UPON APPROXIMATE VALUES OBSERVED
           IN THE REFERENCE PLANT.

Rev.: 2
Date: 11/09/88
Page: 8.1-7 of 26
NSEM-3.02
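The load/unload behavior the compressed air example describes, a two-position compressor control holding the instrument air header near 100 psig, can be sketched as a toy discrete-time model. Only the 323 SCFM rating comes from the example text; the load/unload setpoints, receiver gain, and demand figure below are illustrative assumptions and are not reference plant data or the actual IA Dynamics segment.

```python
# Toy sketch of a two-position (load/unload) instrument air compressor model.
# Assumed values, for illustration only:
RATED_SCFM = 323.0   # compressor rating, from the system description
LOAD_SP = 95.0       # assumed load setpoint, psig
UNLOAD_SP = 105.0    # assumed unload setpoint, psig
GAIN = 0.01          # assumed psig gained per net SCF delivered to the receiver

def step(pressure, loaded, demand_scfm, dt):
    """Advance receiver pressure one time step of dt seconds.

    The compressor loads below LOAD_SP, unloads above UNLOAD_SP, and the
    receiver pressure integrates the net (supply - demand) flow.
    """
    if pressure <= LOAD_SP:
        loaded = True
    elif pressure >= UNLOAD_SP:
        loaded = False
    supply = RATED_SCFM if loaded else 0.0
    pressure += GAIN * (supply - demand_scfm) * dt / 60.0  # SCFM -> SCF/s
    return pressure, loaded
```

Run with a constant demand, a model like this cycles the header between the two setpoints; the assumed 30/15-second load/unload delays from assumption IA01 could be layered on as timers before the loaded state takes effect.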

1.3 System Simplifications

A statement of simplification is required when a simplified or limited design is used that does not strictly adhere to design data. Design simplifications are not related to data voids, but rather to how the data is used. The rationale or justification for the simplification should be given whenever possible. An example of a system simplification is given as follows:

TABLE 1.3
SYSTEM SIMPLIFICATIONS

ITEM NO.   SIMPLIFICATION

001        SIMPLIFICATION: SAMPLING LINE VALVES 2-RC-001 AND
           2-RC-002 WILL REMAIN CLOSED.

           RATIONALE: HANDSWITCHES TO OPEN THESE VALVES ARE ON
           PANEL C-72.

002        SIMPLIFICATION: VALVE 2-RC-003 WILL BE SIMULATED AS A
           NORMALLY OPEN VALVE. BUILD-UP OF NON-CONDENSIBLES IN
           THE PRESSURIZER DURING NORMAL OPERATION IS NEGLIGIBLE
           AND HENCE NO MASS FLOW WILL BE SIMULATED THROUGH THIS
           VALVE.

           RATIONALE: DURING NORMAL PLANT OPERATION, IT TAKES A
           LONG PERIOD OF TIME (SEVERAL DAYS TO WEEKS) TO BUILD
           UP NON-CONDENSIBLES IN THE PRESSURIZER. VALVE 2-RC-003
           IS NORMALLY LEFT OPEN AT THE PLANT.

Rev.: 2
Date: 11/09/88
Page: 8.1-8 of 26
NSEM-3.02

1.4 Design Reference

All plant data actually used in the development of the system model shall be tabulated. The information for this tabulation shall be obtained from the design data base index given in the "Generic Design Data Base Procedure". An example of this tabulation is shown below:

TABLE 1.4
DESIGN REFERENCES

DATA NO.   REV. NO.   DATE     TITLE

2001       0          062088   MP2 REFERENCE PLANT DRAWINGS FOR SIMULATOR DESIGN
2005       0          062988   MP2 SUPPLEMENTAL DATA FILE FOR CHEMICAL CONTROL SYSTEM
2029       0          061082   MP2 FINAL SAFETY ANALYSIS REPORT (FSAR)
2030       0          110583   MP2 SAFETY TECH SPEC
2034       0                   MP2 OP 2387E CONTROL ROOM ANNUNCIATOR RESPONSE

2.0 Instructor Station Interfaces

2.1 Malfunctions

Malfunctions are included in the specification from a master file generated by the instructor (Cause/Effects Document). An example of a RCP thermal barrier tube rupture malfunction is given as follows:

TABLE 2.1
MALFUNCTION

MALF NO.   MALFUNCTION                          COMMENTS

RC20       RCP THERMAL BARRIER TUBE RUPTURE     VARIABLE: 100% - 50 GPM NORMAL DP WITHIN HX (2100 PSID)

           A - RCP A        C - RCP C
           B - RCP B        D - RCP D

           TYPE: GENERIC, VARIABLE
           CAUSE: TUBE FAILURE
           PLT STA: POWER OPERATION

           EFFECTS:

           MALF A - THIS MALFUNCTION WILL CAUSE REACTOR COOLANT SYSTEM LEAKAGE FROM THE THERMAL BARRIER OF RCP A TO THE REACTOR BUILDING CLOSED COOLING WATER SYSTEM. SEAL TEMPERATURES AND FLOWS WILL NOT BE AFFECTED BY THIS MALFUNCTION. THE REACTOR COOLANT ENTERING THE RBCCW SYSTEM WILL CAUSE THE PUMP RBCCW RETURN TEMPERATURE TO INCREASE.

           RBCCW SURGE TANK LEVEL WILL INCREASE, AS WILL THE SYSTEM'S ACTIVITY LEVEL AS MONITORED BY RIT-6038. RBCCW HEAT EXCHANGER INLET TEMPERATURES WILL INCREASE DUE TO THE INJECTION OF PRIMARY COOLANT, RESULTING IN OPENING OF THEIR SERVICE WATER OUTLET VALVES TO MAINTAIN THE OUTLET TEMPERATURE.

           MALF B - EFFECTS SIMILAR TO MALF A EXCEPT THAT RCP B THERMAL BARRIER IS THE MAJOR AFFECTED COMPONENT.

           MALF C - EFFECTS SIMILAR TO MALF A EXCEPT THAT RCP C THERMAL BARRIER IS THE MAJOR AFFECTED COMPONENT.

           MALF D - EFFECTS SIMILAR TO MALF A EXCEPT THAT RCP D THERMAL BARRIER IS THE MAJOR AFFECTED COMPONENT.

           MALFUNCTION REMOVAL WILL RESTORE THE SELECTED FAILED RCP THERMAL BARRIER PERFORMANCE TO NORMAL, WHICH WILL STOP THE RCS LEAKAGE.

2.2 Remote Functions

All remote functions which directly interface with the system shall be tabulated. An example of a tabulation of remote functions associated with a RCS is given as follows:

TABLE 2.2
REMOTE FUNCTIONS

R.F. NO.   TITLE / RANGE-CONDITION / COMMENTS

RCR01      PRESSURIZER VENT VALVES 2-RC-021 & 2-RC-421
           RANGE-COND: OPEN/CLOSE

RCR02      RX VESSEL/RC LOOPS/PRESSURIZER/VCT BORON CONCENTRATION
           RANGE-COND: 0-2000 PPM

RCR03      PRESSURIZER BORON CONCENTRATION
           RANGE-COND: 0-2000 PPM

RCR04      RCP-40A RACKOUT
           RANGE-COND: IN/OUT

RCR05      RCP-40B RACKOUT
           RANGE-COND: IN/OUT

RCR06      RCP-40C RACKOUT
           RANGE-COND: IN/OUT

RCR07      RCP-40D RACKOUT
           RANGE-COND: IN/OUT

RCR08      PZR NITROGEN SUPPLY VALVES 2-RC-030 & 2-RC-015
           RANGE-COND: OPEN/CLOSE

3.0 Process Computer Monitored Parameters

All parameters monitored by the plant process computer shall be tabulated. These parameters are generated by the PPC engineer and controlled by a master file. Any PPC point not simulated should be identified by "NS" in the COMMENTS column. An example is given as follows:

TABLE 3.0
PROCESS COMPUTER MONITORED PARAMETERS

PPC POINT ID   PARAMETER / RANGE             UNITS   COMMENTS

F166           RCP C A-REVERSE FLO           C/0
               ABNORM/NORM

F167           RCP C REVERSE ROTATE FLO      C/0     NS
               ABNORM/NORM

F170*1         RCP B CNTL BLEEDOFF FLO       C/0
               HI/NORM

F170*2         RCP B CNTL BLEEDOFF FLO       C/0
               LO/NORM

F174           RCP B L/O FLO                 C/0
               LO/NORM

F176           RCP B A-REVERSE FLO           C/0
               ABNORM/NORM

F177           RCP B REVERSE ROTATE FLO      C/0     NS
               ABNORM/NORM

F180*1         RCP D CNTL BLEEDOFF FLO       C/0
               HI/NORM

F180*2         RCP D CNTL BLEEDOFF FLO       C/0
               LO/NORM

4.0 Control Panel Instrumentation

This section contains tabulations, grouped by types of instrumentation, of all the control board instrumentation controlled or accessed by the system. Non-simulated instrumentation shall be indicated by "NS" under the COMMENTS column. The types of control panel instruments tabulated are:

. Meters (Para. 4.1)
. Recorders (Para. 4.2)
. Controllers (Para. 4.3)
. Lights (Para. 4.4)
. Switches (Para. 4.5)
. Miscellaneous (Para. 4.6)
. Annunciators (Para. 4.7)

Some specific details are given as follows:

o The Hardware Item ID can be obtained directly from the Instrumentation and Control (I&C) list.
o The Plant Tag Number should be taken directly from the I&C list. The creation of non-existent tag numbers may be required.
o The annunciator list is developed from plant drawings.
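The grouping convention above (one table per instrument type, with "NS" flagging non-simulated items) can be sketched in code. This is an illustrative sketch only; the record layout and helper names are assumptions, not part of the I&C list format. The type codes match those later defined for DB #11 in Attachment 8.2.

```python
# Illustrative sketch: group control-panel instrument records into the
# Table 4.1-4.6 categories by type code and flag non-simulated ("NS") items.
# The tuple layout (type_code, hardware_id, plant_tag, comments) is assumed.

TABLE_BY_TYPE = {
    1: "4.1 Meters",
    2: "4.2 Recorders",
    3: "4.3 Controllers",
    4: "4.4 Lights",
    5: "4.5 Switches",
    6: "4.6 Miscellaneous",
}

def group_instruments(records):
    """records: iterable of (type_code, hardware_id, plant_tag, comments)."""
    tables = {title: [] for title in TABLE_BY_TYPE.values()}
    for type_code, hardware_id, plant_tag, comments in records:
        tables[TABLE_BY_TYPE[type_code]].append({
            "hardware_id": hardware_id,
            "plant_tag": plant_tag,
            # "NS" in the COMMENTS column marks a non-simulated instrument.
            "simulated": comments.strip().upper() != "NS",
        })
    return tables

records = [
    (1, "02A1A2M16", "PI-208", ""),
    (2, "03A1A2AR2", "AR-203", ""),
    (1, "02A1A2M17", "LI-208", "NS"),
]
tables = group_instruments(records)
```

A report generator could then emit each non-empty table in turn, printing "NS" in the COMMENTS column whenever the simulated flag is false.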


4.1 Meters

An example on tabulation of panel meters is given as follows:

TABLE 4.1
METERS

HARDWARE ID /        PLANT
RANGE/DESCRIPTION    TAG NUMBER    PANEL    DATA NUM    COMMENTS

02A1A2M16      PI-208     CO2    28500-204
0-150 PSI      DELIVERY PR. FOR BA PUMP 19B

02A1A2M17      LI-208     CO2    28500-203
0-100 %        LEVEL IN TANK T8B

02A1A2M18      PI-215     CO2    28500-215
0-300 PSIG     RC CONTROLLED BLEEDOFF PRESSURE

02A1AM01       PDI-204    CO2    28500-199
0-40 PSIG      LETDOWN POST FILTER PR. DROP

4.2 Recorders

An example on tabulation of panel recorders is given as follows:

TABLE 4.2
RECORDERS

HARDWARE ID /        PLANT
RANGE/DESCRIPTION    TAG NUMBER    PANEL    DATA NUM    COMMENTS

03A1A2AR1      FRC-210Y    C03    28500-207A,C
0-30 GPM       2 PEN, 1 IN/HR, TO RECORD BORIC ACID MAKEUP FLOW TO VCT

03A1A2AR2      AR-203      C03    28500-198
0-2000 PPM     1 PEN, 1 IN/HR, BORONOMETER RECORDER

03A1A2AR4      FRC-210X    C03    28500-207B,C
0-150 GPM      2 PEN, 1 IN/HR, TO RECORD PMW FLOW TO VCT

4.3 Controllers

An example on tabulation of panel controllers is given as follows:

TABLE 4.3
CONTROLLERS

HARDWARE ID /        PLANT
RANGE/DESCRIPTION    TAG NUMBER    PANEL    DATA NUM    COMMENTS

02A1A2A1       PIC-201    CO2    28500-195
               LETDOWN PRESS. DOWNSTREAM OF HX

02A1A2A2       TIC-223    CO2    28500-222
               LETDOWN TEMP. DOWNSTREAM OF HX
4.4 Lights

An example on tabulation of panel lights is given as follows:

TABLE 4.4
LIGHTS

HARDWARE ID /        PLANT
RANGE/DESCRIPTION    TAG NUMBER    PANEL    DATA NUM    COMMENTS

02A1A2DS01     HS-2520    CO2    32009-27
               ION EXCHANGERS BYPASSED (GREEN)

02A1A2DS02     HS-2520    CO2    32009-27
               ION EXCHANGERS NOT BYPASSED (RED)

02A1A2DS03     HS-2520    CO2    32009-27
               ION EXCH. VALVE STATUS (WHITE)

4.5 Switches

An example on tabulation of panel switches is given as follows:

TABLE 4.5
SWITCHES

HARDWARE ID /        PLANT
RANGE/DESCRIPTION    TAG NUMBER    PANEL    DATA NUM    COMMENTS

02A1A2S01      HS-2520     CO2    32009-27
               ION EXCHANGERS BYPASS VALVE

02A1A2S02      HS-2196A    CO2    32009-53
               SWITCH TO RESET HS-2196 AFTER ESAS OR LOSS OF PWR

02A1A2S03      HS-2500     CO2    32009-28
               LETDOWN FLOW TO VCT OR WD VALVE

02A1A2S04      HS-2513     CO2    32009-29
               GASES FROM VCT TO WASTE GAS HDR. VALVE

02A1A2S05      HS-2521     CO2    32009-26
               LETDOWN FLOW TO ANALYZER AND RM VALVE

02A1A2S06      HS-2196     CO2    28117-29
               VCT BYPASS VALVE

02A1A2S07      HS-2504     CO2    28117-27
               PRIMARY WATER TO CH. PUMPS VALVE

4.6 Miscellaneous

An example on tabulation of other miscellaneous panel items is given as follows:

TABLE 4.6
MISCELLANEOUS

HARDWARE ID /        PLANT
RANGE/DESCRIPTION    TAG NUMBER    PANEL    DATA NUM    COMMENTS

02B1A1A1       FQI-2539    CO2R   28500-208
               LETDOWN TO WASTE DISPOSAL

03A1A2A3       FQI-210Y    C03    28500-215
               BA MAKEUP BATCH COUNTER

03A1A2A5       FQI-210X    C03    28500-207A,B
               PMW BATCH COUNTER

4.7 Annunciators

An example on tabulation of annunciators is given as follows:

TABLE 4.7
ANNUNCIATORS

ANNUNCIATOR
BOX NUMBER /     WINDOW
DESCRIPTION      NUMBER    DATA NUM    COMMENTS

1C02             A1        28500-200
BORIC ACID TANK A LEVEL HI

1C02             A10       28500-215
RCP CONTROL BLEED-OFF PRES HI

1C02             A12       32009-49
CHARGE PUMP A SEAL LUBE SYSTEM PRES HI/LO

1C02             A13       32009-49
CHARGE PUMP B SEAL LUBE SYSTEM PRES HI/LO

1C02             A14       32009-49
CHARGE PUMP C SEAL LUBE SYSTEM PRES HI/LO

5.0 System Components

This section provides the specific details on simulated devices within the system. Simulated devices are:

. Air-Operated Valves (Para. 5.1)
. Solenoid/Motor-Operated Valves (Para. 5.2)
. Pump Motors (Para. 5.3)
. Meters/Transmitters (Para. 5.4)

An example of the development of this section for a particular system (Chemical Volume Control) is described in the following paragraphs.

5.1 Air-Operated Valves

An example on tabulation of air-operated valves is given as follows:

TABLE 5.1
AIR-OPERATED VALVES

TAG NO. /                       VALVE POS.   FAIL   CONTROL AIR   CONTROL    OPEN/CLOSE
DESCRIPTION                     SYMBOL       MODE   PRESSURE      POWER      TIME, SEC

2-CH-089   LT. DOWN ISOLATION OUT OF CTMT   0159
           CVVCH089    FC    IAPAUR2C    ED:BDV20    1.14

2-CH-110P  LT. DOWN FLOW CONTROL   0161
           CVVCH110P   FC    IAPAUR1C    ED:BVR21    5.0

2-CH-110Q  LT. DOWN FLOW CONTROL (BACKUP)   0163
           CVVCH110Q   FC    IAPAUR1C    ED:BVR21    5.0

2-CH-192   RWST VALVE   0165
           CVVCH192    FC    CVCH192A    ED:BOD11    5.0

2-CH-196   VCT BYPASS VALVE   0167
           CVVCH196    FC    IAPAUR2C    ED:B0011    0.58

5.2 Solenoid/Motor-Operated Valves

An example on tabulation of solenoid/motor-operated valves is given as follows:

TABLE 5.2
SOLENOID/MOTOR-OPERATED VALVES

TAG NO. /                  SOLENOID        VALVE      BREAKER    OPEN/CLOSE
DESCRIPTION                DEENERG STATE   SYMBOL     POWER      TIME, SEC

2-CH-429   CHARGING PUMP OUTLET CONTROL VALVE
           N/A    CVVCH429    ED:BOB61    10.0 S

2-CH-501   VCT OUTLET CONTROL VALVE
           N/A    CVVCH501    ED:BOB51    8.5 S

2-CH-504   PMW AND BA TO CH. PUMP SUCT.
           N/A    CVVCH504    ED:BOB51    8.4 S

2-CH-508   BA GRAVITY FEED FROM TBA
           N/A    CVVCH508    ED:BOB51    8.5 S

5.3 Pump Motors

An example on tabulation of pump motors is given as follows:

TABLE 5.3
PUMP MOTORS

TAG NO. /             MOTOR BREAKER   BUS        CONTROL    OPERATING        PUMP UP/DOWN
DESCRIPTION           STATE           POWER      POWER      CURRENT (AMPS)   TIME, SEC

P-18A      CHARGING PUMP A
           CV:P18A42    ED:BOB51    ED:BOB51    118A    2.0 S

P-18B      CHARGING PUMP B
           CV:P18B42    ED:BOB51    ED:BOB51    118A    2.0 S

P-18C      CHARGING PUMP C
           CV:P18C42    ED:BOB61    ED:BOB61    118A    2.0 S

5.4 Meters/Transmitters

An example on tabulation of transmitters is given as follows:

TABLE 5.4
METERS/TRANSMITTERS

TAG NO. /                      METER       TRANSMITTER   METER LIGHT
DESCRIPTION / RANGE, UNITS     POWER SYM   POWER SYM     POWER SYM

AR-203     BORONOMETER
           0-2000 PPM    ED:BVR11    ED:BVR11    N/A

FI-202     LETDOWN FLOW RATE UPSTREAM OF PRE-FILTER
           0-140 GPM     ED:BVR11    ED:BVR11    N/A

FI-212     CHARGING FLOW RATE
           0-140 GPM     ED:BVR21    ED:BVR21    N/A

6.0 Simulation Diagrams

Simulation Diagrams are meant to provide a visual scope of the reference plant system components being simulated. The diagrams for fluid systems will be in the form of a Piping and Instrumentation Diagram (P&ID). One-line diagram formats will be used for electrical distribution systems. Logic systems and simulator-unique software will be presented in block diagram format or a control system functional diagram. The simulation diagrams must include:

. All simulated hardware listed in 4.0 except for lights and status lamps.
. All malfunctions listed in 2.1.
. All remote functions listed in 2.2.
. Common annunciators, remote functions, malfunctions, etc., which will be drawn as a single symbol with multiple connectors.
. Analog Plant Process Computer (PPC) points.
. Transmitters required by an interfacing system or having an associated malfunction.
. All simulated check valves, orifices and relief valves.
. In-line process flow radiation detectors and area monitors, which will be shown as transmitters in the appropriate system with an interface to the Radiation Monitor.
. The P&ID reference drawing number.

A standard symbol table has been developed for the simulation diagrams. An example of Section 6.0 is shown as follows:

6.0 SYSTEM SIMULATION DIAGRAMS

The Simulation Diagrams are not included within the SSM, but are filed in the appropriate engineering office.
A standard symbol table has been developed for the simulation diagrams. An example of Section 6.0 is shown as follows: A 6.0 SYSTEM SIMULATION DIAGRAMS The Simulation Diagrams are not included within the SSM, but are filed in the appropriate engineering office. Rev.: 2 O Date: 11/09/88 Page: 8.1-26 of 26  ; I i j

ATTACHMENT 8.2

DESIGN AUTOMATED AUDITING AND DOCUMENTATION SYSTEM (DAADS)

1.0 INTRODUCTION

The purpose of this manual is to provide useful information to the users of DAADS (Design Automated Auditing and Documentation System). The manual is organized into sections as follows:

General Organization
Data Organization
DAADS Functions

2.0 GENERAL ORGANIZATION

DAADS is an information storage and retrieval system based on the SEL 32 computers. It is designed to automate the production of NUSCO's Simulator System Documentation. DAADS is a TSM task and can be activated on any TSM terminal by typing "NDAADS" in response to the "TSM>" prompt. All files accessed by DAADS must be explicitly created with the file manager or by a store command on the TSS/TSM Text Editor.

3.0 DATA ORGANIZATION

DAADS uses a multiplicity of files. Files directly used by DAADS are required to follow a naming convention based on a three-letter trainer ID and two-letter system IDs. By these conventions, the system can dynamically allocate files where it expects to find the particular information. Most data sets consist of fixed-format entries; these are described in the second subsection of this section.

3.1 DAADS FILES

3.1.1 DAADS TRAINER MASTER FILES

Trainer Information File

The trainer information file contains the trainer name, contract/spec title, and the system designators and passwords for each system defined for the trainer. The format of this filename is <TRAINERID>"TRNIN". It is dynamically assigned when DAADS is entered and must be initialized under DAADS.

Rev.: 2
Date: 11/09/88
Page: 8.2-1 of 10
NSEM-3.02

Malfunction Master File

The malfunction master file contains the malfunction cause and effects documents for all trainer malfunctions. This file is built by a separate task and is a text file which can be edited with the TSS Editor. The filename format is <TRAINERID>"MFMST"; it is dynamically assigned for malfunction merge portions of the design reports. This file must be properly sorted by malfunction number for the malfunction listing to work.

Instrument Master File

This file contains an abridged version of the instrument and control lists for the trainer. It is used for instrument merge reports. The filename is <TRAINERID>"INSTM". It is dynamically assigned for the instrument merge.

Design Data Master File

The design data master file is built by a separate task to read a data list tape from the DEC computer lab and sort the data by control number. The filename format is <TRAINERID>"DESGD". This file is dynamically assigned for the design data merges.

File Description File

The file description file contains various descriptors for the internal DAADS data sets. This file is dynamically allocated upon entry to DAADS. The filename format is <TRAINERID>"FDFIL".

3.1.2 INDIVIDUAL SYSTEM FILES

System Information File

The system master info file contains the system and trainer IDs, the system title, and a number of parameters defining the usage and allocation of the system data file. The filename format is <TRAINERID>"SI"<SYSTEMID>. This file is dynamically assigned upon selection of a system ID; it must be initialized under DAADS.

System Data Files

The system data file contains ten to fifteen data sets of system design data entered under the DAADS input functions. This file is used as a direct access file by DAADS. It is internally partitioned and manually blocked by DAADS I/O routines. The filename format is <TRAINERID>"SD"<SYSTEMID>. This file is dynamically allocated after selecting a system ID on DAADS.

System Narrative File

The system narrative is a TSS Text Editor file which DAADS will list as the system narrative on design reports. This file must be stored (not saved) in the appropriate system file for use by DAADS. The filename format is <TRAINERID>"SN"<SYSTEMID>. This file is dynamically allocated for listings by DAADS.

3.1.3 OTHER FILES

Other files will be dynamically assigned when needed by DAADS. These include:

SLO Files

System listed output spooling files are allocated for list and report functions of DAADS.

TSS PSEUDO Master Files

An input function is provided to allow explicit master files for any of the internal DAADS data sets, and TSS input for any of the data sets. Fixed formats are provided for each record type. All records include a system ID field. On selection of this input function, DAADS will request the name of a disc file to read input from. The selected file will be dynamically assigned and then read using the appropriate fixed format. The internal data set will be cleared, and only those records found on the input where the system ID field matches the current system ID in use will be inserted in the local system data file. A dump function and small list processor also use these formats.
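The file naming conventions of Section 3.1 can be sketched as simple string construction. This is an illustrative sketch only; DAADS itself runs on the SEL 32 and allocates these files dynamically, and the trainer/system IDs used in the example ("MP2", "CH") are assumptions for illustration.

```python
# Sketch of the DAADS naming convention (Section 3.1): a three-letter
# trainer ID, fixed trainer-level suffixes, and per-system files built
# from a two-letter prefix plus a two-letter system ID.

TRAINER_SUFFIXES = ("TRNIN", "MFMST", "INSTM", "DESGD", "FDFIL")
SYSTEM_PREFIXES = ("SI", "SD", "SN")   # system info, system data, system narrative

def trainer_filenames(trainer_id):
    """Trainer master files, e.g. <TRAINERID>"TRNIN"."""
    return [trainer_id + suffix for suffix in TRAINER_SUFFIXES]

def system_filenames(trainer_id, system_id):
    """Individual system files, e.g. <TRAINERID>"SD"<SYSTEMID>."""
    return [trainer_id + prefix + system_id for prefix in SYSTEM_PREFIXES]

trainer_files = trainer_filenames("MP2")
system_files = system_filenames("MP2", "CH")
```

With these two helpers, a task can derive every filename it needs from the (trainer ID, system ID) pair the user supplies at sign-on, which is exactly what lets DAADS allocate its files dynamically.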

      ~

3 o 3.2 RECORD FORMATS DB #3fApplicable Contract / Specification Paragraphs This data set is to consist of the contract / spec. paragraph numbers and titles applicable to the system. The paragraph number is an eight ascii character I field, 48 characters are provided for the title. The

                     ' format for TSS entry is:

SID*PARG #

  • TITLE DB #4 Design Data Usage This data set is to contain the control index, rev, and source control numbers, title, source, and receive date for design data documents used in the system design. The user must enter only the control index and revision numbers; the design data merge will fill in.the remaining fields from the datalist master for TSS Entry:

SID**IND#**RV* DB #6 Malfunction Usage s The system design usage data set is to contain only the malfunction number for each malfunction applicable IJ k to the system. The cause and effects documents written by the test operator will be merged based on the malfunction number. TSS Format: SID*MF#* COMMENTS

  • DB # 8 Remote Function Remote functions applicable to the system are listed under this data set. Four characters are provided for the remote function number, 60 for the title, 24 for the. range / units, and 5 for COMMENTS.

TSS. Format: SID*NUM.* TITLE Range

  • COMMENTS
  • I e Rev.: 2

( Date: 11/09/88 Page: 8.2-4 of 10 NSEM-3.02

DB #9  Design Assumptions

Design assumptions with item numbers are to be listed under this data set. Each assumption may have one and only one associated data request number. TSS format:

SID*ITEM*TC*
Text Line
Text Line

DB #10  Design Simplifications

Simplifications made in the system design are to be listed under this database. TSS format:

SID*ITEM*TC*
TEXT
TEXT

Somewhat different formats, based on wide listings, were used on Hatch. At some point we hope to translate that data into the new formats with the more elegant listing.

DB #11  Instrument Usage Records

Each instrument addressed by the system is to be listed in this data set. On Hatch the tag number field is not provided and must be filled in by the instrument merge for each report. There are six type codes to identify different groups of instruments:

Type 1 - Meters (Table 4.1)
Type 2 - Recorders (Table 4.2)
Type 3 - Controllers (Table 4.3)
Type 4 - Lights (Table 4.4)
Type 5 - Switches (Table 4.5)
Type 6 - Miscellaneous (Table 4.6)

TSS format:

SID*T*ITEM#*TAG NUMBER*PANEL*Range/Description*

DB #13  Annunciators

Each annunciator addressed is listed with box number, window number, a one-line description, and COMMENTS (Table 4.7). TSS format:

SID**BOX NUMBER**WINDO*COMMENTS*
Description

DB #14  PCM Monitored Parameters

SID*ITM*UNITS*COMMENTS*
Parameter Name

DB #15  PPC Monitored Parameters

SID*ITM*PPC POINT ID*UNITS*COMMENTS*
Parameter Name

DB #25-28 -- Component Information Tables

DB #25  Air-Operated Valves (Table 5.1)

Tag No.              14
Description          57
Vlv. Pos. Symbol     12
Failure Mode          3
Control Air Press     8
Control Power        12
Open/Close Time       8

SID*TAG NO.*DESCRIPTION*VLV.POS.SYM*FM.*CAP*C.POWER*0/C T*

DB #26  Solenoid/Motor Valve (Table 5.2)

Tag Number            14
Description           57
Solenoid Deenerg St    5
Valve Pos. Symbol     12
Power                 12
Open/Close Time        8

SID*TAG NO.*DESCRIPTION*S.D.S.*VLV.POS.SYM.*POWER*0/C TIME*

DB #27  Pump Motor (Table 5.3)

Tag Number            14
Description           57
Motor Breaker Symb.   12
Bus Power             12
Control Power         12
Operating Current      6
Pump Up/Down Time      8

SID*TAG NO.*DESCRIPTION*MOTOR BR. SYM*BUS POWER*CNTRL.PWR*OP.CUR*UP/DWN T*

DB #28  Meter/Transmitter (Table 5.4)

Tag Number            14
Description           57
Range                 12
Meter Power Symbol    12
Trans. Power Symbol   12
Meter Light Pwr. Sym. 12

SID*TAG NO.*DESCRIPTION*RANGE*MTR PWR*TRANS.PWR*MTR LGHT PWR*
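The TSS record formats above all share one shape: fields delimited by "*", with the system ID as the first field. A minimal parser, with assumed field names and illustrative sample records (the remote-function entries shown are invented for the example, not taken from a real trainer file), could look like:

```python
# Illustrative sketch: parse "*"-delimited TSS records (Section 3.2) and
# keep only those whose system ID matches the current system, as the
# master-file input path does (see 3.1.3 and input choice "M" in 4.3).

def parse_tss_record(line):
    """Split one TSS-format record; the first field is the system ID."""
    fields = [f.strip() for f in line.split("*")]
    return {"sid": fields[0], "fields": fields[1:]}

def records_for_system(lines, system_id):
    """Filter parsed records down to the current system ID."""
    records = [parse_tss_record(line) for line in lines]
    return [r for r in records if r["sid"] == system_id]

# Hypothetical DB #8 (remote function) entries for two systems:
lines = [
    "RC*RCR01*PRESSURIZER VENT VALVES*OPEN/CLOSE*",
    "CH*CHR01*VCT MAKEUP MODE SELECT*AUTO/MAN*",
]
matches = records_for_system(lines, "RC")
```

A full implementation would additionally enforce the fixed field widths (e.g. 4 characters for the remote function number, 60 for the title) before inserting records into the system data file.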

4.0 DAADS FUNCTIONS

4.1 INTRODUCTION

DAADS runs as an interactive terminal task with menu selection displays and prompts requesting specific inputs. Upon signing on, a trainer ID and system ID must be entered. After a valid pair of IDs has been entered, a main tableau of available functions is displayed. A few rare functions are left off of this menu, as they should not be used if not properly understood. Individual functions are described in the following subsections. Note that all input, edit, and select functions apply only to the data sets stored in the "SD" file for the current system.

4.2 PASSWORDS

Passwords are defined upon defining each system to DAADS. The system password will be requested upon the first attempt to edit or input data after signing on to a system. Passwords may be changed by responding "$" to the main DAADS tableau. They will also be requested for design reports, and to initialize the system file. A special password is required to initialize the trainer info file.

4.3 INPUT FUNCTIONS

The input functions are selected by the key letter "I" and a data base number input to the main DAADS tableau. If a valid number is entered, then the input subroutine will request a choice of media. The choices are:

"C"  For direct input through the CRT. The system will request each subfield in turn and, when finished, return to the media selection prompt.

"D"  For disc file. The system will request the name of an existing permanent disc file and, if found, will read the contents of that file directly into the selected system data set.

"M"  For TSS master files. The system will request the name of a disc file as on input function D. The file will be read using the appropriate TSS formats, and only records in which the system IDs match the current system ID will be read into the local database.

"T"  For tape. The system will attempt to allocate the tape drive and read input from a data tape.

"C"  For card reader. The system will allocate the card reader and read records, one subfield per card, into the local system data file.

"E" or "X"  Return; no further input.

4.4 EDIT FUNCTION

The edit function is selected by entering "M" and a valid data base number in response to the main tableau. The system will then enter a subroutine to select subsets of the selected data set. When the desired set of records has been selected, return to the edit routine by responding "S" to the selection prompt. Edit functions will then be offered; these include display of the selected records. The records may be modified by subfields, either singly or for the entire group at one time.

4.5 LIST FUNCTION

The list function may be selected to produce listings one local data set at a time. If the selected data set is one of those stored in the system data file, the select routine is entered to select subsets of the defined records. After selection is complete, the system will offer to sort the selected records on any of the subfields defined for the record type, perform the standard sort used on the design reports, or list the records in the order they are stored in the system data file (i.e., no sort). The TSS files listed as the system narrative or math model may also be listed with this function.

Rev.: 2    Date: 11/09/88    Page: 8.2-8 of 10    NSEM-3.02
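The three listing orders offered by the list function (sort on a chosen subfield, the standard design-report sort, or stored order) can be sketched as follows. This is an illustrative Python sketch; the record fields and the composition of the "standard" sort are assumptions, not the DAADS internals:

```python
# Illustrative sketch of the list-function ordering options (section 4.5):
# sort on any subfield, apply the "standard" design-report sort, or list
# the records in stored order (no sort). Record layout is hypothetical.

records = [
    {"tag": "CV-02", "desc": "letdown valve"},
    {"tag": "CV-01", "desc": "charging pump"},
]

def list_records(records, subfield=None, standard=False):
    if standard:                 # stand-in for the design-report sort
        return sorted(records, key=lambda r: (r["tag"], r["desc"]))
    if subfield is not None:     # sort on a caller-chosen subfield
        return sorted(records, key=lambda r: r[subfield])
    return list(records)         # stored order, i.e. no sort

print([r["tag"] for r in list_records(records, subfield="tag")])  # → ['CV-01', 'CV-02']
```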


4.6 SELECTION SUBROUTINE

The selection subroutine used by the list and modify functions allows the user to select either the entire data set or a subset based on iterative application of relational operators to the subfields of the current group.

4.7 DESIGN REPORTS

The report function is selected by responding "R" to the main DAADS prompt. The system asks for a report type selection (1, 2, or 3) for the design report. The system password will be requested if it has not already been provided during the current session. The design reports include a title page and table of contents, followed by listings of all appropriate databases for the report.

4.8 DAADS FILE INITIALIZATION

Each of the internal DAADS data files must be initialized before use by DAADS. The trainer name and contract specification title must be entered upon initialization of the trainer info file. Each system must be defined and given a password after initialization of the trainer info file. The sequence to initialize the trainer info master file is started by entering a "$" in response to the system ID prompt. Each system must have its internal files initialized before usage. The system initialization sequence is initiated by entering a "$" on the main DAADS tableau. The system file must be initialized with a system title and allocations for the maximum number of entries to be contained in each of the internal data sets. The trainer file may be re-initialized without re-initializing each system.

4.9 OUTPUT FUNCTION

Any of the internal data files may be output in the TSS formats with the output function. This function is initiated by entering "O" on the main DAADS tableau. The system will then ask for the data base number to be dumped and the name of a previously created permanent disc file to write to. This function will be extended to dump either all databases for individual systems, or one data base for all systems.
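The selection subroutine of section 4.6 narrows the current group by iteratively applying relational operators to subfields. A minimal sketch, assuming a dictionary-per-record layout and a small operator set (both assumptions for illustration, not the DAADS internals):

```python
# Sketch of the selection subroutine (section 4.6): the user narrows the
# current group by iteratively applying relational operators to subfields.
# Operator spellings and record layout are illustrative assumptions.
import operator

OPS = {"=": operator.eq, "<": operator.lt, ">": operator.gt, "#": operator.ne}

def select(group, criteria):
    """Apply (subfield, op, value) tests one after another; each test
    filters the group left by the previous one."""
    for subfield, op, value in criteria:
        group = [r for r in group if OPS[op](r[subfield], value)]
    return group

group = [
    {"id": "V1", "setpoint": 100},
    {"id": "V2", "setpoint": 250},
    {"id": "V3", "setpoint": 400},
]
# keep records with setpoint > 150, then setpoint < 300
print(select(group, [("setpoint", ">", 150), ("setpoint", "<", 300)]))
```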

4.10 CHANGE

Changing from one system to another can be done on DAADS by entering a "C" on the main DAADS tableau. DAADS will de-allocate the system files currently in use, request a new system designator, and allocate the appropriate files for the newly selected system. This function has some subtle bugs which will cause the system to abort after certain combinations of actions, including changing system designators and a number of other file allocations and de-allocations.

5.0 BDAADS

BDAADS is provided as a batch program to generate design reports without having an operator waiting at a terminal. The filename for this JCL is BDAADS.

Attachment 8.3
Computer Program Coding

FORTRAN LANGUAGE COMPUTER PROGRAMS:

1. All comment lines shall start with 'C**'.

2. Once a year (December), all unnecessary comment lines shall be deleted for the purpose of clarity.
3. An SDC summary page shall be added after the module description page but before the equation number page in every program. There shall be 40 entries (lines) per SDC summary page. Each line describes the SDC number, the DR number, the date of change, the name of the engineer, the equation number, and a brief description. If an SDC contains only one DR, then use one line to describe the SDC. If an SDC contains more than one DR, then use one line to describe each DR. If more than one equation in a module is affected in a DR work, enter the first affected equation number in the SDC change summary. Indicate other affected equations in the first affected equation. If more than one module is affected in a DR work, similar documentation shall be entered in each affected module. SDC changes shall also be documented in the affected equation(s).

4. For simple changes such as power supply modifications, the outdated code may be erased entirely, but both the previous and the current listings shall be included in the SDC file. For complicated changes, the outdated code may be commented out and new code inserted. A note indicating how long the previous code should be kept shall also be entered.
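Rule 1 above (comment lines must begin with 'C**') is mechanical enough to check automatically. The following is a hypothetical lint pass, not part of the NSEM, and the sample FORTRAN listing is invented; it only flags fixed-form comment lines (column-1 'C') that lack the required stars:

```python
# Hypothetical lint check for the FORTRAN coding rule above: every
# comment line (column-1 'C' in fixed-form FORTRAN) must start 'C**'.

def bad_comment_lines(source):
    """Return 1-based line numbers of comment lines not starting 'C**'."""
    bad = []
    for n, line in enumerate(source.splitlines(), start=1):
        if line.startswith("C") and not line.startswith("C**"):
            bad.append(n)
    return bad

listing = (
    "C** SDC 123 / DR 456 - flow equation update\n"
    "      FLOW = K * SQRT(DP)\n"
    "C old comment, missing the required stars\n"
)
print(bad_comment_lines(listing))  # → [3]
```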


ASSEMBLY LANGUAGE COMPUTER PROGRAMS:

1. All comment lines shall start with '*(initial)(rev#)'.

2. Once a year (December), all unnecessary comment lines shall be deleted for the purpose of clarity.

3. A 'Revision history' page shall be added after the module description page but before the program coding in every program. There shall be 40 entries (lines) per 'Revision history' page. Each line describes the SDC number, the DR number, the date of change, the revision initial, and comments. If an SDC contains only one DR, then use one line to describe the SDC. If an SDC contains more than one DR, then use one line to describe each DR. If more than one module is affected in a DR work, similar documentation shall be entered in each affected module.
4. If it is desired to keep certain comments for a longer period, a note indicating how long the previous code should be kept should also be entered.



NSEM-3.02
Attachment 8.4
MARGINAL NOTE DIRECTORY

1. Deleted inappropriate references.
2. Inserted standard definition.
3. Corrected typo/grammar.
4. Corrected title.
5. Added/Revised.
6. Renumbered.
7. Deleted equation numbers.


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-4.01
VERIFYING SIMULATOR CAPABILITIES VIA SYSTEM TESTS

Responsible Individual: Manager, Operator Training Branch
Approved: (signature), Nuclear Training
Revision: 1
Date: May 11, 1989
SCCC Meeting No: N'00f

1.0 PURPOSE

The purpose of this procedure is to develop system tests for plant systems using the following as inputs: (1) Hardware Lists from NSEM-2.01, (2) Software Flowpath Lists and Simulation Drawings from NSEM-2.01, and (3) Remote Function Lists from NSEM-2.02. This procedure will also define the guidance necessary to develop a "Cause and Effect" document for all Simulator Malfunctions in a consistent format.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 INPO Good Practice TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.6 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.7 NTDD-17, Simulator Certification and Configuration Management Control.

Rev.: 1    Date: 5/11/89    Page: 1 of 20    NSEM-4.01

3.8 INPO 86-026, Guideline for Simulator Training, October, 1986.

3.9 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the STSB to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Simulation System Diagram (SSD) - Functional representation of the simulator modeling for a given system.

4.3 System Test - A test developed for each modeled plant system that ensures proper response of all control board instrumentation, controls, annunciators, PPC points, remote functions, flowpaths and components that are associated with an individual plant system.

4.4 Remote Function - An instructor-initiated input to the simulator model which will provide the same discernible effects as the corresponding manual operation in the reference plant.

4.5 Certified Remote Function - Those remote functions which will be tested to work correctly and may be used in simulator training and exams.

4.6 "Cause & Effect" Document - A description of the simulator response (effect) to the insertion of a specific malfunction or malfunctions. Each malfunction description also contains the physical "cause" of the malfunction as well as a description of the significant effects on plant operation due to the malfunction.

4.7 Qualified Instructor - An Instructor designated by the ASOT to perform a system test.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning Simulator Instructors to write, perform and document simulator system tests.

5.1.2 Responsible for reviewing and approving the individual system tests prior to the performance of the test.

5.1.3 Responsible for review and acceptance of each completed system test.

5.1.4 Responsible for scheduling the accomplishment of all system tests on a continuous four year basis.

5.1.5 Responsible for approval of the malfunction "Cause & Effect" document.

5.2 Simulator Instructors

5.2.1 Responsible for writing and conducting system tests required for simulator certification.

5.2.2 Responsible for writing Deficiency Reports (DRs) and retests for steps which do not respond as expected during the system test.

5.2.3 Responsible for documentation of completion of each step in the system test.

5.2.4 Responsible for writing individual malfunction "Cause & Effect" descriptions.

6.0 INSTRUCTIONS

6.1 Writing of System Tests

Note: The purpose of this section is to establish a consistent content and format for the writing of system tests.

6.1.1 The number of systems to be tested will be selected by the ASOT for each simulator. Documentation of the selected systems will be shown on Attachment 8.1 (MP1), Attachment 8.2 (MP2), Attachment 8.3 (MP3) or Attachment 8.4 (CY). A sequential number shall be assigned to each unique system test (see Attachment 8.2 for an example).


6.1.2 The selection of systems to be tested shall result in including the following:

6.1.2.1 All control board hardware, including annunciators, contained in the NSEM 2.01 Hardware List (except hardware listed on Figure 7.4 of NSEM 2.01, which is not needed for training) shall be tested.

6.1.2.2 All Software Flowpaths contained in the NSEM 2.01 Software Flowpath Lists (except software flowpaths listed on Figure 7.8 of NSEM 2.01, which are not needed for training) shall be tested.

6.1.2.3 All components which are part of a required software flowpath shall be tested. Each required flowpath shall be reviewed on the Simulation System Diagram (SSD) for that system, and all components (pumps, valves, heat exchangers, PPC points, pressure indicators, temperature indicators, flowmeters, ammeters, handswitches, indicating lights, etc.) that make up the flowpath shall be tested.

6.1.2.4 All systems which are Electronic Control Systems (such as RPS, ESAS, EHC, etc.) shall be tested. These types of systems (see NSEM 2.01, 6.3.1.3(e) Note and 6.4.2) did not have SSDs reviewed per NSEM 2.01. SSDs for these systems shall be reviewed to ensure no errors are present on the SSDs, as well as to serve as an input to the writing of the system test.

6.1.2.5 The ASOT shall add any additional system tests that he deems appropriate, such as a Process Computer System Test or Reactor Core Physics System Test.

6.1.3 An individual System Test shall be written to ensure that all of the following are tested to work correctly:

6.1.3.1 All annunciators contained on the NSEM 2.01 Hardware List (except any annunciators listed on Figure 7.4 of NSEM 2.01) associated with an individual system shall be tested to work correctly. There may be some judgment required on some annunciators as to what system test they go in. This is up to the ASOT to decide, as long as every annunciator is tested in at least one of the system tests.

6.1.3.2 All Control Board Hardware contained in the NSEM 2.01 Hardware List (except any hardware listed in Figure 7.4 of NSEM 2.01) associated with an individual system shall be tested to work correctly. Control Board Hardware to be tested includes handswitches, meters, pushbuttons, indicating lights, recorders, controllers, etc. There may be some judgment required on some Control Board Hardware as to what system test they go in. This is up to the ASOT to decide, as long as all control board hardware is tested in at least one of the system tests.

6.1.3.3 All Plant Process Computer (PPC) points associated with an individual system should be tested to work correctly. To accomplish this, all PPC points listed on the SSD for that system shall be tested. Further, the plant-specific PPC point listing should be reviewed to determine additional PPC points that need to be tested for an individual system. PPC points include digital inputs, analog inputs and pulse inputs.

6.1.3.4 All Software Flowpaths contained in the NSEM 2.01 Software Flowpath List (except any flowpaths listed in NSEM 2.01 Figure 7.8) associated with an individual system shall be tested to work correctly. Further, all components which are part of a software flowpath shall be tested. Each required flowpath shall be reviewed on the SSD for that system, and all components (pumps, valves, heat exchangers, PPC points, pressure indicators, temperature indicators, flowmeters, ammeters, handswitches, indicating lights, etc.) that make up the flowpath shall be tested. There may be some judgment required on some software flowpaths and/or components as to what system test they go in. This is up to the ASOT to decide, as long as every software flowpath/component is tested in at least one of the system tests.

6.1.3.5 All remote functions that are identified as requiring certification in the NSEM 2.02 Remote Function List, associated with an individual system, shall be tested to work correctly. There may be some judgment required on some remote functions as to what system test they go in. This is up to the ASOT to decide, as long as every remote function to be certified is tested in at least one of the system tests.

6.1.4 Testing Methodology and Acceptance Criteria for those items listed in Step 6.1.3 shall be:

6.1.4.1 The operation of all annunciators (except any annunciators listed on Figure 7.4 of NSEM 2.01) for a given system shall be verified to actuate at the correct setpoint. Some annunciators will be identified as non-modeled annunciators. It is preferred that annunciators actuate at actual plant setpoints (and tolerances) as set by I&C, if possible. Alternatively, annunciator setpoints shall be verified to actuate at the Control Room Annunciator Book (CRAB) setpoint (or CRAB-referenced procedure). Setpoints having a basis in Tech Specs shall be within Tech Spec limits.
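The setpoint criterion of 6.1.4.1 reduces to checking that an observed actuation value lies within a tolerance band around the reference setpoint. A minimal sketch; the setpoint and tolerance numbers below are illustrative only, not plant or Tech Spec values:

```python
# Minimal sketch of the annunciator setpoint check in 6.1.4.1: the
# observed actuation point must fall within tolerance of the reference
# setpoint. Numbers below are invented for illustration.

def setpoint_ok(observed, reference, tolerance):
    """True if the annunciator actuated within tolerance of its setpoint."""
    return abs(observed - reference) <= tolerance

# e.g. a hypothetical high-pressure alarm at 2385 psig, +/- 10 psig
print(setpoint_ok(observed=2391.0, reference=2385.0, tolerance=10.0))  # → True
print(setpoint_ok(observed=2350.0, reference=2385.0, tolerance=10.0))  # → False
```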

6.1.4.2 The operation of all Control Board Hardware (except any listed on Figure 7.4 of NSEM 2.01) for a given system shall be tested. Handswitches shall be checked for proper operation in all positions: open, close, normal-after-open, normal-after-close, pull-to-lock, auto, standby, etc. Indicating lights associated with handswitch operation shall also be verified to be correct. Proper operation of meters (temperature, pressure, flow, amps, etc.) shall be verified over as much of their instrument span as could be reasonably expected during normal operations. Controllers shall be tested in both automatic and manual operation and shifted back and forth to ensure correct operation. Interlocks shall be tested to verify they inhibit operation when required to. The acceptance criteria for hardware testing shall be based on:

o Utilizing plant data, if available, to determine correct response.

o Utilizing the Simulator Design Data Base, such as P&IDs, electrical schematics, instrument loop drawings, operating procedures, etc., to determine correct response, or

o Utilizing (as a last resort) a qualified instructor to determine correct response based on his/her experience.

6.1.4.3 The operation of all PPC points for a given system shall be verified to be correct. Some PPC points will be identified as non-modeled PPC points. PPC points to be tested will consist of Analog, Digital and Pulse Inputs. Proper response of digital inputs will be demonstrated by cycling the digital input and observing the PPC response. An example would be to start and stop a pump and observe the PPC digital input change to follow pump status. The acceptance criteria for PPC analog and pulse point testing shall be based on performing simulator control manipulations and observing the PPC point change consistent with one of the following:

o The PPC point shall track with control board indication if the PPC point is from the same instrument.

o If plant data is available, it may be used to determine correct response.

o The Simulator Design Data Base (P&IDs, electrical schematics, instrument loop drawings, operating procedures, etc.) may be used to determine correct response.

o A qualified instructor (as a last resort) may determine correct response based on his/her experience.

6.1.4.4 All Software Flowpaths contained in the NSEM 2.01 Software Flowpath List (except any flowpaths listed in NSEM 2.01 Figure 7.8) associated with an individual system shall be tested to work correctly. All components which are part of a software flowpath shall also be tested. Testing in this section may be performed in conjunction with any other testing listed in section 6.1.3.

The acceptance criteria for this testing shall be based on:

o Utilizing plant data, if available, to determine correct response. For safety-related equipment, available ISI data shall be used.

o Utilizing the Simulator Design Data Base, such as P&IDs, electrical schematics, instrument loop drawings, operating procedures, etc., to determine correct response.

o Utilizing (as a last resort) a qualified instructor to determine correct response based on his/her experience.

6.1.4.5 The operation of all remote functions to be certified for a given system shall be verified to be correct. Variable remote functions (i.e., 0 to 100%, or 55 F to 85 F) should be checked at a minimum of 2 values and preferably a third value. The minimum 2 values should be at or near the endpoints of the variable remote function range. Two or three position (such as on/off or automatic/standby/off) remote functions shall be checked for proper operation in all positions. Since many remote functions are for valves, valve testing should include valve stroking times. Acceptance criteria for variable or 2/3 position remote functions shall be based on:

o Utilizing plant data, if available, to determine correct response.

o Utilizing the Simulator Design Data Base, such as P&IDs, electrical schematics, instrument loop drawings, operating procedures, etc., to determine correct response.

o Utilizing (as a last resort) a qualified instructor to determine correct response based on his/her experience.

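The value-selection guidance of 6.1.4.5 (check a variable remote function at least at the two range endpoints, preferably at a third value) can be captured in a small helper. Python sketch; the helper name and the choice of the midpoint as the third value are assumptions, and the ranges mirror the examples in the text:

```python
# Sketch of test-point selection for a variable remote function per
# 6.1.4.5: at least the two endpoints of the range, preferably a third
# value (the midpoint is used here as a convenient choice).

def remote_function_test_points(low, high, include_third=True):
    """Return the values at which to exercise a variable remote function."""
    points = [low, high]
    if include_third:
        points.insert(1, (low + high) / 2.0)   # assumed third value
    return points

print(remote_function_test_points(55.0, 85.0))         # → [55.0, 70.0, 85.0]
print(remote_function_test_points(0.0, 100.0, False))  # → [0.0, 100.0]
```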
6.1.5 Each system test will have as its first page a Figure 7.1 (System Test Cover Sheet) for that particular system.

6.1.6 The first two steps of each system test shall be a listing of all PPC points and annunciators which will be tested during that system test (including non-modeled points to be checked).

6.1.7 System tests will be written in a logical order to efficiently allow testing of all components, flowpaths, hardware, annunciators, remote functions and PPC points associated with that system. System tests will consist of a series of numbered steps to be executed in sequence.

6.1.8 Each system test step or substep shall have a space for initials by the performer upon successful completion of the step or substep. Each system test step or substep shall have a space for a DR number to be written in by the performer of the test, if the step or substep was unsuccessful. The location of the spaces for successful completion initials or DR number shall be in the body of the system test, on the right-hand side of the system test page, next to the respective step or substep number.

6.1.9 To provide flexibility, PPC points or annunciators may be tested in any sequence during the system test.

6.1.10 The instructor station response (I/O override, etc.) will not be tested during any system test. Instructor station features will be tested elsewhere.

6.1.11 The Figure 7.1 cover sheet for each system test shall contain the system test title, attachment number, revision, developer's signature/date and ASOT's signature/date to release the test for performance.

6.1.12 The last 2 steps of any system test should contain a listing of any non-modeled annunciators and PPC points that should be checked to not be in alarm.

6.1.13 Prior to writing a system test, review of the Simulator System Documentation Manual for that system would be useful. This will identify, for that system, non-modeled points/annunciators and any simplifications that were assumed on that system.

6.2 Performance of the System Test

6.2.1 The assigned simulator instructor will complete the system test as follows:

6.2.1.1 Establish the required simulator conditions.

6.2.1.2 Accomplish each step and document its completion on the system test in the appropriate column next to the step.

6.2.1.3 During the accomplishment of any test step, if a discrepancy is identified, a DR will be written for its correction. The DR will reference both the system test and the step number that failed. The DR number will be written on the system test in the appropriate column next to the step.

6.2.2 Any supporting information such as PPC printouts or other documentation shall be attached to the system test. Any attached documents shall be referenced in the "remarks" section of the Figure 7.1 cover sheet. The simulator instructor who performs the test shall sign/date the Figure 7.1 cover sheet and indicate if any comments are attached.

6.2.3 At the completion of the system test, it will be reviewed by another simulator instructor to ensure that all steps are completed and initialed as required.

6.2.4 During this review, a summary of all instructor comments will be compiled to assist in the completion of this test during its next scheduled performance or in revising the test as necessary.

6.2.5 When the instructor review is complete, the instructor performing the review should sign/date the Figure 7.1 cover sheet. The completed system test will be submitted to the Assistant Supervisor Operator Training for final review and approval.

6.2.6 All completed system tests and attached printouts or comments will be maintained in a neat, tabbed binder(s) with other system tests. Completed System Tests will be forwarded to Controlled Document Storage (CDS) with other Simulator Certification Records.

6.2.7 Revision levels and dates for system tests shall be assigned by the ASOT when he signs the Figure 7.1 cover sheet to release the test for performance. This applies only to the system tests, as described in NSEM 1.01.

6.3 Development and Maintenance of "Cause & Effect" Document for Simulator Malfunctions (Appendix A.1, A.2, A.3, & A.4)

6.3.1 A "Cause & Effect" document shall be maintained for each NU Plant Referenced Simulator. They shall be contained as:

o Appendix A.1 Millstone 1
o Appendix A.2 Millstone 2
o Appendix A.3 Millstone 3
o Appendix A.4 Connecticut Yankee

6.3.2 The "Cause & Effect" document for each unit shall be maintained current and up to date for use in training or exams.

6.3.3 The format of each malfunction "Cause & Effect" shall be identified with 6 distinct headings:

o Malfunction Title
o Malfunction Type
o Malfunction Cause
o Plant Status
o Malfunction Effects
o References

Each of these 6 headings shall be further defined as follows:

6.3.4 Malfunction Title

. The Malfunction Title shall be a clear, concise statement describing the malfunction. If the malfunction is generic, each component of the generic malfunction shall be listed under the title.

. A unique 4-character (2 letter, 2 number) identifier shall precede the title (example: CV01). The 2 letters shall correspond to the system in which the malfunction is modeled. The numbers are sequential from 01 to 99 for each system.

6.3.5 Malfunction Type

. The 3 types of malfunctions are as follows:

. Discrete - a malfunction that can be entered for only 1 specific piece of equipment and is not variable.

. Generic - a malfunction that can be entered for more than 1 similar piece of equipment and may or may not be variable.

. Variable - a malfunction that can have a range of values up to a specified maximum.

Examples are:

                            - " Variable 100t=500 acm @ 2000 PSIG"
                            - "Verieble 10 0 3, = 5 7 5" E 01 - 5 0 0
  • r "

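The identifier convention of 6.3.4 (two system letters followed by a two-digit sequence number from 01 to 99, e.g. CV01) can be validated with a short pattern. Hedged Python sketch; it checks only the format, not whether the two-letter system code is an actual modeled system:

```python
# Validator for the malfunction identifier format of 6.3.4: two letters
# (system code) plus a two-digit number 01-99, e.g. "CV01". Only the
# format is checked; real system codes are not enumerated here.
import re

ID_PATTERN = re.compile(r"^[A-Z]{2}(0[1-9]|[1-9][0-9])$")

def valid_malfunction_id(identifier):
    """True if identifier matches the 2-letter / 2-digit (01-99) format."""
    return ID_PATTERN.fullmatch(identifier) is not None

print(valid_malfunction_id("CV01"))  # → True
print(valid_malfunction_id("CV00"))  # → False (00 is outside 01-99)
print(valid_malfunction_id("C101"))  # → False (first two chars must be letters)
```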

6.3.6 Malfunction Cause

. State the specific failure that caused the malfunction. This failure should be as specific as possible and should be identifiable in the reference material. Examples are:

- "failure of 27X relay", "failure of 86 relay contact 1-2", or "failure of piping downstream of LD-MOV-200"

. If more than 1 possible failure could cause the malfunction, list as many additional failure causes as possible so as to allow diversification in the training process.

6.3.7 Plant Status

. Plant status is defined as the initial condition of the plant from which the effects are based (i.e., power operation, 20% power, on shutdown cooling, etc.).

6.3.8 Malfunction Effects

. The malfunction effects section shall be written in sufficient detail to facilitate development of training materials and exams.

. The malfunction effects shall consist of a general description of the sequence of events that occur when the malfunction is inserted from the plant initial condition. Generally, the effects should be written assuming no operator action.

. The malfunction effects shall include: (1) effects on major equipment, (2) effect on control board indication, (3) major annunciators that would be expected to alarm, (4) what student actions could mitigate the malfunction, (5) simulator response to malfunction removal, (6) generally, what procedures and technical specifications are impacted, and (7) plant condition (for example, tripped or not tripped due to the malfunction) or operational restrictions that may result.


    /~'

k/ ' i .. 'Where operator action is assumed, the E specific. actions and. time frame should be included. If plant data or best estimate data is available it should be used to describe the effects on major plant parameters as the event progresses.

. For generic malfunctions, a complete description shall be given for one of the choices and the differences shall be described for the remaining choices.

. For variable malfunctions, the effect of varying the severity should be qualitatively described. In general, it is best to give the effects of a 100% severity response in the "effect" description.

. For those malfunctions that do not produce any immediate diagnostic clues, some discussion should be present as to what diagnostic indication will occur and at what approximate time.
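The required elements above lend themselves to a simple completeness check when drafting "Cause & Effect" entries. A minimal sketch in Python; the class and field names are hypothetical illustrations mirroring sections 6.3.6 through 6.3.8, not anything defined by NSEM-4.01:

```python
# Illustrative sketch only: NSEM-4.01 defines the "Cause & Effect" document
# as prose, not code. All names here are hypothetical.
from dataclasses import dataclass, field

# The seven required effect elements from section 6.3.8.
REQUIRED_EFFECTS = [
    "major_equipment",           # (1) effects on major equipment
    "control_board_indication",  # (2) effect on control board indication
    "annunciators",              # (3) major annunciators expected to alarm
    "mitigating_actions",        # (4) student actions that could mitigate
    "removal_response",          # (5) response to malfunction removal
    "procedures_tech_specs",     # (6) procedures and tech specs impacted
    "plant_condition",           # (7) e.g., tripped or not tripped
]

@dataclass
class CauseAndEffectEntry:
    malfunction_id: str
    causes: list       # one or more specific physical failures (section 6.3.6)
    plant_status: str  # initial condition the effects are based on (6.3.7)
    effects: dict = field(default_factory=dict)

    def missing_effects(self):
        """Return the required effect elements not yet written."""
        return [k for k in REQUIRED_EFFECTS if not self.effects.get(k)]

entry = CauseAndEffectEntry(
    malfunction_id="CV-01",
    causes=["failure of 27X relay"],
    plant_status="100% power, equilibrium xenon",
    effects={"major_equipment": "charging pump trips",
             "annunciators": "charging flow low"},
)
print(entry.missing_effects())
```

A drafting aid of this kind would flag the five effect elements still missing from the example entry before the description is submitted for approval.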

6.3.9 References

. References shall be listed for all material used to develop the "Cause & Effect" document. Examples are: P&ID's, Best Estimate Transient Analysis, Plant Drawings, Plant Data Book, Certification Procedures, etc.

6.3.10 An index shall be at the front of the Cause & Effect document containing a listing of all malfunctions in alphabetic/numerical order.

6.3.11 The cover page of the Cause & Effect document for each unit shall contain approval of the Assistant Supervisor of Operator Training for that unit.

6.3.12 Malfunction "Cause and Effect" descriptions shall be written in a format that allows STSB ease of incorporation of revised "Cause and Effect" malfunction descriptions into the Simulator System Documentation Manual. NSEM-3.02, Section 2.1 shall be referred to as an example.

Rev.: 1 Date: 5/11/89 Page: 15 of 20 NSEM-4.01

7.0 FIGURES

7.1 Simulator System Test Cover Sheet

8.0 ATTACHMENTS

8.1 Millstone 1 System Test Index (To be added later)

Rev.: 1 Date: 5/11/89 Page: 16 of 20 NSEM-4.01

8.2 Millstone 2 System Test Index

8.2.1  Service Water Test
8.2.2  Circulating Water Test
8.2.3  Turbine Building Closed Cooling Water Test
8.2.4  Reactor Building Closed Cooling Water Test
8.2.5  Safety Injection/Containment Spray Test
8.2.6  Chemical & Volume Control System Test
8.2.7  Condensate/Feedwater Test
8.2.8  Main Steam Test
8.2.9  Turbine Test
8.2.10 Electrical Generation Test
8.2.11 Electrical Distribution Test
8.2.12 Reactor Coolant System/Steam Generator Test
8.2.13 Reactor Protection/Nuclear Instrumentation Test
8.2.14 Engineered Safety Features Actuation System Test
8.2.15 Reactor Regulating Test
8.2.16 Control Element Drive System Test
8.2.17 Containment/Heating, Ventilation & Air Conditioning Test
8.2.18 Instrument/Station Air Test
8.2.19 Radiation Monitors Test
8.2.20 Waste Disposal Test
8.2.21 Plant Process Computer Test
8.2.22 Shutdown Cooling System Test
8.2.23 Reactor Core Test

Rev.: 1 Date: 5/11/89 Page: 17 of 20 NSEM-4.01

8.3 Millstone 3 System Test Index (To be added later)

Rev.: 1 Date: 5/11/89 Page: 18 of 20 NSEM-4.01

8.4 Connecticut Yankee System Test Index (To be added later)

8.5 Marginal Note Directory

Rev.: 1 Date: 5/11/89 NSEM-4.01

9.0 APPENDICES

A.1 Millstone 1 Simulator Malfunction "Cause & Effect" Document
A.2 Millstone 2 Simulator Malfunction "Cause & Effect" Document
A.3 Millstone 3 Simulator Malfunction "Cause & Effect" Document
A.4 Connecticut Yankee Simulator Malfunction "Cause & Effect" Document

Rev.: 1 Date: 5/11/89 Page: 20 of 20 NSEM-4.01


ATTACHMENT 8.5

MARGINAL NOTE DIRECTORY

1. Deleted requirement for revision level on references

Rev.: 1 NSEM-4.01 Date: 5/11/89 Page: 8.1-1 of 1


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.01

VERIFYING SIMULATOR CAPABILITIES VIA SYSTEM TESTS

Responsible Individual: Manager, Operator Training Branch
Approved: Director, Nuclear Training
Revision: 1
Date: May 11, 1989
SCCC Meeting No: N'OOI

1.0 PURPOSE

The purpose of this procedure is to develop system tests for plant systems using the following as inputs: (1) Hardware Lists from NSEM-2.01, (2) Software Flowpath Lists and Simulation Drawings from NSEM-2.01, (3) Remote Function Lists from NSEM-2.02. This procedure will also define the guidance necessary to develop a "Cause and Effect" document for all Simulator Malfunctions in a consistent format.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 INPO Good Practice TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.6 NUREG 1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.7 NTDD-17, Simulator Certification and Configuration Management Control.

Rev.: 1 Date: 5/11/89 Page: 1 of 20 NSEM-4.01

3.8 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.9 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the STSB to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Simulation System Diagram (SSD) - Functional representation of the simulator modeling for a given system.

4.3 System Test - A test developed for each modeled plant system that ensures proper response of all control board instrumentation, controls, annunciators, PPC points, remote functions, flowpaths and components that are associated with an individual plant system.

4.4 Remote Function - An instructor-initiated input to the simulator model which will provide the same discernible effects as the corresponding manual operation in the reference plant.

4.5 Certified Remote Function - Those remote functions which will be tested to work correctly and may be used in simulator training and exams.

4.6 "Cause & Effect" Document - A description of the simulator response (effect) to the insertion of a specific malfunction or malfunctions. Each malfunction description also contains the physical "cause" of the malfunction as well as a description of the significant effects on plant operation due to the malfunction.

4.7 Qualified Instructor - An instructor designated by the ASOT to perform a system test.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning Simulator Instructors to write, perform and document simulator system tests.

Rev.: 1 Date: 5/11/89 Page: 2 of 20 NSEM-4.01


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.02

INITIAL CONDITIONS

Responsible Individual: Manager, Operator Training Branch
Approved: Director, Nuclear Training
Revision: 0
Date: January 12, 1989
SCCC Meeting No.: 89-001

1.0 PURPOSE
The purpose of this procedure is to establish controls on Initial Conditions (IC's) used in operator training, NRC exams or simulator certification activities to ensure the IC's are representative of reference plant conditions.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 NUREG 1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.5 INPO 86-026, Guidelines for Simulator Training, October, 1986.

3.6 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

4.0 DEFINITIONS

4.1 Initial Conditions (IC's) - A set of analog/digital points that are stored on the Simulator's computers so that a starting point is available for a simulator session. Physical components (handswitches, relays, etc.) must also be manipulated to match the analog/digital initialization points (switchcheck).

Rev.: 0 Date: 1/12/89 Page: 1 of 11 NSEM-4.02

4.2 SRO Qualified Instructor - An instructor who is (or was in the past) an NRC licensed Senior Reactor Operator (or certified), who by nature of his training and experience has the knowledge to make decisions on proper plant system alignments for given operating conditions.

4.3 Axial Offset (A.O.) or Axial Shape Index (A.S.I.) - Common terms used in reactor core axial power distribution measurement. It is an index which describes the relative amount of power between the top and bottom halves of the reactor core.

4.4 Certified IC - An IC which has been reviewed by an SRO qualified instructor and verified to have control board and remote function conditions consistent with the reference plant under the same conditions.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning instructors to establish the initial list of Certified IC's and concurring with that initial list.

5.1.2 Responsible for assigning instructors to develop Attachment 8.1 (for MP1), Attachment 8.2 (for MP2), Attachment 8.3 (for MP3), or Attachment 8.4 (for CY) as appropriate.

5.1.3 Responsible for ensuring that the Attachment 8.2 (or 8.1, 8.3, 8.4 as appropriate) checklist has been reviewed for all certified IC's.

5.1.4 Responsible for ensuring that certified IC's remain updated.

5.1.5 Responsible for determining which certified IC's will be sent to the NRC for exam purposes.

5.1.6 Responsible for determining which Operator Instructor (preferably the Operations Consultant) shall be responsible for maintaining certified IC's up to date.

5.2 Operator Instructors

5.2.1 Assigned SRO qualified instructors (preferably the unit Operations Consultant) are responsible for determining the initial list of certified IC's.

Rev.: 0 Date: 1/12/89 Page: 2 of 11 NSEM-4.02

5.2.2 Assigned SRO qualified instructors (preferably the unit Operations Consultant) are responsible for ensuring Attachment 8.2 (for MP2), Attachment 8.1 (for MP1), Attachment 8.3 (for MP3), or Attachment 8.4 (for CY) as appropriate, is valid for each certified IC.

5.2.3 Assigned instructors are responsible for assisting the SRO qualified instructor in completing the responsibility of step 5.2.2.

5.2.4 Operator Instructors are responsible for using only certified IC's for Simulator training sessions, NRC exams or Simulator Certification Testing.

Note: It is the intent of this procedure for the ASOT to assign responsibility to the Operations Consultant (or other Operator Instructor) as the coordinator for maintaining certified IC's. Other instructors may change, add or delete certified IC's but are responsible for adhering to this procedure and ensuring the Operations Consultant (or other ASOT designated Instructor) is aware of their actions so that consistency can be maintained in all certified IC's.

5.2.5 Operator Instructors, preferably the unit Operations Consultant, are responsible for ensuring that certified IC's are kept up to date as Simulator Design Changes (SDC's) are implemented.

5.2.6 Operator Instructors are responsible for not deleting or changing certified IC's without the knowledge of either the ASOT or his designated instructor (preferably the Operations Consultant) responsible for maintaining certified IC's up to date.

6.0 INSTRUCTIONS

6.1 Obtain NSEM-2.02 Form 7.2, which contains the recommended IC's for certification.

6.2 An SRO qualified instructor, preferably the unit Operations Consultant, shall review the list of IC's from step 6.1 and add or delete IC's from this list based on his judgement.

6.3 The ASOT shall review the working list from step 6.2 and concur with the list.

Rev.: 0 Date: 1/12/89 Page: 3 of 11 NSEM-4.02

NOTE: At any given time, the number of certified initial conditions can vary. The purpose of sections 6.4 and 6.5 is to provide directions on how to certify the initial group of selected IC's. Directions for adding new certified IC's, deleting current certified IC's, or modifying current certified IC's will be discussed later in this procedure.

6.4 Generate the unit-specific checklist to be used for certifying IC's by performing the following steps.

6.4.1 Each unit shall generate a checklist to be used to guide the instructor in certifying an IC. The checklist shall be Attachment 8.1 for MP1, Attachment 8.2 for MP2, Attachment 8.3 for MP3 and Attachment 8.4 for CY. These attachments shall be controlled by the individual unit ASOT per NSEM 1.01, Sections 6.3.2 and 6.3.3.

NOTE: Attachment 8.1, 8.2, 8.3 or 8.4 shall hereafter be referred to as "the checklist" for simplicity. Attachment 8.2 is attached also as an example.

6.4.2 Section I of the checklist shall contain a listing of each control board or other simulation location such that the list covers all simulated components in the Simulator Room. Each item on the checklist shall have a blank space next to it for the instructor to use as a checklist. For each control board or simulator location, an SRO qualified instructor (preferably the Operations Consultant) shall review the alignment of all annunciators, switches, controllers, equipment in service, plant conditions, etc. to ensure that they correctly reflect the stated IC conditions. It is recognized that any given plant condition will have some equipment which could be correctly in several different configurations. It is acceptable for the SRO instructor to determine a most likely position for these components in establishing the IC.

6.4.3 Section II of the checklist shall contain a listing of all remote function systems with a blank space next to each to allow the instructor to use the form as a checklist. The SRO qualified instructor shall go through each remote function system and review all remote functions in that system for correct alignment consistent with the stated conditions of the IC being reviewed.

Rev.: 0 Date: 1/12/89 Page: 4 of 11 NSEM-4.02

NOTE: The following sections discuss stability of various parameters when resetting to an IC. It is understood that a specific IC could deliberately have non-steady-state conditions for one or more parameters for the purpose of training, and this is acceptable.

6.4.4 Section III of the checklist shall be organized as determined by the individual unit ASOT but shall contain, as a minimum, the following information:

6.4.4.1 Equilibrium Xenon, Steady State Power Level IC's

o For IC's which are Equilibrium Xenon, Steady State Power Levels, run the simulator for at least 2 minutes and verify that key plant parameters are steady and reasonable. Key plant parameter examples are: RCS cold leg temperature, RCS Tave, RCS Pressure, S/G Levels (PWR's), Delta-T Power and Nuclear Instrumentation Power, Calorimetric Power, Reactor Vessel Water Level (BWR's), Electric Power, Charging and Letdown Flows, Pressurizer Level (PWR's), S/G Feed and Steam Flows (PWR's), Xenon Reactivity Worths, Control Rod Positions and ASI (or Axial Offset).

o Verify consistent calibration of all power level instruments with calorimetric.

o Verify, using Xenon Fastime X60 for four minutes and keeping power level and CEA position constant, that Xenon reactivity worth does not change any more than a value determined by the individual unit ASOT (20 pcm for MP2).

o Verify, using Xenon Fastime X60 for four minutes and keeping power level and CEA position constant, that ASI (AO) does not change any more than a value determined by the individual unit ASOT. This is to ensure that a Xenon oscillation is not present in an "equilibrium, steady state" IC.

Rev.: 0 Date: 1/12/89 Page: 5 of 11 NSEM-4.02

NOTE: Xenon Fastime X60 for 4 minutes is used to simulate plant response over a 4-hour session in normal time due to changes in xenon.

o If during the previous step a significant unintentional or unacceptable Xenon oscillation is present, it should be damped with Xenon Fastime X60, holding power constant and using control rods to damp the oscillation at its midpoint.
l 6.4.4.2 IC's with reactor critical, but not covered by section 6.4.4.1. o For IC's with the reactor critical, but not covered by section 6.4.4.1, run the simulator for at least 2 minutes and verify'that key plant parameters are steady and/or reasonable. Key plant parameter examples are as stated above in section 6.4.:4.1. o verify consistent calibration of all power level instruments to the extent they would agree in the 1 - reference plant at the stated IC conditions, o verify using. Xenon rastime X60 for 4 minutes, keeping power level and CEA position constant, that xenon reactivity worth and ASI (AO) changes are observed to be consistent with the stated IC conditions and will not cause unreasonable conditions to occur for the operator. 6.4.4.3 IC's with reacter not critical and possibly the plant in various stages of heatup or cooldown. O Rev.: 0 Date: 1/12/89 Page: 6 of 11 NSEM-4.02 _mm____._____ -._ _ _ _ _.-__.__ _._____ _

E l... ] y. L ' o For IC's with the reactor not j#\, critical, run the simulator for at A-) least 2 minutes and verify that key plant parameters are steady and/or reasonable. Key plant parameters

                                          .               examples are: RCS Cold Leg Temp-erature, RCS T     , RCS Pressure, S/G Levels (PWhl s), Wide Range Power Level, Charging Flow, Letdown Flow, Pressurizer Level (PWR's),
,J                                                        S/G Feed Flows (PWR's), Xenon          l reactivity worth, 59C Flow (RHR Flow) etc.

o .Using Xenon Pastime X60 for 4 minutes observe that xenon reactivity worth change is consistent with IC description. 6.4.4.4 For all IC's ensure thAt any items mentioned in the IC description on the instructor station are correct and any key items not present on the instruc-tor station IC description are added. Examples of key items expected to be in the remarks section of the IC description are: BOL/MOL/EOL, Xenon f-- . Trend, time after reactor trip, ( unusual' Control Rod Position, unusual equipment lineups or any other item that helps the instructor to understand the starting point of the IC. 6.4.5 Section IV of the IC checklist shall contain a listing of items that require specific attention. These items may be a result of previous problems identified with IC's, or common confusion areas, as identified by the ASOT, Operations Consultant or other operator instructors. Items in this list may consist of remote functions, PPC points, system alignments, etc. If the Plant Process Computer is simulated, a list of key PPC displays that require verification may be included here. Any items listed in Section IV of the IC checklist shall be at the discretion of the ASOT and SRO instructor (preferably the Operations Consultant) who will certify the IC's. Rev.: 0 Date: 1/12/89 Phge: 7 of 11

                        ,                               NSEM-4.02
6.5 After the unit-specific IC checklist has been generated and the ASOT concurs with the checklist, the designated instructor (preferably the Operations Consultant) shall review each IC to be certified, using the checklist.

6.6 The IC checklist is not required to be retained as documentation. It is a tool to help the SRO qualified instructor review IC's. Successful completion of the checklist by the SRO qualified instructor constitutes certification of the IC.

6.7 For each certified IC, a capital "C" shall be placed in the "Remarks" section of the Initial Condition Description on the Instructor Station. The capital "C" shall be placed in the second line, last space, for consistency and clarity. This will allow instructors to know which IC's are certified. The instructor station IC screen will be maintained up to date as to which IC's are certified.
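The flag convention just described can be sketched as a pair of helper functions. This is an assumption-laden illustration: the 40-character remarks line width is a made-up instructor-station detail, and the procedure itself defines only the "capital C, second line, last space" rule:

```python
# Illustrative sketch of the section 6.7 certification-flag convention.
# LINE_WIDTH is a hypothetical instructor-station detail, not specified
# anywhere in NSEM-4.02.

LINE_WIDTH = 40  # assumed fixed width of a remarks line

def is_certified(remarks_lines):
    """True if the second remarks line has 'C' in its last space."""
    if len(remarks_lines) < 2:
        return False
    line = remarks_lines[1].ljust(LINE_WIDTH)
    return line[LINE_WIDTH - 1] == "C"

def mark_certified(remarks_lines):
    """Return a copy of the remarks with the certification flag set."""
    lines = list(remarks_lines)
    while len(lines) < 2:
        lines.append("")
    lines[1] = lines[1].ljust(LINE_WIDTH - 1)[:LINE_WIDTH - 1] + "C"
    return lines

remarks = ["100% POWER MOL EQUILIBRIUM XENON", "NO XENON TREND"]
print(is_certified(remarks))                  # False
print(is_certified(mark_certified(remarks)))  # True
```

Putting the flag in a fixed position is what makes the convention checkable at a glance, which is the stated reason for the "second line, last space" rule.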

6.8 It is recommended that one SRO instructor be common to all certified IC's reviewed (initially at least) to give as much consistency to the IC's as possible. Preferably this should be the unit Operations Consultant.

6.9 Deleting a certified IC

6.9.1 Ensure that the IC to be deleted has not been identified to the NRC as one available for an upcoming exam.

6.9.2 If a certified IC is meant to be deleted, the "C" is removed from the Instructor Station "Remarks" section for that IC.

NOTE: It is not the intent of this procedure to establish any historical traceability for certified IC's. It is the intent of this procedure to ensure that, for a Simulator training session (or Simulator Certification Testing), the IC utilized reflects reasonable plant conditions for the desired time in life, power level, plant temperature, etc.

6.10 Adding a certified IC

If a certified IC is to be added, an IC checklist is reviewed for that IC and, if successfully completed, a capital "C" will be added to the "Remarks" column on the instructor station for that IC in the second line,

Rev.: 0 Date: 1/12/89 Page: 8 of 11 NSEM-4.02

last space. Also, the "Remarks" column on the instructor station for that IC shall be reviewed to ensure it correctly represents the IC conditions and contains those items called for in the IC checklist.

6.11 Modifications to certified IC's (updating or reshooting IC's)

6.11.1 All certified IC's shall be maintained up to date as SDC's are implemented on the simulator, such that certified IC's have been updated prior to use in Simulator training, NRC exams or Simulator Certification.

6.11.2 As SDC's are implemented on the Simulator, the operator instructor performing the SDC retest (preferably the Operations Consultant) shall determine whether any IC's need to be updated.

6.11.3 If certified IC's need updating, they shall be done prior to being used for Simulator Training or Simulator Certification activities. When a certified IC is updated (except as discussed in 6.11.4), the IC checklist shall be reviewed for any areas that need to be reverified at the discretion of the SRO qualified instructor. The "Remarks" column of the modified IC shall be specifically reviewed to ensure it still agrees with the IC conditions and contains those items called for in the IC checklist.

6.11.4 The ASOT at his discretion may allow a training or certification session to proceed using a certified IC not fully updated provided:

o The instructors are aware of the effects of not having the IC updated.

o The certified IC not being updated will not cause negative training to operators or interfere with certification testing.

6.12 Common Situations Concerning IC's

6.12.1 Snapshots taken for Software/Hardware Troubleshooting. These IC's need not be controlled in the manner covered by this procedure.

6.12.2 IC's that are continuations of a previous Training Session.

Rev.: 0 Date: 1/12/89 Page: 9 of 11 NSEM-4.02

Some training sessions are long enough to require a snapshot be taken to continue a training session later. These IC's need not be controlled in the manner covered by this procedure. A training session by its very nature results in a variety of control manipulations that could place the plant, correctly or incorrectly, in a substantially different configuration than when the session started. When a snapshot of this type is taken, no action is required and a "C" shall not be placed in the IC "Remarks" section.

6.12.3 Creating a New Certified IC with Minor Modifications from an Already Existing Certified IC.

If a certified IC is reset, and then minor changes are made and a snapshot is then taken, the snapshot IC may be certified by:

o Reviewing the IC checklist for any areas that need to be verified. This is at the discretion of the SRO qualified instructor.

o Adding a capital "C" to the "Remarks" column on the instructor station for that IC in the second line, last space.

o Ensuring that the "Remarks" column of the new IC agrees with the IC conditions and contains those items called for in the IC checklist.

6.12.4 Curriculum Testing

Curriculum testing does not require a certified IC. Curriculum testing by its nature could be exercising very limited systems or components, and the judgement of the instructor is considered adequate for this purpose.

6.13 Use of Certified IC's

6.13.1 Any time a Simulator LORT, LOIT, or LOUT Training session, NRC exam, or Certification Testing session is being conducted, a certified IC shall be used for initialization (unless it is a continuation of a previous session).

6.13.2 IC's sent to the NRC for exam purposes shall be certified IC's. The ASOT shall determine whether all certified IC's or a subset of certified IC's shall be sent to the NRC.

Rev.: 0 Date: 1/12/89 Page: 10 of 11

6.13.3 It is the intent of this procedure to demonstrate that IC's used in operator training or certification activities have been reviewed to ensure simulator conditions are representative of reference plant conditions. Attachment 8.2 (for MP2) will be the vehicle for ensuring consistency and adequacy of this review.

6.13.4 This procedure does not have to be re-executed over a four-year interval since continuous updating of IC's is covered by this procedure.

7.0 FIGURES - NONE

8.0 ATTACHMENTS

8.1 Millstone Unit 1 Initial Conditions Verification Checklist

8.2 Millstone Unit 2 Initial Conditions Verification Checklist

8.3 Millstone Unit 3 Initial Conditions Verification Checklist

8.4 Connecticut Yankee Initial Conditions Verification Checklist

Rev.: 0 Date: 1/12/89 Page: 11 of 11 NSEM-4.02

ATTACHMENT 8.2
MILLSTONE 2

[k - INITIAL CONDITIONS VERIFICATION

                                                                                       -CHECKLIST
I. CONTROL BOARD WALKDOWN

With the Simulator in "run", at each of the following control boards, an SRO licensed or certified instructor shall review switch positions, controller settings, meter indications, annunciator conditions, system alignments, etc., to ensure they are consistent with the intended conditions of the Initial Condition.

C-01    C-01R    C-101       C-25A(B)
C-02    C-02R    RC-05E      RC-14
C-03    C-03R    RPS Ch A    ESAS Ch A
C-04    C-04R    RPS Ch B    ESAS Ch B
C-05    C-05R    RPS Ch C    ESAS Ch C
C-06    C-06R    RPS Ch D    ESAS Ch D
C-07    C-07R    C-01X       ESAS ActCAB1
C-08    C-08R    C-80        ESAS ActCAB2
2-SI-652 Wall Switch    RC-100    Hot Shutdown Panel

II. REMOTE FUNCTIONS REVIEW

With the Simulator in "run", for each of the following remote function systems, review each remote function to ensure its condition is consistent with the intended conditions of the Initial Condition.

CCR    ESR    RMR    TCR
CHR    FWR    RPR    TPR
CVR    IAR    RXR    TUR
CWR    MSR    SGR    WDR
EDR    RCR    SIR
EGR    RHR    SWR

Rev.: 0  Date: 1/12/89  Page: 8.2-1 of 4  NSEM-4.02

III. INITIAL CONDITION STABILITY AND REASONABILITY

Perform either section A, B or C.

(A) For Equilibrium Xenon, Steady State Power Levels (30%, 50%, 75%, 100% power, etc.) only, ensure the following parameters are stable and reasonable for the first 2 minutes after resetting to the IC and going to run:

o RCS Tc
o RCS Tave
o RCS Pressure
o S/G Levels
o ΔT Power (RPS)
o NI Power (RPS)
o Calorimetric Power
o Electric Power
o Charging Flow
o Letdown Flow
o Pressurizer Level
o S/G Feed Flows
o S/G Steam Flows
o Xenon Reactivity Worth from Instructor Station is equilibrium worth on PPC CVMWTH *
o CEA Group 7 position is reasonable for power level
o ASI reasonable and consistent with power level and CEA position

* Perform SP 2601D (when appropriate) to ensure consistency.

o Using Xenon Fastime X60, and holding thermal power and CEA position constant, ensure that total xenon reactivity does not change by more than 20 pcm in 4 minutes of Xenon Fastime X60.

o Using Xenon Fastime X60, and holding thermal power and CEA position constant, verify that ASI does not change by more than .03 from its initial value (if >50% power) or by more than .05 from its initial value (if <50% power) during a 4-minute period of Xenon Fastime X60. This can be done in parallel with the previous step.

o Ensure any items mentioned in the IC description on the instructor station are correct and any key items not present on the instructor station IC description are added. Key items in the remarks section of the IC are BOL/MOL/EOL, Xenon Trend, load limit pot setting, unusual CEA positions, unusual equipment lineups, etc.

Rev.: 0  Date: 1/12/89  Page: 8.2-2 of 4  NSEM-4.02
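The two Fastime X60 tolerances above reduce to simple numeric comparisons. The following is an illustrative sketch only, not part of this checklist; the function names are hypothetical, while the 20 pcm and .03/.05 tolerances come directly from the steps above.

```python
# Hypothetical helpers illustrating the Fastime X60 stability tolerances
# of section III(A); not part of the NSEM procedure itself.

def xenon_stable(worth_start_pcm, worth_end_pcm, tolerance_pcm=20.0):
    """True if total xenon reactivity changed by no more than the
    allowed tolerance (20 pcm over 4 minutes of Fastime X60)."""
    return abs(worth_end_pcm - worth_start_pcm) <= tolerance_pcm

def asi_stable(asi_start, asi_end, power_percent):
    """True if ASI drift over the 4-minute run stays within the
    power-dependent tolerance: .03 above 50% power, .05 otherwise."""
    limit = 0.03 if power_percent > 50.0 else 0.05
    return abs(asi_end - asi_start) <= limit
```

For example, a xenon worth drift from -2500 pcm to -2485 pcm (15 pcm) would pass the first check, while a 25 pcm drift would not.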

(B) For IC's which have the reactor critical, but do not fall into category A above, verify the following parameters are stable and/or reasonable for the first 2 minutes after resetting to the IC and going to run:

o RCS Tc
o RCS Tave
o RCS Pressure
o S/G Levels
o ΔT Power
o NI Power
o Calorimetric Power
o Electric Power
o Charging Flow
o Letdown Flow
o Pressurizer Level
o S/G Feed Flows
o S/G Steam Flows
o Xenon reactivity worth from the instructor station on PPC CVMWTH is reasonable
o Group 7 CEA position is reasonable for power level
o ASI is reasonable for power level and CEA position

o Using Xenon Fastime X60 for 4 minutes, and holding thermal power and CEA position constant, observe that the change in Xenon reactivity worth and ASI are consistent with the stated IC's conditions and will not cause unreasonable conditions to occur for the operator.

o Ensure any items mentioned in the IC description on the instructor station are correct and any key items not present on the instructor station IC description are added. Key items in the remarks section of the IC are BOL/MOL/EOL, Xenon Trend, load limit pot setting, unusual CEA position, unusual equipment lineups, etc.

(C) For IC's in which the reactor is not critical and may be in various stages of Plant Startup or Shutdown, verify the following parameters are stable and/or reasonable for the first 2 minutes after resetting to the IC and going to run:

o RCS Tc
o RCS Tave
o RCS Pressure
o S/G Levels
o Wide Range Power
o CPS
o Charging Flow
o Pressurizer Level
o S/G Feed Flows
o Xenon reactivity worth from the instructor station is reasonable
o If heatup or cooldown is in progress, # of running RCP's is reasonable
o If SDC in operation, SDC flow is steady

o Using Xenon Fastime X60 for 4 minutes, observe that the change in Xenon reactivity worth is consistent with the stated conditions of the IC description.

Rev.: 0  Date: 1/12/89  Page: 8.2-3 of 4  NSEM-4.02

o Ensure any items mentioned in the IC description on the instructor station are correct and any key items not present on the instructor station IC description are added. Key items in the remarks section of the IC are BOL/MOL/EOL, Xenon trend, time after reactor trip, unusual CEA positions, unusual equipment lineups, etc.

IV. IC REQUIREMENTS TO BE SPECIFICALLY VERIFIED

Verify the following:

Remote Functions

CCR03 set to 50 F (to fail B HX TCV open)
CVR03 set to open
CVR13 set to closed
CWR01 set to 55 F
FWR55 set to 100 gpm (at-power IC's)
FWR55 set to 0 gpm (shutdown IC's)
IAR17 set to lead
IAR15 set to start
MSR03 set to auto
MSR05 set to auto
MSR07 set to auto
RPR31 set to full
SGR01 set to 51.5
SGR02 set to 38.0
SWR04 set to open
SWR05 set to closed
SWR06 set to closed
SWR07 set to closed
SWR12 set to 0%
SWR24 set to 10% ("B" RBCCW ~1500 gpm)
TPR16 set to close
TPR18 set to TBCCW
WDR06 set to open
WDR02 set to auto

PPC Points

SCBLDN1 consistent with SGR01
SCBLDN2 consistent with SGR02

Rev.: 0  Date: 1/12/89  Page: 8.2-4 of 4  NSEM-4.02
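A review aid could tabulate the required Section IV settings and flag any that differ from the IC being verified. The following is an illustrative sketch only, not part of the checklist; it shows a small subset of the settings above, and the dictionary-based representation is an assumption for illustration.

```python
# Hypothetical sketch of a Section IV settings review; the expected
# values below are a subset of the checklist above.

EXPECTED = {
    "CVR03": "open",
    "CVR13": "closed",
    "MSR03": "auto",
    "SWR05": "closed",
}

def verify_settings(actual):
    """Return {remote function: (expected, actual)} for every remote
    function whose observed setting differs from the checklist value."""
    return {rf: (EXPECTED[rf], actual.get(rf))
            for rf in EXPECTED if actual.get(rf) != EXPECTED[rf]}
```

An empty result means all tabulated settings match the checklist; any entries returned identify the remote functions needing correction before the IC is certified.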

NORTHEAST UTILITIES

NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.03

CERTIFIED REMOTE FUNCTIONS

Responsible Individual: Manager, Operator Training Branch
Approved: Director, Nuclear Training

Revision: 0
Date: May 4, 1988
SCCC Meeting No: 88-006

1.0 PURPOSE

The purpose of this procedure is to ensure that all Certified Remote Functions have been verified to operate correctly and to establish a formal list of Certified Remote Functions.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB) and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 INPO Good Practice TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.6 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.7 NTDD-17, Simulator Certification and Configuration Management Control.

3.8 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.9 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

Rev.: 0  Date: 5/4/88  Page: 1 of 6  NSEM-4.03

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-FIA) used by the Operator Training Branch and the STSB to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Remote Function (REM) - an instructor initiated input to the simulator model which will provide the same discernible effects as the corresponding manual operation in the reference plant.

4.3 Certified Remote Function - those remote functions which will be tested to work correctly and may be used in simulator training and exams.

4.4 System Test - a test developed for each modeled plant system that ensures proper response of all control board instrumentation, controls, annunciators, PPC points, remote functions, flowpaths and components that are associated with an individual plant system.

4.5 Qualified Instructor - an instructor designated by the ASOT to perform a remote function test.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning Simulator Instructors to write, perform, and document simulator remote functions tests.

5.1.2 Responsible for reviewing and approving the Certified Remote Functions List (Figure 7.1).

5.1.3 Responsible for approval and acceptance of the remote functions test.

5.1.4 Responsible for scheduling the accomplishment of all certified remote function tests on a continuous four year basis.

5.2 Simulator Instructors

5.2.1 Responsible for writing and conducting remote function tests.

Rev.: 0  Date: 5/4/88  Page: 2 of 6  NSEM-4.03

5.2.2 Responsible for the establishment of simulator conditions necessary to perform remote function tests.

5.2.3 Responsible for writing Deficiency Reports (DR) and retests for remote functions that do not respond as expected during the remote functions test.

6.0 INSTRUCTIONS

6.1 Establish List of Certified Remote Functions

6.1.1 Obtain the list of remote functions to be certified from NSEM-2.02.

6.1.2 On Figure 7.1, list the alphanumeric code and description of all Certified Remote Functions obtained from Step 6.1.1.

6.2 Determine If All Certified Remote Functions Have Been Tested in the System Tests (NSEM-4.01)

6.2.1 Using a copy of Figure 7.1, ensure all certified remote functions have been tested to operate correctly in the NSEM-4.01 System Tests.

6.2.2 List any certified remote functions (from Figure 7.1) that have not been tested in the System Tests on Figure 7.2.

6.3 Testing Of Certified Remote Functions Listed On Figure 7.2

Note: If no certified remote functions are listed on Figure 7.2, this section (6.3) may be skipped. This means all certified remote functions listed on Figure 7.1 were tested in the NSEM-4.01 System Tests.

6.3.1 A Certified Remote Function Test shall be written to test each remote function listed on Figure 7.2.

6.3.2 The test will be Attachment 8.1 for Millstone 1, Attachment 8.2 for Millstone 2, Attachment 8.3 for Millstone 3 and Attachment 8.4 for Connecticut Yankee.

Rev.: 0  Date: 5/4/88  Page: 3 of 6  NSEM-4.03

6.3.3 The cover sheet for the test shall be Figure 7.3.

6.3.4 The test shall sequentially cover each of the remote functions listed on Figure 7.2.

6.3.5 Each variable remote function (i.e., 0 to 100%, 55°F to 85°F, etc.) shall be checked at a minimum of 2 values and preferably a 3rd value. The minimum 2 values checked shall be at or near the endpoints of the variable remote function's range. For example, a 55°F to 85°F remote function might be checked at 56°F and 84°F. The third value should be somewhere in the mid position of the remote function range.

6.3.6 Two or three position (such as on/off or automatic/standby/off) remote functions shall be checked in all positions.

6.3.7 Acceptance criteria for variable or 2/3 position remote functions shall be based on:

o Utilizing plant data, if available, to determine correct response.

o Utilizing the Simulator Design Data Base, such as P and ID's, electrical schematics, instrument loop drawings, operating procedures, etc., to determine correct response.

o Utilizing (as a last resort) a qualified instructor to determine correct response based on his/her experience.

Valve testing shall include valve stroking time information, if available. An example of reasonable response (lacking plant data) for a valve might be (1) valve closed - no flow, (2) valve opens, flow increases, (3) valve wide open - flow value is acceptable, if flow is known, (4) stroke time of valve (full open to full closed to full open or vice versa) is acceptable, if the stroke time is known.

Rev.: 0  Date: 5/4/88  Page: 4 of 6  NSEM-4.03

6.3.8 Each step or substep of the remote function test shall have a space for the performer to initial successful completion of the step or substep. A space shall also be available for a DR number if the step or substep was unsuccessful. The location of these spaces for successful completion initials or DR numbers shall be in the body of the remote function test, on the right hand side, next to the respective step or substep number.

6.3.9 The Remote Function Test shall be released for testing when the ASOT signs the Figure 7.3 cover sheet.

6.3.10 The assigned simulator instructor shall perform the Remote Function Test by establishing the required simulator conditions, performing each step of the remote function test procedure and either initialing off the successful performance of a step or writing a DR for a step or substep that was unsuccessful.

6.3.11 If a DR is written, the DR shall reference the test and step number that failed. The DR number shall be written on the Remote Function Test in the column for DR numbers, next to the appropriate step.

6.3.12 Any supporting information such as PPC printouts, trend typer outputs or other documentation shall be attached to the Remote Function Test. Any attached documents shall be referenced in the "Remarks" section of the Figure 7.3 cover sheet.

6.3.13 The ASOT shall review and approve the completed Remote Function Test.

6.3.14 Revision level and dates for the Remote Function Test shall be assigned by the ASOT, when he signs the Figure 7.3 cover sheet to release the test for performance.

6.3.15 The approved Remote Function Test procedure will be repeated over a four year interval.

Rev.: 0  Date: 5/4/88  Page: 5 of 6  NSEM-4.03

7.0 FIGURES

7.1 Certified Remote Functions List

7.2 Certified Remote Functions Needing Testing By Procedure NSEM-4.03

7.3 Remote Function Test Cover Sheet

8.0 ATTACHMENTS

8.1 Millstone 1 Remote Function Test Procedure

8.2 Millstone 2 Remote Function Test Procedure

8.3 Millstone 3 Remote Function Test Procedure

8.4 Connecticut Yankee Remote Function Test Procedure

Rev.: 0  Date: 5/4/88  Page: 6 of 6  NSEM-4.03

Figure 7.1

CERTIFIED REMOTE FUNCTIONS LIST

UNIT ________    PAGE ___ OF ___

ASOT Signature/Date ________________

Rev.: 0  Date: 5/4/88  Page: 7.1-1 of 1  NSEM-4.03

Figure 7.2

CERTIFIED REMOTE FUNCTIONS NEEDING TESTING BY PROCEDURE NSEM-4.03

UNIT ________    PAGE ___ OF ___

Rev.: 0  Date: 5/4/88  Page: 7.2-1 of 1  NSEM-4.03
Figure 7.3

REMOTE FUNCTION TEST COVER SHEET

ATTACHMENT ____    REV. ____    UNIT ____

Developed By ________________  Date ________

Released for Testing by ASOT ________________  Date ________

Remarks:

Accepted By ________________  Date ________
Assistant Supervisor Operator Training

Rev.: 0  Date: 5/4/88  Page: 7.3-1 of 1  NSEM-4.03

NORTHEAST UTILITIES

NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.04

MAJOR MALFUNCTION TESTING

Responsible Individual: ____________________
Approved: Director, Nuclear Training

Revision: 0
Date: 6/29/88
SCCC Meeting No: 88-008

1.0 PURPOSE

This procedure defines the requirements for developing unit specific procedures to test major malfunctions. The procedure also describes techniques for evaluating simulator test response data and includes general requirements for acceptance criteria.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.7 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

Rev.: 0  Date: 6/29/88  Page: 1 of 15  NSEM-4.04

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - form (STS-BI-FIA) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Major Malfunction - those malfunctions which produce extensive integrated effects in a number of plant systems and require complicated analysis to verify acceptable response.

4.3 Best Estimate Analysis - analytical technique used to evaluate the acceptability of simulator response to a given malfunction in the absence of available reference data for comparison. Experience and/or rough engineering calculations and mass energy balances will be used by instructors to perform best estimate analysis.

4.4 Critical Parameters - those parameters, specific to a given major malfunction, which are driven directly by the initiating event, required for diagnosis, or required to verify proper plant response to safety equipment actuations and/or operators' corrective actions.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for approving the list of major malfunctions to be tested in accordance with this procedure.

5.1.2 Responsible for assigning instructors to develop major malfunction test procedures.

5.1.3 Responsible for assigning instructors to perform major malfunction testing.

5.1.4 Responsible for approving the results of major malfunction testing.

5.2 Simulator Technical Support Branch (STSB)

5.2.1 Responsible for providing simulator response data plots, as required, to support major malfunction testing evaluation.

Rev.: 0  Date: 6/29/88  Page: 2 of 15  NSEM-4.04

5.3 Operator Instructors

5.3.1 Responsible, as assigned, for performing functions in accordance with this procedure.

5.3.2 Responsible for writing DR's as required.

6.0 INSTRUCTIONS

6.1 Selecting Malfunctions To Be Tested In Accordance With This Procedure

6.1.1 Assigned instructors shall review the simulator's malfunction list to identify those malfunctions which produce extensive integrated effects in a number of plant systems requiring complicated analysis to verify acceptable response.

6.1.2 Those malfunctions identified shall be listed on NSEM-4.04 Form 7.1.

6.1.3 For malfunctions with variable severity levels, a severity shall be specified for the test. This will normally be 100% severity. A lesser severity may also be indicated based on frequent usage during training or availability of data for comparison.

6.1.4 Composite malfunctions from NSEM-2.02 Form 7.7 shall also be listed on NSEM-4.04 Form 7.1.

6.1.5 Submit the completed NSEM-4.04 Form 7.1 to the ASOT for review and approval.

6.1.6 Initiate a Major Malfunction Test Procedure Worksheet, NSEM-4.04 Form 7.2, for each malfunction and severity listed on the approved NSEM-4.04 Form 7.1.

6.2 Selecting Reference Data For Evaluation of Major Malfunctions

6.2.1 As available, reference data should be selected for similar malfunctions (type and severity), preferentially in the order listed from the following sources:

6.2.1.1 Reference plant data.

6.2.1.2 Analytical data for the reference plant using realistic vice conservative assumptions.

Rev.: 0  Date: 6/29/88  Page: 3 of 15  NSEM-4.04

6.2.1.3 Reference plant FSAR data.

6.2.1.4 Generic NSSS vendor data for similar plant size and type.

6.2.1.5 Industry event reports for similar plants (LER's, etc.).

Note: Best estimate analysis may be required, for some malfunctions, to adequately analyze simulator response test data when the reference data source covers only a limited number of parameters. This determination will be made in 6.4.

6.2.2 List the selected data source on the respective NSEM-4.04 Form 7.2.

6.2.3 For any malfunctions where reference data is not available, specify "Best Estimate Analysis." The response of these malfunctions will be analyzed using best estimate analysis techniques described in Section 6.7.

6.3 Selecting Critical Parameters For Malfunction Response Evaluation

6.3.1 Assigned instructors shall review each malfunction to determine the critical parameters. The following criteria should be considered:

6.3.1.1 Parameters driven directly by the initiating event.

6.3.1.2 Parameters required for diagnosis.

6.3.1.3 Parameters required to verify proper plant response to safety equipment actuations and/or operators' corrective actions.

6.3.2 Specify the selected critical parameters on NSEM-4.04 Form 7.2 for the respective malfunction.

Rev.: 0  Date: 6/29/88  Page: 4 of 15  NSEM-4.04

6.4 Determining Acceptance Criteria For The Critical Parameters

6.4.1 Assigned instructors shall review the reference data for each malfunction to determine areas where acceptance criteria may be specified. Consider the following:

6.4.1.1 Maximum/minimum values.

6.4.1.2 Turning points (trend reversals).

6.4.1.3 Time to reach the above.

6.4.2 Check the assumptions specified for reference data to determine if values for acceptance criteria should be modified to reflect realistic vice conservative expectations.

Note: Major malfunctions will be tested with normal system response versus the time delays, incomplete actuations, or conservative actuation setpoints assumed for most accident analyses.

6.4.3 Specify selected acceptance criteria for each parameter on NSEM-4.04 Form 7.2. Acceptance criteria are to be stated as approximations versus exact values.

Example: "RCS pressure decreases to approximately 80 psia within 15 to 20 seconds then slowly decreases to approximately containment pressure."

6.4.4 When the reference data source does not provide data for a selected critical parameter, it will be necessary to evaluate the simulator test response data using best estimate analysis per Section 6.7. This shall be indicated by an asterisk to the left of the respective parameter on NSEM-4.04 Form 7.2.

6.4.5 For malfunctions without available reference data, acceptance criteria will be developed using best estimate analysis of simulator response data for that malfunction, per Section 6.7.

Rev.: 0  Date: 6/29/88  Page: 5 of 15  NSEM-4.04
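Criteria of this form (an approximate target value reached within a time window) can be checked mechanically against a recorded trace. The following is a minimal illustrative sketch, not part of the procedure; the tolerance value and the sample trace in the usage note are invented for the example.

```python
# Hypothetical sketch: testing a recorded critical-parameter trace
# against an approximate acceptance criterion of the kind in 6.4.3,
# e.g. "decreases to approximately 80 psia within 15 to 20 seconds".

def meets_criterion(trace, target, tolerance, t_min, t_max):
    """trace: list of (time_s, value) samples in time order. True if
    the value first comes within +/-tolerance of target at a time
    inside the [t_min, t_max] window."""
    for t, v in trace:
        if abs(v - target) <= tolerance:
            return t_min <= t <= t_max
    return False
```

For a hypothetical depressurization trace sampled at 0, 5, 10, 17 and 30 seconds, the check passes only if the first sample near the target pressure falls inside the stated 15-20 second window.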

6.5 Selecting A Test Duration

6.5.1 Assigned instructors should review each malfunction to determine an appropriate duration for the response test. The selected duration should encompass the following:

6.5.1.1 Sufficient to achieve the specified acceptance criteria for all critical parameters.

6.5.1.2 Sufficient to allow for operator diagnosis, safety systems actuations, initial operator response, and verification of plant response to corrective actions.

6.5.2 A test run of the malfunction may be used to help determine an appropriate duration.

6.5.3 Specify the selected duration on NSEM-4.04 Form 7.2.

6.6 Selecting Data Recording Methods And Appropriate Recording Intervals

6.6.1 Review the data recording options to select the best method(s) for each malfunction test:

. Automated data recording program
. Typer trend from PPC
. Gould strip recorders
. Hardcopy of "Monitored Parameter Plot Screens"
. Manual recording (if necessary)

6.6.2 Consideration should be given to the following:

6.6.2.1 Availability of required parameters; not all parameters are available to all record programs, especially PPC calculated parameters.

6.6.2.2 Available recording intervals.

6.6.2.3 Scale divisions and accuracy required to satisfy acceptance criteria.

6.6.3 Specify the selected data recording method(s) on NSEM-4.04 Form 7.2.

Rev.: 0  Date: 6/29/88  Page: 6 of 15  NSEM-4.04



6.6.4 A recording interval shall be selected when the automated data recording program, typer trend, or manual recording is selected. The following aspects should be considered when selecting a recording interval:

6.6.4.1 Recording intervals should be sufficiently frequent to establish trending for the parameter with the most rapid rate of response.

6.6.4.2 Recording intervals should be sufficiently frequent to plot parameter oscillations if that aspect is important to malfunction diagnosis or verification of response to corrective actions.

6.6.5 Specify the selected recording interval for the record program and/or trend typer on NSEM-4.04 Form 7.2.

6.7 Developing Acceptance Criteria Using Best Estimate Analysis

6.7.1 Malfunctions/parameters without available reference data, identified in 6.2.3/6.4.4, shall be test run to obtain simulator response data.
                 .                                                          data.

6.7.2- Data for critical parameter response shall be collected using the recording method (s)'and interval (s)' selected per Section 6.6. 6.7.3 Assigned instructors shall review the response data to determine if the simulator response is acceptable. The following criteria must be met: , { 6.7.3.1 The observable response of any ] critical parameter shall not violate < any physical laws of nature. ] 6.7.3.2 Alarms and/or automatic actuations which should occur, do occur. j l 6.7.3.3 Alarms and/or automatic actuations ' which should not occur, do not occur. I 1 I ( Rev.: 0 Date: 6/29/88 Page: 7 of 15 NSEM-4.04

+ ' G' ? 6.7.3.4 Parameters which are closely coupled fx - due to physical relationships do not exhibit independent, incorrect response. l 6.7.3.5 The rate and magnitude of the I response-does not adversely affect I diagnosis or grossly misrepresent the 1 severity of the malfunction. f 6.7.4 If the simulator response data for all of the

    +                                                  critical parameters satisfies all the criteria of 6.7.3, it may be used to determine acceptance criteria in accordance with Section 6.4.

6.8 Writing Unit Specific Major Malfunction Test Procedures 6.8.1 The major malfunction test procedures for each unit shall be identified as the respective attachment to NSEM-4.04. 6.8.2 The procedure shall be in columnar format, similar to the ATP format. The following column headings shall be used: () 6.8.2.1 6.8.2.2 STEP # - for' procedural step number PROCEDURE / RESULT - for specifying procedural actions or required response. 6.8.2.3 PANEL - for specifying the panel location where required response is to be verified. 6.8.2.4 TAG # - for specifying the identifying number of the instrument, , controller, PPC point, etc. to be verified. 6.8.2.5 ACCEPT /DR - for the test instructor to initial if simulator response is acceptable, or to enter a DR# if required. 6.8.3 Each major malfunction to be tested shall be a section of the procedure. () Rev.: 0 Date: 6/29/88 Page: 8 of 15 NSEM-4.04 i l

     6.8.4  Each section shall have a heading specifying the following:

            6.8.4.1  The alpha-numeric designator for the malfunction (each designator for composite malfunctions).

            6.8.4.2  A brief description of the malfunction.

            6.8.4.3  A listing of the equipment/locations available for selection for generic malfunctions. Example: RC02A(B) - #1(2) RCS Hot Leg Break, RC11A(B,C,D) - RCP A(B,C,D) Locked Rotor.

            6.8.4.4  Available range for variable malfunctions, including an approximation of the effect at 100% severity NOP/NOT. Example: Range 100%, 100% equals 200 gpm at 2250 psia RCS pressure.

     6.8.5  The heading shall be followed by a list of the critical parameters to be recorded for the test. The recording method, duration, and interval (if appropriate) shall be included. This information is obtained from worksheet NSEM-4.04 Form 7.2.

     6.8.6  Next, list any PPC points to be verified during the test.

     6.8.7  Specify simulator initialization requirements, including any alignment changes required.

     6.8.8  Specify requirements for entering the malfunction(s). The severity specified on NSEM-4.04 Form 7.2 shall be used. Where several severities are listed, the maximum (normally 100%) should be tested first.

     6.8.9  Following the malfunction insertion, the procedure shall list results to be verified and subsequent actions to be performed. As much as possible, each item should be listed in the order it is to be performed.


     Note: Major malfunction test procedures shall be written to provide for reproducibility of effects. Where possible, subsequent actions should be initiated by timed actuations or Boolean triggers versus manual action. A time mark or parameter value shall be specified for any required manual actions.

     6.8.10  Simulator response to be verified should specify the following:

             6.8.10.1  Acceptance criteria from NSEM-4.04 Form 7.2 for the critical parameters.

             6.8.10.2  Response requirements for equipment actuations, e.g.: start, stop, shift.

             6.8.10.3  Response requirements for key annunciators and automatic systems actuations.

             6.8.10.4  General response requirements for parameters related to, but not considered critical to, the malfunction, e.g.: increase, decrease.

     6.8.11  The procedure should specify an endpoint for the test, either in terms of duration or endpoint conditions.

     6.8.12  For malfunctions with more than one severity specified on NSEM-4.04 Form 7.2, subsequent testing shall be performed to verify response. Much of the information from the initial test may be referenced, but separate statements of acceptance criteria for the critical parameters are required.

     6.8.13  For variable malfunctions, the test procedure shall specify performance at two intermediate severity levels, e.g.: 50% and 10%.

     6.8.14  The procedure for intermediate severity level tests may reference the initial test information heavily. Response requirements may take the form of: "approximately 1/2 (1/10) the rate of", "to a lesser extent", etc.

     6.8.15  Generic malfunctions which produce similar, symmetrical response shall be tested and recorded for one component or location. The remaining components or locations shall be tested to verify similar effects. The procedure shall specify any observable differences. Data recording and comparison is not required.

     6.8.16  Generic malfunctions which produce different or non-symmetrical response will require testing of a spectrum of components (not necessarily all) to verify acceptable response. Control rod malfunctions are an example of this type of generic malfunction.

     6.8.17  The response of certain malfunctions may vary significantly with time in core life (reactivity feedbacks). The malfunction shall be tested and recorded at a time in core life which corresponds to that used for the reference data. The malfunction shall be tested at the other extreme to verify relative differences in the reactivity feedbacks (lesser, greater). Data recording and comparison is not required.

     6.8.18  The response of certain malfunctions may vary significantly based on whether or not procedurally required operator actions are performed. The malfunction shall be tested and recorded to best replicate the assumptions used in the reference data. The other alternative shall be tested to verify response differences. Data recording and comparison is not required. Example: RCS pressure (RCPs tripped vs. not tripped).

6.9  Performing Major Malfunction Testing

     6.9.1  Recorded response data is to be in accordance with procedure requirements; method, duration, and interval.

     6.9.2  Observe simulator response to ensure that key annunciators and automatic systems actuations respond as required.

     6.9.3  Observe simulator response to ensure that annunciators and automatic systems actuations which should not occur, do not occur.

     6.9.4  Indicate the acceptability of simulator response by initialing in the ACCEPT/DR column for each item where a "RESULT" is specified.

     6.9.5  Indicate non-acceptance by entering the number of the DR in the ACCEPT/DR column.

     6.9.6  Obtain hard copies of recorded data for the critical parameters. Unless specified otherwise in the test procedure, recorded data is only required for the maximum severity test.

6.10  Evaluating Simulator Response Data For Critical Parameters And Resolving Discrepancies

      6.10.1  Simulator response data should be compared to acceptance criteria specified in the test procedure. Response which meets all stated criteria may be signed off as acceptable.

      6.10.2  Where significant differences exist, the reference data should be reviewed again to determine if any of the following conditions exist:

              6.10.2.1  Previously unidentified conservatism; time delays, incomplete equipment actuations, conservative actuation setpoints, differences in initial conditions.

              6.10.2.2  Incorrect interpretation of scales; psia vs. psig, log vs. linear.

      6.10.3  If any of the above are discovered, reevaluate specified acceptance criteria and adjust accordingly. Then, reevaluate simulator response data.

      6.10.4  If significant differences still exist, or if no problems are found with the reference data, the simulator response data should be subjected to best estimate analysis by two or more assigned instructors. All the criteria of 6.7.3 must be met.
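Since acceptance criteria are stated as approximations, the comparison in 6.10.1 is a tolerance check rather than an exact match. A hypothetical sketch, using the style of criterion found elsewhere in this manual ("decreases to approximately 80 psia within 15 to 20 seconds"); the +/-10% band is an illustrative assumption, not a value from this procedure:

```python
# Tolerance check against an approximate acceptance criterion, e.g.
# "decreases to approximately 80 psia within 15 to 20 seconds".
# The +/-10% band is an illustrative assumption, not from the procedure.

def meets_criterion(trace, target, t_lo, t_hi, tol=0.10):
    """trace: list of (time_s, value). True if some sample inside the
    time window falls within target * (1 +/- tol)."""
    lo, hi = target * (1 - tol), target * (1 + tol)
    return any(t_lo <= t <= t_hi and lo <= v <= hi for t, v in trace)

trace = [(0, 2250.0), (10, 400.0), (17, 82.0), (30, 45.0)]
print(meets_criterion(trace, 80.0, 15, 20))  # → True
```

A response outside the band would then go to best estimate analysis per 6.10.4, rather than being rejected outright.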

      6.10.5  For simulator response deemed acceptable per 6.10.4, the following additional steps are required:

              6.10.5.1  The results of the best estimate analysis must be reviewed and approved by the ASOT.

              6.10.5.2  The acceptance criteria on the NSEM-4.04 Form 7.2 must be modified to encompass the simulator response.

              6.10.5.3  The NSEM-4.04 Form 7.2 must be annotated to indicate that acceptance criteria for a specific parameter(s) has been modified per best estimate analysis.

              6.10.5.4  The test procedure must be revised to reflect the modified acceptance criteria.

      6.10.6  Submit a DR for simulator response which is unacceptable. Successful completion of the respective major malfunction test shall be specified as a requirement for closeout.

6.11  Reviewing/Revising The Cause And Effect Document (C&E)

      6.11.1  Review the malfunction response description to be in accordance with the test results. The acceptance criteria should be specified as response for the critical parameters.

      6.11.2  Revise the C&E document as required to be consistent with simulator response.

      6.11.3  Review the C&E document for typos, proper content and format per NSEM-4.01 Section 6.3, and any inaccuracies; revise as necessary.

6.12  Reporting Results of Major Malfunction Testing

      6.12.1  Major malfunction test results shall be reported for initial performance testing and anytime that retesting is required due to simulator design changes resulting in significant simulator configuration or performance variations.

              The following items shall be included in the report:

              6.12.1.1  Signed off originals of applicable major malfunction test procedures.

              6.12.1.2  Copies of reference data used to specify acceptance criteria for the critical parameters.

              6.12.1.3  Hard copies of the recorded simulator response data for the critical parameters.

      6.12.2  Completed, signed sections of applicable major malfunction test procedures, with hard copy data for the critical parameters attached, are sufficient to satisfy reporting requirements for on-going simulator testing (4 year cycle).

6.13  Simulator Design Changes

      6.13.1  Identifying the need to add new major malfunctions or to perform non-scheduled major malfunction testing due to simulator design changes will be addressed in NSEM-5.02.

6.14  Disposition of Forms Generated

      6.14.1  Forward completed originals of the following to the ASOT for review and approval:

              6.14.1.1  NSEM-4.04 Form 7.1, Major Malfunction List.

              6.14.1.2  NSEM-4.04 Form 7.2, Major Malfunction Test Procedure Worksheet, with copies of the reference data used to develop acceptance criteria for the critical parameters as attachments.

      6.14.2  The ASOT will forward approved originals of NSEM-4.04 Form 7.1 and NSEM-4.04 Form 7.2 with attachments to controlled document storage for retention with simulator certification records.

      6.14.3  The ASOT, or a designee, shall maintain copies of the above, for ready retrieval, should simulator design changes require revisions.

7.0  FORMS

     7.1  Major Malfunction List

     7.2  Major Malfunction Test Procedure Worksheet

8.0  ATTACHMENTS

     8.1  Major Malfunction Test Procedure - MP1

     8.2  Major Malfunction Test Procedure - MP2

     8.3  Major Malfunction Test Procedure - MP3

     8.4  Major Malfunction Test Procedure - CY

FORM 7.1
MAJOR MALFUNCTION LIST

PLANT: ______________________

MAJOR MALFUNCTIONS FROM MALF LIST REVIEW            SEVERITY(IES)

COMPOSITE MALFUNCTIONS FROM NSEM-2.02 FORM 7.7      SEVERITY(IES)

Approved: ______________________    Date: __________
          ASOT

FORM 7.2
MAJOR MALFUNCTION TEST PROCEDURE WORKSHEET

PLANT: __________    MALF #: __________    SEVERITY: __________    TEST DURATION: __________

MALF DESCRIPTION: ______________________

REFERENCE DATA SOURCE: ______________________

Critical Parameters    Acceptance Criteria    Record Method    Interval

* Acceptance criteria for this critical parameter was determined/modified based on best estimate analysis of simulator response data.

Approved: ______________________    Date: __________
          ASOT

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-4.05

MALFUNCTION TESTING

Responsible
Individual:  ______________________
             Manager, Operator Training Branch

Approved:    ______________________
             Director, Nuclear Training

Revision: 1
Date: May 11, 1989
SCCC Meeting No: 89-00

1.0  PURPOSE

     This procedure defines the requirements for developing unit specific procedures to test all malfunctions except those malfunctions already covered by NSEM-4.04, Major Malfunction Testing.

2.0  APPLICABILITY

     This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), the Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

     3.1  ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

     3.2  NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS 3.5-1985 with some additional requirements.

     3.3  10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

     3.4  NUREG-1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

     3.5  INPO 86-026, Guideline For Simulator Training, October 1986.

4.0  DEFINITIONS

     4.1  Deficiency Report (DR) - Form (STS-5I-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified simulator deficiencies between the simulator and reference plant.

     4.2  Malfunction - A specific equipment failure which produces discernible indications in the Control Room that replicate the same equipment failure should it occur in the reference plant. Specific preprogrammed malfunctions are available at the simulator instructor station.

Rev.: 1    Date: 5/11/89    Page: 1 of 13    NSEM-4.05

     4.3  Major Malfunction - Those malfunctions which produce extensive integrated effects in a number of plant systems and which require complicated analysis to verify acceptable response.

     4.4  Best Estimate Analysis - Analytical technique used to evaluate the acceptability of simulator response to a given malfunction in the absence of available reference data for comparison. Experience and/or rough engineering calculations and mass/energy balances will be used by instructors to perform best estimate analysis. Instructors who will perform best estimate analysis will be designated by the ASOT and shall be SRO licensed or certified.

     4.5  Critical Parameters - Those parameters, specific to a given malfunction, which are driven directly by the initiating event, required for diagnosis, or required to verify proper plant response to safety equipment actuations and/or operators' corrective actions.

5.0  RESPONSIBILITIES

     5.1  Assistant Supervisor Operator Training (ASOT)

          5.1.1  Responsible for assigning instructors to develop malfunction test procedures.

          5.1.2  Responsible for assigning instructors to perform malfunction testing.

          5.1.3  Responsible for approving the results of malfunction testing.

     5.2  Operator Instructors

          5.2.1  Responsible to write and perform malfunction test procedures, as assigned by the ASOT.

          5.2.2  Responsible for writing DRs, as required.

          5.2.3  Responsible for performing best estimate analysis on malfunction results, as assigned by the ASOT, where results cannot be reconciled with acceptance criteria.

6.0  INSTRUCTIONS

     6.1  Selecting Malfunctions To Be Tested In Accordance With This Procedure

          6.1.1  All malfunctions that are available from the instructor station shall be tested except:

                 6.1.1.1  Those malfunctions already tested per NSEM-4.04, "Major Malfunction Testing", do not need to be tested by this procedure.

                 6.1.1.2  Those malfunctions designated as not to be certified by NSEM-2.02 Form 7.6 do not need to be tested by this procedure. These malfunctions shall not be used in any training sessions.

          6.1.2  For malfunctions with variable severity levels, a severity shall be specified for the test. This will normally be 100% severity. A lesser severity may also be indicated based on frequent usage during training or availability of data for comparison.

          6.1.3  If a malfunction is not tested at its highest severity level (100%), then it cannot be used for training at any higher severity level than it was tested at. For variable severity malfunctions, per step 6.8.13, at least one and preferably two intermediate severities (i.e., 50%, 10%) should be tested to verify reasonability.

                 Example: A typical variable severity malfunction will be tested at 100%, 50%, and 10%. The detailed test will be for 100% severity, and the two intermediate severities will be checked for reasonability of response.

          Note: Certain variable malfunctions have the most severe consequences at severities other than 100%. The testing concept for these malfunctions is still to ensure that testing at the highest-consequence severity is performed, as well as at other intermediate severities. For example, a valve malfunction might result in the valve failed shut at 0% severity, and this could be the most severe consequence for that malfunction. In this case 0% severity shall be tested, and two other severity levels, such as 50% and 100% severity, would be tested to determine intermediate responses of the malfunction.

     6.2  Data Source for Acceptance Criteria

          6.2.1  The malfunctions to be tested by this procedure will typically use best estimate analysis. Those malfunctions with complex system interactions, and complex reactivity and thermohydraulic effects, which lend themselves better to computer code analysis, have already been covered by NSEM-4.04, Major Malfunction Testing. If, however, malfunctions covered by this procedure have any of the following data sources available, they shall be used to replace or supplement best estimate analysis in the following order:

                 6.2.1.1  Reference plant actual data.

                 6.2.1.2  Analytical data for the reference plant using realistic assumptions.

                 6.2.1.3  Generic analytical data for a similar size/type of plant using realistic assumptions.

                 6.2.1.4  Industry event reports for similar plants with sufficient information to analyze the event (LERs, etc.).

          6.2.2  If the data source for malfunction testing is other than best estimate analysis, the malfunction test procedure shall reference the data source.

     6.3  Selecting Critical Parameters For Malfunction Response Evaluation

          6.3.1  Assigned instructors shall review each malfunction to determine the critical parameters. The following criteria shall be considered in selecting critical parameters:

                 6.3.1.1  Parameters driven directly by the initiating event.

                 6.3.1.2  Parameters required for diagnosis.

                 6.3.1.3  Parameters required to verify proper plant response to safety equipment actuations and/or operators' corrective actions.


          6.3.2  Critical parameters determined from 6.3.1 shall be tested in the malfunction test.

                 A list of selected critical parameters need not be documented.

          6.3.3  In addition to the critical parameters described above, assigned simulator instructors shall determine any other parameters which, in their judgement, need verification of proper response. Parameters in this category shall be those which could at some point in the transient affect the operator's response or cause negative training if not correct. These parameters will be termed the non-critical parameters. A list of selected non-critical parameters need not be documented.

     6.4  Selecting a Test Duration

          6.4.1  Assigned instructors shall review each malfunction to determine an appropriate duration for the test. The duration chosen shall allow sufficient time for operator diagnosis, safety systems actuations, initial operator response, and verification of plant response to corrective actions.

          6.4.2  A test run of the malfunction may be used to help determine an appropriate duration.

     6.5  Data Recording

          6.5.1  No automated data recording is required for malfunctions tested in accordance with this procedure. However, at the discretion of assigned instructors, recording may be helpful in reviewing the malfunction response.

     6.6  Acceptance Criteria Using Reference Data

          6.6.1  Assigned instructors shall review reference data for a malfunction to determine areas where acceptance criteria may be specified. Each critical parameter shall have acceptance criteria. Consider the following for a selected critical parameter's acceptance criteria with reference data:

                 6.6.1.1  Maximum/minimum values


                 6.6.1.2  Turning points or trend reversals

                 6.6.1.3  Time to reach the above

          6.6.2  Check the assumptions specified for reference data to determine if values for acceptance criteria should be modified to reflect realistic vice conservative assumptions.

          Note: Malfunctions will be tested with normal system response versus the time delays, incomplete actuations, or conservative actuation setpoints assumed for many analyses.

          6.6.3  Acceptance criteria shall be stated as approximations versus exact values. Example: RCS pressure decreases to approximately 80 psia within 15 to 20 seconds, then slowly decreases to approximately containment pressure.

          6.6.4  For malfunctions which have no reference data, or when available reference data does not cover all required parameters, best estimate analysis of simulator data will be used per Section 6.7.

     6.7  Acceptance Criteria Using Best Estimate Analysis

          Note: It is expected that most malfunctions covered by this procedure will use best estimate analysis for acceptance criteria.

          6.7.1  Malfunctions/parameters without available reference data shall be test run to obtain/observe simulator response.

          6.7.2  Assigned instructors shall review the simulator response and verify the following criteria are satisfied:

                 6.7.2.1  The observable response of any critical parameters shall not violate any physical laws of nature.

                 6.7.2.2  Alarms and/or automatic actuations which should occur, do occur.

                 6.7.2.3  Alarms and/or automatic actuations which should not occur, do not occur.


                 6.7.2.4  Parameters which are closely coupled due to physical relationships do not exhibit independent, incorrect response.

                 6.7.2.5  The rate and magnitude of the response does not adversely affect diagnosis or grossly misrepresent the severity of the malfunction.

          6.7.3  If the simulator response for critical parameters satisfies the above 6.7.2 criteria, acceptance criteria may be based on the simulator response.

          6.7.4  Each critical parameter shall have acceptance criteria. Acceptance criteria for critical parameters shall consider:

                 6.7.4.1  Maximum/minimum values

                 6.7.4.2  Turning points or trend reversals

                 6.7.4.3  Time to reach the above

          6.7.5  Acceptance criteria are to be stated as approximations versus exact values. Example: "RCS pressure decreases to approximately 80 psia within 15 to 20 seconds, then slowly decreases to approximately containment pressure."

     6.8  Writing Unit Specific Malfunction Test Procedures

          6.8.1  The Millstone Unit 1 Malfunction Test Procedure shall be Attachment 8.1. The Millstone Unit 2 Malfunction Test Procedure shall be Attachment 8.2. The Millstone Unit 3 Malfunction Test Procedure shall be Attachment 8.3. The Connecticut Yankee Malfunction Test Procedure shall be Attachment 8.4.

          6.8.2  The procedure shall be in columnar format, similar to the Acceptance Test Procedure (ATP) format. The following column headings shall be used:

                 6.8.2.1  STEP # - for procedural step number.

                 6.8.2.2  PROCEDURE/RESULT - for specifying procedural actions or required response.

                 6.8.2.3  PANEL - for specifying the panel location where required response is to be verified.

                 6.8.2.4  TAG # - for specifying the identifying number of the instrument, controller, PPC point, etc. to be verified.

                 6.8.2.5  ACCEPT/DR - for the test instructor to initial if simulator response is acceptable, or to enter a DR# if required.

          6.8.3  Each malfunction to be tested shall be a section of the procedure.

          6.8.4  Each section shall have a heading specifying the following:

                 6.8.4.1  The alpha-numeric designator for the malfunction.

                 6.8.4.2  A brief description of the malfunction.

                 6.8.4.3  A listing of the equipment/locations available for selection for generic malfunctions. Example: RC02A(B) - #1(2) RCS Hot Leg Break, RC11A(B,C,D) - RCP A(B,C,D) Locked Rotor.

                 6.8.4.4  Available range for variable malfunctions, including an approximation of the effect at 100% severity NOP/NOT. Example: Range 100%, 100% equals 200 gpm at 2250 psia RCS pressure.

                 6.8.4.5  Document on the test procedure the date of performance of this malfunction test (month/day/year) and the instructors performing the test.

          6.8.5  Specify simulator initialization requirements, including any alignment changes required.


          6.8.6  Specify requirements for entering the malfunction(s). Where several severities are required, the maximum (normally 100%) should be tested first.

          6.8.7  Following the malfunction insertion, the procedure shall list results to be verified and subsequent actions to be performed. As much as possible, each item should be listed in the order it is to be performed.

          Note: Malfunction test procedures shall be written to provide for reproducibility of effects. Where possible, subsequent actions should be initiated by timed actuations or Boolean triggers versus manual action. A time mark or parameter value shall be specified for any required manual actions.

          6.8.8  Simulator response to be verified should specify the following:

                 6.8.8.1  Acceptance criteria per either Section 6.6 or 6.7 for the critical parameters.

                 6.8.8.2  Response requirements for equipment actuations, e.g.: start, stop, shift.

                 6.8.8.3  Response requirements for key annunciators and automatic systems actuations.

                 6.8.8.4  General response requirements for parameters (non-critical parameters) related to, but not considered critical to, the malfunction, e.g.: increase, decrease.

          6.8.9  The procedure shall verify what, if any, operator actions can be performed or attempted to stop the malfunction or mitigate its effects.

          6.8.10  Malfunction removal shall also be tested to verify that the malfunction can be removed, or to demonstrate it is unrecoverable, as appropriate.

          6.8.11  The procedure should specify an endpoint for the test, either in terms of duration or endpoint conditions.

          6.8.12  The procedure next shall test any additional severities needed due to frequent usage during training or availability of data for comparison. This shall be at the instructor's discretion.

          6.8.13  For variable malfunctions, the test procedure should specify performance at at least one and preferably two intermediate severity levels, e.g.: 50% and 10%.

          6.8.14  The procedure for intermediate severity level tests may reference the initial test information heavily. Response requirements may take the form of: "approximately 1/2 (1/10) the rate of", "to a lesser extent", etc.

          6.8.15  Generic malfunctions which produce similar, symmetrical response shall be tested and recorded for one component or location. The remaining components or locations shall be tested to verify similar effects. The procedure shall specify any observable differences.

          6.8.16  Generic malfunctions which produce different or non-symmetrical response will require testing of a spectrum of components (not necessarily all) to verify acceptable response. Incore detector malfunctions are an example of this type of generic malfunction.

          6.8.17  The response of certain malfunctions may vary significantly with time in core life due to reactivity feedbacks. For these malfunctions, the test procedure shall specify that the test on the malfunctions be performed at extremes of time in core life (BOL and EOL) to verify relative differences in the reactivity feedbacks.

          6.8.18  The response of certain malfunctions may vary significantly based on whether or not procedurally required operator actions are performed. Test procedures shall be written with the minimum of operator action that can be assumed, to maximize test repeatability. When operator actions are done, they shall be specified in some


repeatable timed method to maximize test reproducibility. Some key operator actions which could drive the event to have substantially different results may need testing with and without operator response to demonstrate the differences. An example of this might be the response of RCS pressure and temperatures due to an operator tripping or not tripping RCPs.

     6.9  Performing Malfunction Testing

          6.9.1  Observe simulator response in accordance with the test procedure.

          6.9.2  Observe simulator response to ensure that key annunciators and automatic systems actuations respond as required.

          6.9.3  Observe simulator response to ensure that annunciators and automatic systems actuations which should not occur, do not occur.

          6.9.4  Indicate the acceptability of simulator response by initialing in the ACCEPT/DR column for each item where a blank for initials is specified.

          6.9.5  Indicate non-acceptance by entering the number of the DR in the ACCEPT/DR column.

     6.10  Evaluating Simulator Response For Critical Parameters And Resolving Discrepancies

          6.10.1  Simulator response shall be compared to acceptance criteria specified in the test procedure. Response which meets all stated criteria shall be signed off as acceptable.

          6.10.2  Where significant differences exist and reference data was used, the reference data should be reviewed again to determine if any of the following conditions exist:

                  6.10.2.1  Previously unidentified conservatism; time delays, incomplete equipment actuations, conservative actuation setpoints, differences in initial conditions.

                  6.10.2.2  Incorrect interpretation of scales; psia vs. psig, log vs. linear.

Rev.: 1 Date: 5/11/89 Page: 11 of 13 NSEM-4.05


6.10.3 If any of the above are discovered, reevaluate the specified acceptance criteria and adjust accordingly. Then, reevaluate simulator response.

6.10.4 If significant differences still exist, or if no problems are found with the reference data or best estimate analysis, the simulator response should be subjected to best estimate analysis by two or more assigned instructors. All the criteria of 6.7.2 must be met.

6.10.5 For simulator response deemed acceptable per 6.10.4, the following additional steps are required:

6.10.5.1 The results of the best estimate analysis must be reviewed and approved by the ASOT.

6.10.5.2 The test procedure must be revised to reflect the modified acceptance criteria.

6.10.6 Submit a DR for simulator response which is unacceptable. Successful completion of the respective malfunction test shall be specified as a requirement for closeout.

6.11 Reviewing/Revising The Cause And Effect Document (C&E)

6.11.1 Revise the C&E document as required to be consistent with simulator response.

6.11.2 Revise the C&E document for proper content and format per NSEM-4.01, Section 6.3.

6.12 Reporting Results Of Malfunction Testing

6.12.1 Malfunction test results shall be reported for initial performance testing and any time that retesting is required due to simulator design changes resulting in significant simulator configuration or performance variations.

6.12.2 The signed-off originals of applicable malfunction test procedures shall be included in the report.

Rev.: 1 Date: 5/11/89 Page: 12 of 13 NSEM-4.05

6.12.3 The Malfunction Test Procedure (Attachment 8.1, 8.2, 8.3, or 8.4) for each unit shall contain a cover sheet (Figure 7.1) signed by the ASOT to document his review and acceptance (except as noted by DR's) of the malfunction test procedure.

6.12.4 All malfunctions shall be tested over a four year cycle.

6.13 Simulator Design Changes

6.13.1 Identifying the need to add new malfunctions or to perform non-scheduled malfunction testing due to simulator design changes will be addressed in NSEM-5.02.

6.14 Disposition Of Test Procedures Generated

6.14.1 Originals of malfunction test procedures and associated Cause and Effect Documents shall be retained in Controlled Document storage.

6.14.2 Copies of malfunction test procedures and associated Cause and Effect Documents shall be maintained by the ASOT or his designee for working use.

7.0 FORMS

Figure 7.1 Malfunction Test Procedure Cover Sheet

8.0 ATTACHMENTS

8.1 Malfunction Test Procedure - MP1
8.2 Malfunction Test Procedure - MP2
8.3 Malfunction Test Procedure - MP3
8.4 Malfunction Test Procedure - CY
8.5 Marginal Note Directory

Rev.: 1 Date: 5/11/89 Page: 13 of 13 NSEM-4.05

ATTACHMENT 8.5 MARGINAL NOTE DIRECTORY

1. Corrects typographical error and allows option of 1 rather than 2 intermediate severities to be tested.
2. Allows option of 1 rather than 2 intermediate severities to be tested.

Rev.: 1 Date: 5/11/89 Page: 8.1-1 of 1 NSEM-4.05


NORTHEAST UTILITIES NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-4.07 MASTER TESTING SCHEDULE

Responsible Individual: Manager, Operator Training
Approved: Director, Nuclear Training
Revision: 0
Date: 3/23/89
SCCC Meeting No:

1.0 PURPOSE

This procedure specifies the scheduling requirements for operability and performance testing to maintain simulator certification, meet regulatory requirements, and satisfy industry standards.

Attachments to this procedure specify the four year performance test schedule for each of the NU simulators.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 NSEM-1.02 - Presents an overview of the NU simulator certification program.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-FlA) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record identified deficiencies between the simulator and reference plant.

Rev.: 0 Date: 3/23/89 Page: 1 of 9 NSEM-4.07

4.2 Performance Test - a defined group of tests conducted to verify a simulation facility's performance as compared to actual or predicted reference plant performance. A performance test is required for initial certification and for every subsequent four year period in order to maintain certification.

Performance testing for certification maintenance is intended to be an on-going process, with approximately 25% of the testing performed during each year of the four year cycle. Additionally, a complete performance test is required if simulator design changes result in significant simulator configuration or performance variations. Significant simulator configuration or performance variations may result from design changes which affect the distribution in the integrated heat balance, e.g.: a new RCS model, a new steam generator model, or incorporation of new plant components which change secondary plant efficiency. Simulator design changes with the potential for significant impact, such as computer changeout, must be evaluated on a case-by-case basis.
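The pacing described above (roughly 25% of the performance tests in each year of the four year cycle) can be illustrated with a small scheduling sketch. This is not part of the procedure; the test names are invented and do not represent the actual MP1/MP2/MP3/CY schedules in the attachments.

```python
# Illustrative sketch: spread a performance-test list evenly over the
# four year certification cycle (about 25% per year). Test names are
# hypothetical, not the actual NU schedules.

def four_year_schedule(tests):
    """Assign each test to year 1-4 of the cycle, round-robin."""
    schedule = {year: [] for year in (1, 2, 3, 4)}
    for i, test in enumerate(tests):
        schedule[i % 4 + 1].append(test)
    return schedule

tests = [f"transient-{n}" for n in range(1, 9)]
sched = four_year_schedule(tests)
print(sched[1])  # ['transient-1', 'transient-5']
```

Round-robin assignment keeps each year's share within one test of 25% regardless of how many tests are in the list.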

"Y". The "N" shall be deleted by a single line, initialed and dated.

6.4.4 At the end of the 4 year cycle, the working copy of NSEM-4.07, Form 7.1 shall be transcribed to a clean form for submittal with the recertification report.

6.4.4.1 All acceptable tests (whether initial or corrected) shall be indicated by a "Y" in the results column.

6.4.4.2 Only open discrepancies shall be transcribed for the report.

7.0 FORMS

7.1 Simulator Performance Test Completion/Result Record

Rev.: 0 Date: 3/23/89 Page: 8 of 9 NSEM-4.07
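The end-of-cycle transcription in 6.4.4 can be sketched as a small record transformation: corrected results carry over as "Y", and only open discrepancies are copied forward. This is an illustrative reading of the procedure, not software it calls for; the record field names are hypothetical.

```python
# Illustrative sketch of the 6.4.4 transcription rule; field names are
# hypothetical. An open DR keeps its "N" and is carried into the report,
# while resolved (initial or corrected) tests are recorded as "Y".

def transcribe(working_records):
    """working_records: list of dicts with 'test', 'result' ('Y'/'N'),
    and 'dr_open' (True while the discrepancy report is unresolved)."""
    clean = []
    for rec in working_records:
        entry = {"test": rec["test"],
                 "result": "Y" if not rec["dr_open"] else rec["result"]}
        if rec["dr_open"]:
            entry["discrepancy"] = True  # transcribed per 6.4.4.2
        clean.append(entry)
    return clean
```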

8.0 ATTACHMENTS

8.1 MP1 Performance Test Schedule
8.2 MP2 Performance Test Schedule
8.3 MP3 Performance Test Schedule
8.4 CY Performance Test Schedule

Rev.: 0 Date: 3/23/89 Page: 9 of 9 NSEM-4.07

FORM 7.1

SIMULATOR PERFORMANCE TEST COMPLETION/RESULT RECORD

Simulator: MP2    Page ___ of ___

Annual Operability    Acceptance Criteria Met (Y/N)    Year: 1  2  3  4

Rev.: 0 Date: 3/23/89 Page: 7.1-1 of 3 NSEM-4.07

FORM 7.1

SIMULATOR PERFORMANCE TEST COMPLETION/RESULT RECORD

Simulator: ______    Page ___ of ___

4 Year Cycle Performance Testing    Acceptance Criteria Met (Y/N)

Rev.: 0 Date: 3/23/89 Page: 7.1-2 of 3 NSEM-4.07

FORM 7.1

SIMULATOR PERFORMANCE TEST COMPLETION/RESULT RECORD

Simulator: ______    Page ___ of ___

Performance Test Discrepancy:

Test Name/Number: ______    Section/Step: ______    DR #: ______

Description:

Resolution:

Test Name/Number: ______    Section/Step: ______    DR #: ______

Description:

Resolution:

Rev.: 0 Date: 3/23/89 Page: 7.1-3 of 3 NSEM-4.07

NORTHEAST UTILITIES NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.08 SIMULATOR OPERATING LIMITS

Responsible Individual: Manager, Operator Training
Approved: Director, Nuclear Training
Revision: 0
Date: 3/24/88
SCCC Meeting No: 88-004

1.0 PURPOSE

The purpose of this procedure is to identify the simulator operating limits and implement controls to avoid imparting negative training as a result of simulator operation beyond these limits.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 INPO Good Practice TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.6 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.7 NTDD-17, Simulator Certification and Configuration Management Control.

3.8 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.9 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

Rev.: 0 Date: 3/24/88 Page: 1 of 12 NSEM - 4.08

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-FlA) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified deficiencies between the simulator and reference plant.

4.2 Simulator Operating Limit - A given simulator condition beyond which simulation is unrealistic or inaccurate and negative training may be provided. Simulator operating limits may be imposed due to plant design limits, computer code model limits, or observed anomalous response.

4.3 Design Limits - Extreme values for specified plant parameters. Design limits are obtained from engineering design and accident analysis documents, e.g.: maximum RCS pressure, peak containment pressure, etc.

4.4 Model Limits - Physical conditions which cannot be simulated by the model coding, e.g.: critical pressure and temperature, core melt, clad melt, etc.

4.5 Anomalous Response - Simulator response which violates the physical laws of nature or differs greatly from expected response. Expected response may be based on plant data, accident analysis, or best estimate evaluation.

4.6 Simulator Instructor Guide (SIG) - A training document outlining the sequence of events for a simulator training session. SIG's also contain additional information for the instructor conducting the session.

4.7 Best Estimate Evaluation - A method used (in the absence of plant data, engineering analysis, or accident analysis) to determine the direction, rate, and magnitude of response for critical plant parameters during transient and accident conditions. Experience, rough engineering calculations and mass/energy balances, and table-top discussion may all be used to determine best estimate response.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning instructors to conduct simulator response testing in accordance with this procedure.

Rev.: 0 Date: 3/24/88 Page: 2 of 12 NSEM - 4.08

5.1.2 Responsible for approving the list of design limits, model limits, and anomalous responses generated by this procedure.

5.1.3 Responsible for assigning instructors to review and revise SIG's to contain appropriate caution statements when simulator operating limits may be encountered.

5.1.4 Responsible for providing Simulator Computer Engineering (SCE) with a list of critical systems and reference plant design limits.

5.2 Simulator Technical Support Branch (STSB)

5.2.1 Responsible for determining model limits for the simulator's computer coding.

5.2.2 Responsible for specifying simulator operating conditions which may produce anomalous response.

5.2.3 Responsible for performing the hardware and software modifications required to implement the controls specified by this procedure.

5.3 Operator Instructors

5.3.1 Responsible, as assigned, for performing simulator response testing.

5.3.2 Responsible for writing DR's as required.

5.3.3 Responsible, as assigned, for review and revision of SIG's to include appropriate caution statements when simulator operating limits may be encountered.

6.0 INSTRUCTIONS

6.1 Defining Critical Systems And Design Limits

6.1.1 Compile a list of critical systems.

6.1.1.1 Include systems essential to core cooling for LOCA and non-LOCA conditions, e.g.: steam generators, high pressure safety injection, low pressure safety injection, chemical and volume control system.

Rev.: 0 Date: 3/24/88 Page: 3 of 12 NSEM - 4.08

6.1.1.2 Include systems essential to containment pressure suppression, e.g.: containment spray, containment air recirculating coolers, reactor building closed cooling water.

6.1.1.3 Include systems considered as vital auxiliaries for accident conditions, e.g.: vital AC, vital DC, air, generator cooling water.

6.1.1.4 Do not include non-safety related secondary or primary support systems.

6.1.1.5 List the critical systems in the appropriate column on NSEM-4.08, Form 7.1.

6.1.2 Determine the Design Limits for the critical systems.

6.1.2.1 Review reference plant design documents and accident analyses to determine Design Limits, e.g.: maximum RCS or containment pressure.

6.1.2.2 Enter the Design Limit value, if applicable, on NSEM-4.08, Form 7.1 in the space provided by its respective critical system.

6.1.3 Forward the NSEM-4.08, Form 7.1 to SCE.

6.2 Determining Model Limits

6.2.1 Assigned SCE engineers shall review the simulator model for each critical system to determine if modeling limits exist.

6.2.1.1 Do not consider conditions beyond design limits, if specified.

6.2.1.2 Do not consider conditions beyond the physical capabilities of installed equipment, e.g.: system flow in excess of the total capacity of all installed pumps.

Rev.: 0 Date: 3/24/88 Page: 4 of 12 NSEM - 4.08

6.2.1.3 Conditions which should be considered include: critical temperature and pressure, fuel melt, clad melt, two phase flow, RCS drain down.

6.2.2 Model Limits shall be listed and/or described in the space provided on NSEM-4.08, Form 7.1 beside the respective critical system.

6.3 Determining Areas Of Anomalous Response

6.3.1 Assigned SCE engineers shall review the simulator model for each critical system to determine if the possibility for anomalous response exists.

6.3.1.1 Do not consider conditions beyond design or model limits, if specified.

6.3.1.2 Do not consider conditions beyond the physical capabilities of installed equipment.

6.3.2 Use the space provided on NSEM-4.08, Form 7.1 to describe the conditions which could produce anomalous response.

Example: The RCS model for Rx vessel level indication is inaccurate when the RCS is at atmospheric pressure and drained below the pressurizer.

6.3.3 Forward NSEM-4.08, Form 7.1 to the respective Assistant Supervisor - Operator Training (ASOT).

6.3.3.1 The ASOT shall add any known areas of anomalous response not identified by SCE.

6.3.4 The ASOT will assign instructors to perform simulator response testing on the specified areas of possible anomalous response.

6.3.5 The assigned instructors shall perform tests on each area of possible anomalous response.

6.3.5.1 Use combinations of initial conditions, remote functions, and malfunctions, as required, to achieve the conditions specified on NSEM-4.08, Form 7.1.

Rev.: 0 Date: 3/24/88 Page: 5 of 12 NSEM - 4.08

6.3.5.2 If unable to achieve the stated conditions and simulator response is acceptable up to the point achieved, annotate NSEM-4.08, Form 7.1 in the space provided.

6.3.5.3 If the stated conditions are achievable by a variety of means, several different approaches should be tested to ensure that the response of the suspect parameters is consistent.

6.3.5.4 For each test, determine those parameters which are critical to determining if response is anomalous.

6.3.5.5 NSEM-4.08, Form 7.2 should be used to document each test.

a. Specify the initial conditions.
b. Specify any equipment alignment modifications, e.g.: pumps out of service, valves closed, etc.
c. Specify any remote functions used to create the test conditions.
d. Specify any malfunctions used during the test.
e. Specify any operator actions performed during the test; include timing and magnitude (where appropriate).
f. Specify those parameters determined in Step 6.3.5.4.

6.3.5.6 The parameters determined in Step 6.3.5.4 shall be recorded during the test. Installed recorders, Gould recorders, CRT hardcopies, or special recording programs may be used.

6.3.5.7 Critical parameter recordings for each test shall be attached to their respective NSEM-4.08, Form 7.2 and forwarded to the ASOT.

Rev.: 0 Date: 3/24/88 Page: 6 of 12 NSEM - 4.08

6.3.6 The ASOT shall assign instructors to evaluate the results of each test.

6.3.7 The assigned instructors shall evaluate test data to determine if the response violates any physical laws of nature.

6.3.7.1 As a minimum, the following should be considered:

a. Obvious mass and/or energy imbalances.
b. Super-heated or sub-cooled indications for systems which should be saturated.
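The second check (super-heated or sub-cooled indications in a system that should be saturated) amounts to comparing the indicated temperature with the saturation temperature at the indicated pressure. The sketch below is illustrative only: it uses a few hand-picked water saturation points in place of the full steam tables an actual evaluation would consult, and the tolerance is hypothetical.

```python
# Illustrative sketch, not plant software: a handful of approximate water
# saturation points (psia -> deg F) stand in for real steam tables, and
# the 5 F tolerance band is hypothetical.
SAT_POINTS = {14.7: 212.0, 100.0: 327.8, 600.0: 486.2, 1000.0: 544.6}

def saturation_check(pressure_psia, temp_f, tol_f=5.0):
    """Classify an indication as 'saturated', 'superheated', or 'subcooled'
    at the nearest tabulated pressure."""
    nearest = min(SAT_POINTS, key=lambda p: abs(p - pressure_psia))
    t_sat = SAT_POINTS[nearest]
    if temp_f > t_sat + tol_f:
        return "superheated"
    if temp_f < t_sat - tol_f:
        return "subcooled"
    return "saturated"

# A steam generator indicating 700 F near 1000 psia would be flagged:
print(saturation_check(1000.0, 700.0))  # superheated
```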

6.3.8 The assigned instructors shall evaluate the test data to determine if the response differs greatly from expected. Expected response may be based on actual plant data, accident analysis, or best estimate evaluation.

6.3.8.1 When using actual plant data (reference plant or similar design), the test data shall correspond in direction, rate of response, and relative magnitude. Differences in the rate of response and/or magnitude may exist to the degree that operator diagnosis and response will not be adversely affected.

6.3.8.2 When using accident analysis data, the test data shall correspond in direction; however, large differences may exist in the rate and/or magnitude due to the conservative conditions assumed for worst case analyses.

6.3.8.3 Best estimate shall be used to evaluate those tests for which plant data and/or accident analysis data is unavailable. Best estimate shall also be used to evaluate the rate and/or magnitude where accident analysis and test data differ greatly.
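The direction and relative-magnitude comparison of 6.3.8.1 can be sketched as a simple trace comparison. This is illustrative only; the traces and the 25% tolerance are invented, not values the procedure prescribes.

```python
# Illustrative sketch: compare a simulator trace against reference plant
# data for direction and relative magnitude of change. The tolerance is
# hypothetical, not a value from the procedure.

def compare_traces(sim, ref, magnitude_tol=0.25):
    """sim, ref: equal-length lists of parameter samples over the event.
    Direction must match; net change may differ by magnitude_tol (25%)."""
    d_sim = sim[-1] - sim[0]
    d_ref = ref[-1] - ref[0]
    same_direction = (d_sim >= 0) == (d_ref >= 0)
    if d_ref == 0:
        magnitude_ok = abs(d_sim) <= magnitude_tol * max(abs(x) for x in ref)
    else:
        magnitude_ok = abs(d_sim - d_ref) <= magnitude_tol * abs(d_ref)
    return same_direction, magnitude_ok

# Hypothetical RCS pressure responses (psia) to the same transient:
sim = [2250, 2180, 2100, 2060]
ref = [2250, 2170, 2090, 2050]
print(compare_traces(sim, ref))  # (True, True)
```

Per 6.3.8.2, a direction mismatch would be disqualifying even where a magnitude mismatch might be explained by conservative accident-analysis assumptions.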

Rev.: 0 Date: 3/24/88 Page: 7 of 12 NSEM - 4.08

6.3.9 Simulator response shall be determined to be anomalous if the direction of response does not agree with expected, or if the rate and/or magnitude of response would cause misdiagnosis or improper operator response.

6.3.10 Assigned instructors shall determine the point at which the simulator response becomes anomalous and specify those conditions on the NSEM-4.08, Form 7.2.

6.3.11 The completed NSEM-4.08, Form 7.2's with all test data and applicable additions shall be forwarded to the ASOT.

6.4 Implementing Controls For Design And Model Limits

6.4.1 A DR shall be submitted, containing the following:

6.4.1.1 All design and model limits specified on NSEM-4.08, Form 7.1.

6.4.1.2 A request that reaching any one of these limits shall cause the simulator to freeze.

    -([)                                                                                   Rev.: 0 Date: 3/24/88     i Page: 8 of 12   i NSEM - 4.08                               l

6 9 6 O i I O 1 0

                      )                             6.5 Implementing Controls For Anomalous Response 6.5.1      A training commitment, NTM-2.06, Form 7.2,
                                                                   'shall be opened to review all existing SIG's and to include an appropriate caution statement in those SIG's where areas of anomalous response may be encountered.

6.5.2 Assigned instructors shall review the SIG's against copies of all the NSEM-4.08, Form 7.2's to identify exercises where an anomalous response may be encountered, either by guide directions or trainee response. 6.5.3 Each guide identified shall be revised, using l the NTM-2.06 process, to include a caution to the instructor. The caution should be located in the body of the guide, just preceding the l directions which could lead to an anomalous response, and contain the following: 6.5.3.1 A bold heading, e.g.: CAUTION 6.5.3.2 A brief description of the anomalous response. 6.5.3.3 The actions which would cause the j anomalous response if appropriate. 6.5.3.4 A warning that, " Allowing the exercise to proceed beyond this point may provide negative training.", or; 6.5.3.5 Directions to inform trainees that specific indications will not provide accurate information due to simulator modeling limitations. 6.5.4 A copy of the completed training commitment shall accompany NSEM-4.08 records to document ] satisfactory completion.  ! l 6.6 Implementing Instructor Awareness Of Simulator Operating l Limits 6.6.1 The ASOT should require that all instructors providing training on the respective simulator be familiar with the following: 1 1 ( 1 Rev.: 0 Date: 3/24/88 Page: 9 of 12 NSEM - 4.08 i j

9 9 O l l l.. I O O

  ' ~ ... 3 The. reason for imposing simulator
    -i( )               6.6.1.1 operating limits.

6.6.1.2 The response of the simulator when a design or model limit is reached. 6.6.1.3 The procedure for determining which

                                  -limit caused the simulator to freeze and how to use the override feature.

6.6.1.4 All of the conditions determined as anomalous response and the actions/conditions leading to them.

6.6.1.5 Expected instructor actions when a simulator training exercise progresses to the point where further operation could provide negative training.

6.6.1.6 The instructor's responsibility for considering simulator operating limits when developing new SIG's and ensuring appropriate warnings are included.

6.7 Adding/Deleting Simulator Operating Limits

6.7.1 Previously unidentified areas of anomalous response.

6.7.1.1 Instructors who observe simulator response which they believe to be anomalous, but not previously identified, shall fill out an NSEM-4.08, Form 7.3, as completely as possible, specifying the following:

a. The parameters believed to be anomalous.
b. The instructor's best estimate of what correct response should be.
c. As much of the information specified in 6.3.5.5 as possible.
d. The IC number of a snapshot taken (if possible).

Rev.: 0 Date: 3/24/88 Page: 10 of 12 NSEM - 4.08

6.7.1.2 Forward NSEM-4.08, Form 7.3 to the ASOT.

6.7.1.3 The ASOT will assign instructors to independently evaluate the conditions described on the NSEM-4.08, Form 7.3.

6.7.1.4 If the independent evaluator, originator, and ASOT are in agreement, the NSEM-4.08, Form 7.3 should be forwarded to SCE for evaluation. Note: The ASOT will resolve any areas of disagreement.

6.7.1.5 SCE shall review the NSEM-4.08, Form 7.3 to determine if the anomalous response is due to model limitations or a modeling error. If the problem is correctable, specify that determination on the form and forward it to the ASOT.

6.7.1.6 If the problem is determined to be correctable, the ASOT shall assign the originator to submit a DR.


6.7.1.7 If the response is the result of model limitations, the ASOT shall assign instructors to perform applicable sections of 6.3.5 through 6.3.11 and Section 6.5.

6.7.1.8 The ASOT shall ensure that all instructors are made aware of the condition per Section 6.6.

6.7.2 Simulator Modifications

6.7.2.1 Simulator modifications which add or delete simulator operating limits will be covered in NSEM-5.02.

6.8 Disposition Of Forms Generated

6.8.1 Forward completed originals of the following to the ASOT for review and approval:

6.8.1.1 NSEM-4.08, Form 7.1

Rev.: 0 Date: 3/24/88 Page: 11 of 12 NSEM - 4.08

6.8.1.2 All NSEM-4.08, Form 7.2's with applicable test data attached.

6.8.1.3 All NSEM-4.08, Form 7.3's (future).

6.8.2 The ASOT shall forward the approved originals specified in 6.8.1 to Controlled Document Storage for retention with simulator certification records.

7.0 FORMS

7.1 Critical Systems List
7.2 Anomalous Response Test Form
7.3 Suspect Response Form

8.0 ATTACHMENTS

None

Rev.: 0 Date: 3/24/88 Page: 12 of 12 NSEM - 4.08

Form 7.1

CRITICAL SYSTEMS LIST

SYSTEM    DESIGN LIMIT    MODEL LIMIT    POSSIBLE ANOMALOUS RESPONSE

Approved: ___________    Date: ___________
          ASOT

Rev.: 0 Date: 3/24/88 Page: 7.1-1 of 1 NSEM-4.08

Form 7.2

ANOMALOUS RESPONSE TEST FORM

Briefly describe the Anomalous Response to be tested:

Suspect Parameter(s):

Initial Conditions for Test:

Alignment Modifications:

Remote Functions Used (initial):

Malfunctions Used (specify timing and magnitude):

Operator Actions Including Remote Functions (specify timing and magnitude):

Critical Parameters (name and instrument ID):

Test Traces Attached (list parameter ID, range recorded, time scale recorded):

Rev.: 0 Date: 3/24/88 Page: 7.2-1 of 2 NSEM-4.08

Form 7.2

ANOMALOUS RESPONSE TEST FORM

RESPONSE TEST RESULT:

Anomalous Response: Y  N  (circle one)

If yes, list reasons (laws of nature; response direction, rate, magnitude):

List the method(s) used to base expected response (attach copies of references, graphs, or calculations used):

Briefly describe how the anomalous response differs from expected:

List or describe the condition at which the simulator response becomes anomalous:

Approved: ___________    Date: ___________
          ASOT

Rev.: 0 Date: 3/24/88 Page: 7.2-2 of 2 NSEM-4.08

Form 7.3

SUSPECT RESPONSE FORM

Describe suspect response; list those parameters believed to be anomalous:

Briefly describe your best estimate of the correct response for the condition:

Specify the following as completely as possible:

Suspect Parameters:

Initial Condition:

Alignment Modifications:

Remote Functions in Use:

Malfunctions in Use (timing and magnitude):

Operator Actions Creating Response:

Independent Evaluation: Agree  Disagree  (circle one)

SCE Evaluation: Anomalous  Correctable - DR# ___  (circle one)

Approved: ___________    Date: ___________
          ASOT

Rev.: 0 Date: 3/24/88 Page: 7.3-1 of 1 NSEM-4.08

NORTHEAST UTILITIES NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.09 SIMULATOR OPERABILITY TESTING

Responsible Individual: Manager, Operator Training Branch
Approved: Director, Nuclear Training
Revision: 0
Date: May 4, 1988
SCCC Meeting No: 88-006

1.0 PURPOSE

This procedure defines the methodology for writing and conducting yearly operability testing for each NU simulator. This procedure defines how to write the initial Operability Test and then how to perform it in succeeding years. To accomplish this task, this procedure:

(1) Defines what is required for Steady State Testing per ANSI/ANS-3.5, Sections 4.1, 5.4.2 and Appendix B.

(2) Defines what is required for Transient Testing per ANSI/ANS-3.5, Sections 4.2 (in part), 5.4.2 and Appendix B.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS-3.5, 1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 INPO Good Practice TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.6 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

Rev.: 0 Date: 5/4/88 Page: 1 of 26 NSEM-4.09

3.7 NTDD-17, Simulator Certification and Configuration Management Control.

3.8 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.9 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-FlA) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Critical Parameters - Those parameters that require direct and continuous observation to operate the power plant under manual control, or are inputs to plant safety systems.

4.3 Best Estimate Evaluation - A method used (in the absence of plant data, engineering analysis, or accident analysis) to determine the direction, rate, and magnitude of response for critical plant parameters during transient and accident conditions. Experience, rough engineering calculations and mass/energy balances, and table-top discussion may all be used to determine best estimate response.

4.4 Comparison Standard - A set of reference plant response data used to evaluate simulator performance for a specific steady state condition or transient. Actual plant data, engineering analysis, accident analysis, and/or best estimate may be used to develop comparison standards for transient response. Actual plant data is used to develop steady state comparison standards.

4.5 Performance Benchmark - A set of simulator response data for a specific condition or transient. Performance benchmarks are used to verify the continued accuracy of simulation.

4.6 Performance Testing - Tests performed to prove the simulator's capability to perform in a realistic manner.

Rev.: 0 Date: 5/4/88 Page: 2 of 26 NSEM-4.09

4.7 Operability Testing - Tests performed to prove the continued accuracy of simulator performance.
4.8 Benchmark Transient - One of a set of transients, selected in accordance with ANS-3.5 Appendix B, used to verify simulator transient performance, model completeness, and systems integration during annual operability testing. Comparison standards are used to validate simulator response as a performance benchmark for each benchmark transient.
4.9 Turning Point - The point on a parameter plot where the parameter response changes direction.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)
5.1.1 Responsible for assigning instructors to develop simulator specific test procedures in accordance with this procedure.
5.1.2 Responsible for assigning instructors to conduct initial operability testing in accordance with this procedure.
5.1.3 Responsible for approving simulator operability test reports.
5.1.4 Responsible for approving the selection of evaluation criteria for annual operability testing of benchmark transients.
5.1.5 Responsible for assigning instructors to perform annual operability testing.
5.1.6 Responsible for determining the need for non-scheduled operability testing due to simulator design changes which result in significant simulator configuration or performance variations.
5.1.7 Responsible for determining if simulator design changes, made in response to significant plant modifications, require performance testing and revision of performance benchmarks.

5.2 Simulator Technical Support Branch (STSB)
5.2.1 Responsible for maintaining the simulator performance benchmarks.
5.2.2 Responsible for performing operability test data comparisons.
5.3 Operator Instructors
5.3.1 Responsible, as assigned, for performing functions in accordance with this procedure.
5.3.2 Responsible for writing DR's as required.

6.0 INSTRUCTIONS

6.1 Selecting Power Levels For Simulator Steady State Comparison To The Reference Plant
6.1.1 One comparison will be performed at the 100% full power steady state condition.
6.1.2 Two intermediate power, steady state conditions should be chosen based on the following:
6.1.2.1 Availability of valid reference plant data.
6.1.2.2 Completeness of available plant data (see Section 6.3).
6.1.3 Initiate an NSEM-4.09, Form 7.1 for each of the three power levels selected.
6.2 Selecting a Benchmark Set of Transients for Simulator Operability Testing
6.2.1 Evaluate the list of BWR (PWR) transients, specified in ANS-3.5 Appendix B, for applicability to the reference plant.
6.2.1.1 Transients which do not apply or rely on operator action should be deleted, e.g.: maximum rate power ramp for a plant without an Integrated Control System.

6.2.2 Transients deleted per 6.2.1.1 shall be replaced with a comparable transient.
6.2.2.1 Comparable transients shall exercise the same parameters as those specified for the original.
6.2.2.2 Comparable transients should be readily reproducible, i.e.: no reliance on the timing or degree of operator action to affect response.
6.2.2.3 Ramp rate introduction of one or more malfunctions and/or remote functions may be used to simulate operator response while ensuring reproducibility.
6.2.3 Initiate an NSEM-4.09, Form 7.2 for each transient selected.
6.2.4 Ensure that at least ten transients are selected for establishing benchmarks.
6.3 Selecting Parameters For Steady State Comparison
6.3.1 Include those parameters which define plant power level, e.g.: nuclear power, thermal power, electrical power, main feed flow, main steam flow.
6.3.2 Include those parameters which are power dependent, e.g.: programmed pressurizer level, programmed RCS temperatures, power dependent cooling water flows, turbine first stage pressure, heater drains flow.
6.3.3 Include those parameters necessary to verify the following principal mass and energy balances:
6.3.3.1 Net NSSS thermal power to generated electrical power.
6.3.3.2 RCS temperature to steam generator pressure (PWR).

6.3.3.3 RCS temperature to main steam pressure (BWR).
6.3.3.4 Feedwater flow to reactor thermal power.
6.3.3.5 Steam generator mass outflow equal to mass inflow at steady state (constant level).
6.3.3.6 RCS mass outflow equal to mass inflow at steady state (constant pressurizer level) (PWR).
6.3.3.7 Reactor vessel mass outflow equal to mass inflow at steady state (constant level) (BWR).
6.3.4 Include critical parameters as defined in 4.2.
NOTE 1: Certain inputs to safety systems are not directly related to, or are independent of, power level. Simulator values need not agree within ±2% of plant data, but shall be checked to be reasonable, e.g.: refueling water storage tank level.
NOTE 2: Where multiple channels exist for the same parameter, list each instrument. During steady state comparison testing, each instrument must meet acceptance criteria for simulator response range and channel check. Only one instrument input will be recorded during the stability run.
6.3.5 Those parameters identified shall be entered on the NSEM-4.09, Form 7.1 for each of the three power levels. Parameters which fit more than one inclusion category need only be entered once. Include noun name and instrument identification number.
6.4 Selecting Parameters For Transient Comparison
6.4.1 Include those parameters specified for the respective transient in ANS-3.5 Appendix B, as applicable.
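The balance verifications in 6.3.3 are simple steady-state comparisons between paired parameters. As an illustration only (the flow and power values and the tolerance below are hypothetical, not plant data), the checks can be sketched as:

```python
def within(a, b, tol_frac):
    """True if a agrees with reference value b within a fractional tolerance."""
    return abs(a - b) <= tol_frac * abs(b)

# Hypothetical steady-state readings in consistent units (klb/hr, MWt).
feed_flow, steam_flow = 3850.0, 3860.0            # SG inflow vs. outflow (6.3.3.5)
fw_derived_power, nuclear_power = 2690.0, 2700.0  # heat balance (6.3.3.4)

# Steady-state mass balance: SG outflow should equal inflow at constant level.
assert within(steam_flow, feed_flow, 0.02)

# Energy balance: feedwater-derived thermal power vs. indicated reactor power.
assert within(fw_derived_power, nuclear_power, 0.02)
```

Per 6.6.6, balances whose component parameters individually meet their acceptance criteria need no separate evaluation; a check like this is only a screening aid.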

6.4.2 Include any additional parameters required to verify the dynamic response of the fluid systems being tested by the transient, i.e.: RCS, steam generators, containment.
6.4.3 Additional parameters may be included at the discretion of the ASOT.
NOTE: Where multiple channels exist for the same parameter, only one channel will be recorded.
6.4.4 Parameters identified shall be entered on the NSEM-4.09, Form 7.2 for the respective transient. Include noun name and instrument identification number.
6.5 Determining Allowable Instrument Error For Steady State Performance Test Parameters
6.5.1 As much as possible, plant surveillances specifying instrument tolerances or channel deviation should be used to determine allowable instrument error, e.g.: channel check of reactor protection system inputs from pressurizer pressure agree within ±40 psi; allowable error = ±40 psi.
6.5.2 Other sources for determining allowable instrument error include:
6.5.2.1 I&C instrument loop calibration folders.
6.5.2.2 Instrument loop tolerance specifications from the FSAR or other design documents.
NOTE: The allowable instrument error, for the purpose of this procedure, is comprised of the sum total, or root mean square method (if appropriate), of the errors for all components in the instrument loop (detector, transmitter, amplifier, meter).
6.5.3 Enter the instrument range in the appropriate column on NSEM-4.09, Form 7.1.

6.5.4 Enter the allowable instrument error in the appropriate column on NSEM-4.09, Form 7.1. Percentage error shall be converted to applicable engineering units, e.g.: 0-1000 psig range with ±2% error = ±20 psi.
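The percent-of-span conversion above, and the way the resulting error later combines with the performance tolerance to form the acceptable range (see 6.6.5.1), can be sketched as follows. The values are the worked examples from the text; the function names are illustrative only:

```python
def instrument_error(span_lo, span_hi, pct):
    """Convert a percent-of-span instrument error to engineering units (6.5.4)."""
    return (pct / 100.0) * (span_hi - span_lo)

def acceptable_range(target, tolerance, inst_error):
    """Acceptable range = target +/- (performance tolerance + instrument error),
    per 6.6.5.1 (Reference ANSI/ANS-3.5, 1985, Section 4.1)."""
    half_width = tolerance + inst_error
    return target - half_width, target + half_width

# 0-1000 psig range with +/-2% error -> +/-20 psi (6.5.4 example)
err = instrument_error(0.0, 1000.0, 2.0)

# Critical parameter: target 600 psig with +/-2% tolerance -> +/-12 psig (6.6.4.3)
tol = 0.02 * 600.0

lo, hi = acceptable_range(600.0, tol, err)  # 568-632 psig, per 6.6.5.1
```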

6.5.5 Enter the reference name and/or number, used to determine instrument loop error, in the appropriate column on NSEM-4.09, Form 7.1.
6.6 Developing Comparison Standards and Acceptance Criteria For Steady State Performance Testing
6.6.1 Use actual plant data to specify target values for each of the parameters listed on NSEM-4.09, Form 7.1 for the respective power level. Enter in the appropriate column.
6.6.1.1 Plant data may be obtained from:

a. Engineering test records
b. Plant charts and computer data from historical records
c. RE post refuel data
d. Manually recorded data collected specifically for this procedure
6.6.2 Specify the data source and the date and time of the steady state condition on the NSEM-4.09, Form 7.1.
6.6.3 Determine each parameter as critical or non-critical and enter a "C" or an "N" in the appropriate column on NSEM-4.09, Form 7.1.
6.6.3.1 Apply the definition from 4.2 in determining a parameter as critical. All others should be non-critical.
6.6.4 Determine the simulator performance tolerance for each parameter and list in the appropriate column on NSEM-4.09, Form 7.1.

6.6.4.1 Critical parameter tolerance is ±2% of the target value.
6.6.4.2 Non-critical parameter tolerance is ±10% of the target value.
6.6.4.3 Tolerances should be specified as deviation from the target value in appropriate engineering units, e.g.: target = 600 psig, tolerance = ±2%, enter ±12 psig.
6.6.5 Determine the acceptable range for each parameter and specify in the appropriate column on NSEM-4.09, Form 7.1.
6.6.5.1 Add the allowable instrument error to the simulator performance tolerance and apply to the target value to arrive at the acceptable range, e.g.: (±20 psig instrument error) + (±12 psig tolerance) = ±32 psig range; target value = 600 psig, acceptable range = 568-632 psig. (Reference ANSI/ANS-3.5, 1985, Section 4.1)
6.6.6 The principal mass and energy balances specified in 6.3.3 are automatically verified if their component parameters meet their acceptance criteria. A separate evaluation and acceptance criteria is not required.
6.6.7 Safety system inputs which do not relate directly to power or are independent of power should have acceptance criteria response ranges specified which are bounded by Tech Specs and/or plant procedures. The tolerance column is N/A.
6.7 Developing Comparison Standards For The Selected Set of Benchmark Transients
6.7.1 Select data sources for transient comparison standards. Data for developing transient test comparison standards shall be 'best available' using one or more of the following sources, in order of preference:

6.7.1.1 Actual reference plant data with little or no effect due to operator action.
6.7.1.2 Actual reference plant data affected by known and reproducible operator action.
6.7.1.3 Engineering analysis data obtained from an engineering computer model using reference plant input data.
6.7.1.4 Accident analysis data for the reference plant.
6.7.1.5 Actual plant data from a similar design plant (effect of operator action usually not known).
6.7.1.6 Generic analysis data for similar design plants (engineering and accident).
6.7.1.7 Best estimate evaluation.
6.7.2 Selected data source(s) shall be specified on NSEM-4.09, Form 7.2.

6.7.3 Select an appropriate duration for each comparison standard using the following criteria:
6.7.3.1 The duration shall encompass significant parameter changes directly caused by the initiating event.
6.7.3.2 Unless otherwise specified, the transient may end when automatic systems have control and plant parameters are trending toward normal.
6.7.3.3 Unless otherwise specified, the transient may end when it can be reasonably expected that operator action would significantly affect response.

6.7.3.4 Where endpoint conditions are specified, the duration shall be sufficient to achieve the specified conditions.
6.7.4 Enter the endpoint conditions and duration selected for each transient on the respective NSEM-4.09, Form 7.2.
6.7.5 Plot comparison standard response curves for each parameter specified on NSEM-4.09, Form 7.2 for each transient.
6.7.5.1 The duration selected in 6.7.3 should be used for the horizontal axis scale.
6.7.5.2 The instrument extremes may be used for determining the vertical axis scale, or the scale may be narrowed such that the parameter response lands roughly in the middle 50% of scale.
6.7.5.3 The point of automatic systems actuation shall be identified on the appropriate parameter plot, e.g.: reactor trip on RCS pressure.
NOTE: Simulator transient response data may be used to generate parameter plots using identical scales for comparison in an overlay fashion.
6.8 Identifying Automatic Systems Actuations and Key Alarms to be Checked During Transient Performance Testing
6.8.1 List automatic system actuations identified in 6.7.5.3 on the respective NSEM-4.09, Form 7.2 and specify the setpoint.
NOTE: Setpoints for automatic systems actuations and annunciators are verified during system testing. Setpoints are listed here as an aid for the test operator only.

6.8.2 Identify key alarm actuations and list on the respective NSEM-4.09, Form 7.2.
6.8.2.1 Key alarms are those which aid in diagnosis, indicate automatic system actuations, or affect operator response.
6.8.2.2 Include alarm setpoints and the ID for the initiating parameter.
6.9 Developing the Simulator Specific Test Procedure for Steady State Testing
6.9.1 The steady state operating test will meet the requirements of initial and annual operability testing.
6.9.2 The steady state operation test shall be identified as Section 1 of Part B of the respective attachment to NSEM-4.09.
6.9.3 Use worksheets NSEM-4.09, Form 7.1 to develop data sheet NSEM-4.09, Form 7.3 for each power level to be tested.
6.9.3.1 List each parameter to be checked (noun name and instrument ID#).
6.9.3.2 Enter the response range for each instrument.
6.9.3.3 Enter the allowable instrument error for channel check of multiple channels. Enter N/A in this column for parameters with single channel indication.
6.9.4 Select or develop a simulator initial condition for each of the power levels to be tested.
6.9.5 Specify the IC number, conditions, and any alignment changes required for the test in the appropriate location on NSEM-4.09, Form 7.3.

6.9.6 Data sheet NSEM-4.09, Form 7.3 may be referenced in the procedure for verifying simulator accuracy to the reference plant at steady state powers. The procedure should specify the following:
6.9.6.1 Simulator IC number or conditions.
6.9.6.2 Required alignment changes.
6.9.6.3 Data recording requirements.
6.9.7 On NSEM-4.09, Form 7.4, list the parameters to be recorded for the stability test. The unique parameters listed on the 100% power worksheet NSEM-4.09, Form 7.1 shall be recorded. Where multiple channels exist for the same parameter, only one should be listed.
6.9.8 Compare the number of parameters listed to the number of data points available on the simulator recording program to determine if more than one run is required. Enter this determination on NSEM-4.09, Form 7.4.
6.9.9 Provide STSB with a list of parameters to be recorded for stability testing.
6.9.10 Specify the simulator IC number or conditions and any required alignment modifications on NSEM-4.09, Form 7.4. These shall be the same as those specified on NSEM-4.09, Form 7.3 for the 100% power steady state accuracy test.
6.9.11 Select a data recording interval and specify on NSEM-4.09, Form 7.4. A recording interval of 2 minutes (or less) is sufficient for demonstrating stability.
6.9.12 Data sheet NSEM-4.09, Form 7.4 may be referenced in the procedure for performing simulator stability testing. The procedure should specify the following:
6.9.12.1 Simulator IC number or conditions.
6.9.12.2 Required alignment changes.
6.9.12.3 Data recording requirements.

6.9.12.4 Test duration of 60 minutes.
6.9.13 Simulator stability data will be evaluated with acceptance criteria of ±2% variation from the initial value of the parameter and test results indicated by circling Y or N on NSEM-4.09, Form 7.4.
6.9.14 Any response that does not meet acceptance criteria shall be described on NSEM-4.09, Form 7.4 and a DR submitted.
6.10 Developing The Simulator Specific Test Procedure For Performance Testing Of The Selected Benchmark Transients
6.10.1 The performance test will be used to validate the simulator benchmark transient results for use during annual operability testing.
6.10.2 The performance test shall be identified as Part A of the respective attachment to NSEM-4.09.
6.10.3 The procedure shall provide instructions for completing data sheet NSEM-4.09, Form 7.5 for each of the ten transients to be tested. Information obtained from NSEM-4.09, Form 7.2's shall be used to specify the following:
6.10.3.1 A brief description of the transient to be tested.
6.10.3.2 The parameters to be recorded for comparison (noun name and instrument ID).
6.10.3.3 The initial condition for the simulator.
6.10.3.4 The transient endpoint conditions. Endpoint conditions shall be the same as those used to determine the duration for the respective comparison standard.
6.10.3.5 "Key alarms" and their causal parameter.

6.10.3.6 Automatic systems actuations and the initiating parameter.
6.10.4 All parameter plot response curves developed in 6.7.5 shall be attached to the NSEM-4.09, Form 7.5 for their respective transient.
6.10.5 The procedure shall specify the data recording interval. Data must be recorded at less than or equal to 0.5 second intervals for each parameter point.
6.10.6 The procedure shall provide detailed instructions for performing each test. The following need to be included:
6.10.6.1 Simulator initialization conditions.
6.10.6.2 Required control board alignment modifications.
6.10.6.3 The status of remote functions critical to establishing test conditions.
6.10.6.4 Any malfunctions included and active at time zero. Magnitude must be specified for variable malfunctions.
6.10.6.5 The timing of the event(s) which initiate the transient.
6.10.6.6 The timing, identification, and direction/magnitude of any remote functions, malfunctions, or operator actions used during the test.
NOTE: To improve the usability of transient benchmarks, actions must be reproducible with split-second accuracy; therefore, it is important to use simulator timed actuations to produce an action rather than rely on manual operations.
6.10.6.7 Requirements for identifying automatic systems actuation and key alarm response (as required and improper).
6.11 Evaluating Transient Test Response Data
6.11.1 Recorded data of the transient performance test results shall be used to graph the response of each parameter, for each transient.
6.11.1.1 The scale divisions for the horizontal and vertical axis shall be the same as those used for the comparison standard.
6.11.1.2 The time of the initiating event for the test results shall be plotted to coincide with the comparison standard.
6.11.1.3 Achieving specified endpoint conditions may require a longer duration on the simulator than that used for the comparison standard. When that occurs, the horizontal axis should be expanded such that the entire response curve can be plotted while maintaining equidistant scale divisions.
6.11.2 Plot the point of automatic systems actuation on each graph. This will aid in evaluating integrated response.
6.11.3 Evaluate the graphs of simulator response for a given transient, as a set, to verify that the physical laws of nature are not violated. As a minimum, the following shall be considered:
6.11.3.1 Obvious mass and/or energy imbalances.

6.11.3.2 Super-heated or sub-cooled indications for systems which should be saturated. Instrument error should be considered, i.e.: an indication of several degrees super-heat should not be considered as unacceptable.
6.11.3.3 Independent, incorrect response from parameters which are closely coupled, e.g.: steam generator pressure rapidly decreasing with RCS temperature increasing.
6.11.4 Compare the graph of simulator response for each parameter against its respective comparison standard graph to evaluate the response rate and magnitude.
6.11.4.1 For comparison purposes, either the standard or the test plot may be converted to a transparency and used as an overlay, or the standard may be plotted onto the test plot. The original of the comparison standard should be maintained for possible future use.
6.11.4.2 The rate and magnitude of parameter response for simulator transient test data versus the comparison standard should correspond roughly and shall be considered acceptable to the degree that it does not adversely affect diagnosis, cause improper operator response, or grossly misrepresent the severity of the transient.
6.11.4.3 Unacceptable and questionable comparisons will be resolved per 6.12.
6.11.5 Circle or check the appropriate locations on NSEM-4.09, Form 7.5 to indicate the determination on acceptance criteria for each area evaluated.
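Part of the physical-consistency screening in 6.11.3 lends itself to automation. A minimal sketch of the closely-coupled-parameter check in 6.11.3.3 (the sample traces are hypothetical, and the sign convention assumes the two parameters should trend in the same direction, as saturated steam generator pressure and RCS temperature do):

```python
def coupled_trend_conflicts(series_a, series_b, deadband_a=0.0, deadband_b=0.0):
    """Return sample indices where two closely coupled parameters move in
    opposite directions (6.11.3.3), ignoring changes inside a deadband."""
    conflicts = []
    for i in range(1, min(len(series_a), len(series_b))):
        da = series_a[i] - series_a[i - 1]
        db = series_b[i] - series_b[i - 1]
        if abs(da) > deadband_a and abs(db) > deadband_b and da * db < 0:
            conflicts.append(i)
    return conflicts

# Hypothetical traces: SG pressure (psig) falling while RCS temperature
# (degrees F) rises -- exactly the suspect behavior 6.11.3.3 describes.
sg_pressure = [900, 880, 860, 850]
rcs_temp = [557, 559, 561, 562]
suspect_samples = coupled_trend_conflicts(sg_pressure, rcs_temp)
```

A deadband sized from the allowable instrument error keeps normal indication noise from being flagged as a conflict.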


6.11.6 Ensure that the determination of acceptance criteria is indicated for the following checks performed during the simulator test run:
6.11.6.1 Automatic systems actuated as required.
6.11.6.2 Key alarms actuated as required.
6.11.6.3 Improper alarms and/or automatic systems actuations did not occur.
6.11.7 Enter a description for any area of unacceptable simulator response and list DR's submitted, in the appropriate location on NSEM-4.09, Form 7.5.
6.11.8 Attach comparison standard copies and simulator response curves for each parameter to the NSEM-4.09, Form 7.5 for the respective transient. Forward to the ASOT for review and inclusion in the Simulator Operability Test Report.
6.12 Resolving Unacceptable and Questionable Parameter Comparisons for Benchmark Transient Testing
6.12.1 Comparison standards and transient test response plots should be reevaluated to determine which represents the more accurate response for the parameters.
6.12.2 Comparison standards using actual reference plant data may be checked for accuracy and appropriate translation to graph form, but the base data shall be considered realistic.
6.12.3 Comparison standards using sources of data other than actual reference plant data may be suspect based on one or more of the following:
6.12.3.1 Conservative assumptions imposed on the analysis cause significant differences in initial mass inventories, equipment response times, or equipment response magnitude, e.g.: minimal steam generator inventory, delays in ECCS, single train ECCS response.

6.12.3.2 Similar plant data appears to be significantly affected by unknown operator response when subjected to best estimate evaluation.
6.12.3.3 Analysis data assumes equipment actuation, or lack of, which is inconsistent with simulator response.
6.12.4 Where input data for the comparison standard is evaluated as inaccurate, the standard shall be redeveloped using best estimate and the evaluation of simulator response data re-performed per 6.11.
6.12.5 Where possible, the simulator performance test shall be restructured to make simulator conditions, equipment response and timing match those assumed for the data used to develop the comparison standard. The benchmark transient shall then be re-run and evaluated per 6.11.
6.12.6 Where the simulator performance test cannot be changed to match the analysis conditions, or should not be changed due to simulator response being more realistic, the comparison standard shall be redeveloped. Best estimate should be applied to modify the source data and replot the parameter(s). Re-evaluate the simulator response plots per 6.11.
6.12.7 Where differences in the rate and/or magnitude of parameter response compare questionably (neither clearly acceptable nor clearly unacceptable), and cannot be resolved by the above, the following evaluation may be used:
6.12.7.1 Test run the transient for one or more experienced license holders.
6.12.7.2 The nature and severity of the transient should be unknown to the license holder(s).

6.12.7.3 The simulator response shall be considered acceptable if the license holder(s) can correctly identify the nature of the transient, specify reasonable operator response, and correctly determine the relative severity, if appropriate (large versus small LOCA).
6.12.7.4 Complete the appropriate section of NSEM-4.09, Form 7.5 to indicate that this evaluation method was used.
6.12.8 Submit a DR for any simulator response which cannot be resolved by the above. The DR retest shall specify re-performance of the transient test and acceptance per 6.11 for successful completion. Enter the DR number and unacceptable response on NSEM-4.09, Form 7.5.
6.13 Developing Acceptance Criteria For Transient Test Comparison Standards
6.13.1 Assigned instructors shall review comparison standards to identify areas where objective acceptance criteria can be specified for parameter response. The following should be identified:
6.13.1.1 Extreme values for parameter response at turning points.
6.13.1.2 Parameter values at transient endpoint or parameter stability.
6.13.1.3 Time to reach actuation points for automatic systems and key alarms.
6.13.1.4 Time to reach turning points or stability.
NOTE: Each comparison standard will have at least one acceptance criterion specified. Example: For a parameter which responds linearly in only one direction and doesn't stabilize, only the parameter value at transient endpoint can be specified.
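Turning points (defined in 4.9), and the extreme values and times that 6.13.1.1 and 6.13.1.4 call for, can be pulled from a recorded trace mechanically. A minimal sketch, assuming an evenly sampled parameter trace (the pressure values are hypothetical):

```python
def turning_points(values, dt=1.0):
    """Return (time, value) pairs where an evenly sampled trace changes
    direction (4.9), i.e. local maxima/minima. Flat segments are skipped."""
    points = []
    for i in range(1, len(values) - 1):
        prev_step = values[i] - values[i - 1]
        next_step = values[i + 1] - values[i]
        if prev_step * next_step < 0:  # direction reverses at sample i
            points.append((i * dt, values[i]))
    return points

# Hypothetical pressurizer pressure trace sampled every 0.5 seconds:
# falls to a minimum, recovers, then turns over again.
trace = [2235, 2200, 2150, 2180, 2210, 2190]
extremes = turning_points(trace, dt=0.5)  # times and extreme values (6.13.1.1/6.13.1.4)
```

The extracted times and extreme values would still be reviewed at the table-top per 6.13.2 before deviations are assigned.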

6.13.2 For each item identified in 6.13.1, use table-top discussion to determine the amount of positive and negative deviation which could adversely affect diagnosis, cause improper operator response, or grossly misrepresent the severity of the transient.
NOTE: The positive and negative values may coincide or differ, e.g.: ±50°F, or +50°F and -25°F.
6.13.3 Acceptance criteria identified in 6.13.2 should be indicated on the comparison standard for the respective parameter.
6.14 Reporting Initial Operability Test Results for Steady State Operation, Simulator Stability, and Benchmark Transients
6.14.1 Include the completed NSEM-4.09, Form 7.1 for each power level tested.
6.14.2 Include the completed data sheet, NSEM-4.09, Form 7.3 for each power level tested.
6.14.3 Include the computer data printouts for full power steady state stability testing and the completed cover sheet, NSEM-4.09, Form 7.4.
6.14.4 Include the completed NSEM-4.09, Form 7.2 for each transient tested.
6.14.5 Include the completed data sheet, NSEM-4.09, Form 7.5 for each transient tested, accompanied by the required comparison standards and simulator response plots.
6.14.6 Fill out test report cover sheet NSEM-4.09, Form 7.6 specifying the following:
6.14.6.1 Testing performed; steady state, simulator stability, and benchmark transients performance.

6.14.6.2 Areas of unacceptable response identified during testing.
6.14.6.3 DR's submitted to correct simulator response.
6.14.6.4 Acceptance criteria met for all areas tested.
6.14.7 Forward the completed test report to the ASOT for review and approval.
6.15 Designating Simulator Transient Test Results as Performance Benchmarks for Annual Operability Testing
6.15.1 Simulator transient test results which meet all the acceptance criteria for performance testing and are approved by the ASOT may then be designated as benchmarks.
6.15.2 The ASOT shall inform STSB that the computer records for the selected transients are to be designated as benchmarks.
6.15.3 STSB shall take required actions to designate the records, provide safe storage, and a means for ready retrieval for annual operability testing.
6.16 Developing Evaluation Criteria for Annual Operability Testing of the Selected Set of Benchmark Transients
6.16.1 Acceptable response for automatic systems actuations and key alarms specified on NSEM-4.09, Form 7.5 will be used as one measure of acceptable response.
6.16.2 Comparison standards acceptance criteria identified in 6.13 shall be used to determine evaluation criteria for the annual operability test. In determining criteria, consider the following:
6.16.2.1 Evaluation criteria for annual operability testing must not be less restrictive than acceptance criteria specified on the comparison standard.


6.16.2.2 Evaluation criteria should be selected for ease of use if a computer program is used for evaluating operability test results. STSB should be consulted in this regard.
NOTE: All parameters must have criteria specified to evaluate each acceptance criterion specified on the comparison standard. Additionally, evaluation criteria should be specified to ensure consistency in the "shape" of parameter response curves. Evaluation criteria selected may take the form of "±2% deviation from benchmark data for the parameter value at any given time", provided this is more restrictive than the criteria specified on the comparison standard.
6.16.3 Evaluation criteria determined per 6.16.1 and 6.16.2 shall be entered on an NSEM-4.09, Form 7.7 for each transient.
6.16.4 Completed NSEM-4.09, Form 7.7's shall be reviewed and approved by the ASOT.
6.16.5 Copies of approved NSEM-4.09, Form 7.7's shall be forwarded to STSB.
6.17 Developing the Simulator Specific Test Procedure for Annual Operability Testing
6.17.1 The annual operability test will be identified as Section 2 of Part B of the respective attachment to NSEM-4.09.
6.17.2 The procedure shall include the same instructions as required by 6.10.5 and 6.10.6.
6.17.3 Data from benchmark performance tests shall be used to specify endpoint conditions for each transient test.

Rev.: 0   Date: 5/4/88   Page: 23 of 26   NSEM-4.09

6.17.4 Completed annual operability test results shall be reviewed and approved by the ASOT.

6.17.5 Approved annual operability test results will be maintained with simulator certification records per NSEM-3.02.

6.18 Resolving Test Results Which Do Not Meet Acceptance Criteria

6.18.1 Appropriate tests should be re-run to ensure proper performance.

6.18.2 A DR shall be submitted for unacceptable response for steady state and/or stability testing.

6.18.3 A DR shall be submitted if automatic systems actuations or key alarms do not occur as required or if improper alarms or automatic systems actuations do occur.

6.18.4 Transient test response which fails to meet evaluation criteria based on the limits determined in 6.16 shall be evaluated using the comparison standards. Take appropriate action as follows:

6.18.4.1 Submit a DR if response does not meet acceptance criteria specified on the comparison standard.

6.18.4.2 If the response data meets all the acceptance criteria of the comparison standard, a new benchmark should be established.

NOTE: Minor changes to simulator modeling may cause responses to exceed the more restrictive evaluation criteria for annual operability testing while meeting all acceptance criteria for the comparison standard. Depending on the method chosen for programming the limits, the new response may more closely resemble the standard.

Rev.: 0   Date: 5/4/88   Page: 24 of 26   NSEM-4.09
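The disposition logic of 6.18.4 above, which escalates to the comparison standard only when the tighter annual-test criteria are exceeded, can be summarized as a small decision function. This is an illustrative sketch only; the function and return strings are invented for clarity and are not part of the procedure.

```python
def disposition_transient_result(meets_evaluation_criteria: bool,
                                 meets_comparison_standard: bool) -> str:
    """Disposition a transient test per the logic of 6.18.4 (illustrative).

    Evaluation criteria (annual test) are at least as restrictive as the
    comparison standard, so a run that passes them needs no further action.
    """
    if meets_evaluation_criteria:
        return "accept"                   # passes the tighter annual criteria
    if not meets_comparison_standard:
        return "submit DR"                # 6.18.4.1 - true model deficiency
    return "establish new benchmark"      # 6.18.4.2 - still within standard
```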


6.19 Simulator Design Changes

6.19.1 Identifying the need to establish new transient benchmarks or to perform non-scheduled operability testing in response to simulator design changes will be addressed in NSEM-5.02.

6.20 Disposition of Forms Generated

6.20.1 Forward completed originals of the following to the ASOT for review and approval:

6.20.1.1 NSEM-4.09, Form 7.6 Simulator Operability Test Report with attachments.

6.20.1.2 NSEM-4.09, Form 7.7 Operability Test Acceptance Criteria-Transients

6.20.2 The ASOT will forward approved originals of NSEM-4.09, Forms 7.1, 7.2, 7.3, 7.4, 7.5, 7.6, 7.7 and attachments to controlled document storage for retention with simulator certification records.

6.20.3 Partially completed copies of NSEM-4.09, Forms 7.3 and 7.4 should be maintained by the ASOT for recording test data of future operability testing.

6.20.4 Copies of comparison standards for all benchmark transients shall be maintained by the ASOT.

7.0 FORMS

7.1 Steady State Test Worksheet
7.2 Benchmark Transient Test Worksheet
7.3 Steady State Test Data Sheet
7.4 Stability Test Data Sheet
7.5 Benchmark Transient Test Data Sheet

Rev.: 0   Date: 5/4/88   Page: 25 of 26   NSEM-4.09

7.0 FORMS (Continued)

7.6 Simulator Operability Test Report (Steady State Operation, Stability, Benchmark Transients)
7.7 Operability Test Evaluation Criteria-Transients

8.0 ATTACHMENTS

8.1 Part A - Transient Benchmark Validation Test Procedure - MP1
    Part B - Operability Test Procedure - MP1

8.2 Part A - Transient Benchmark Validation Test Procedure - MP2
    Part B - Operability Test Procedure - MP2

8.3 Part A - Transient Benchmark Validation Test Procedure - MP3
    Part B - Operability Test Procedure - MP3

8.4 Part A - Transient Benchmark Validation Test Procedure - CY
    Part B - Operability Test Procedure - CY

Rev.: 0   Date: 5/4/88   Page: 26 of 26   NSEM-4.09

Form 7.1

STEADY STATE TEST WORKSHEET

Plant ________   Power Level ________   Reference Plant Data Source ________   Date/Time ________

PARAMETER              ACCEPTANCE CRITERIA
Name   ID#   Target Range   Instrument Error Tolerance   Response Source   C/N   Range

Rev.: 0   Date: 5/4/88   Page: 7.1-1 of 1   NSEM-4.09

Form 7.2

BENCHMARK TRANSIENT TEST WORKSHEET

PLANT ________   TRANSIENT ________

ANS-3.5 REFERENCE ________

PARAMETER NAME   ID#   DATA SOURCE

ENDPOINT CONDITIONS:

TRANSIENT DURATION:

Rev.: 0   Date: 5/4/88   Page: 7.2-1 of 2   NSEM-4.09


Form 7.2

BENCHMARK TRANSIENT TEST WORKSHEET

Auto Sys. Actuation(s)   Instrument #   Setpoint

Key Alarm(s)   Instrument #   Setpoint

Rev.: 0   Date: 5/4/88   Page: 7.2-2 of 2

Form 7.3

STEADY STATE TEST DATA SHEET

Simulator ________   Power Level ________   Time/Date ________
Initial Conditions/IC# ________
IC Alignment Changes ________

PARAMETER              ACCEPTANCE CRITERIA
Name   ID#   C/N   Instrument Range   Error (Multiple Channels)   Response Displayed Value   Acceptable Yes/No


Rev.: 0   Date: 5/4/88   Page: 7.3-1 of 1

Form 7.4

STABILITY TEST DATA SHEET

Simulator ________   Time/Date ________
Initial Conditions/IC# ________
Control System Alignments ________
Number of Runs Required ________
Data Recording Interval ________
Test Acceptance Criteria Met:   Y   N
Recorded Parameters:

Describe unacceptable response and list DR submitted (if required):

Rev.: 0   Date: 5/4/88   Page: 7.4-1 of 1   NSEM-4.09

Form 7.5

BENCHMARK TRANSIENT TEST DATA SHEET

Simulator ________   Transient ________
Initial Test/Re-run ________   Time/Date ________

Recorded Parameters (Name and ID#):

Initial Conditions:

Endpoint Conditions:

                                            Acceptable
Key Alarm             Setpoint              Y   N   (circle one)

                                            Acceptable
Auto Sys Actuation    Setpoint              Y   N   (circle one)

Rev.: 0   Date: 5/4/88   Page: 7.5-1 of 2   NSEM-4.09

Form 7.5

BENCHMARK TRANSIENT TEST DATA SHEET

                                            Acceptable
Improper Alarms/Auto Sys Actuation          Y   N   (circle one)
Describe:

                                            Acceptable
Physical Laws of Nature                     Y   N   (circle one)
Describe any unacceptable response:

                                            Acceptable
Parameter Plot Comparison                   Y   N   (circle one)
Specify ID# and area of response for any unacceptable parameter:

Specify ID# and area of response for any parameter(s) resolved by independent observation by license holder(s):

DR's Submitted - Specify number and problem area:


Rev.: 0   Date: 5/4/88   Page: 7.5-2 of 2   NSEM-4.09

Form 7.6

SIMULATOR OPERABILITY TEST REPORT

Simulator ________   Date Submitted ________

Reason for testing (initial/significant design change - describe):

Test report submitted by: ________________   Instructor

Test report approved: ________________   ASOT

TESTING PERFORMED                    ACCEPTANCE CRITERIA MET


Specify type; include power level of          Describe unacceptable
steady state tests and description            response on page 2:   Y   N
for transients:

Rev.: 0   Date: 5/4/88   Page: 7.6-1 of 2   NSEM-4.09

Form 7.6

SIMULATOR OPERABILITY TEST REPORT

UNACCEPTABLE RESPONSE

TEST TYPE   POWER LEVEL/TRANSIENT   DESCRIBE RESPONSE   DR#
1.

Rev.: 0   Date: 5/4/88   Page: 7.6-2 of 2   NSEM-4.09

Form 7.7

OPERABILITY TEST EVALUATION CRITERIA-TRANSIENTS

Transient

Simulator ________

Approved ________________   ASOT   Date ________

Evaluation Criteria

Key Alarm   Parameter   Setpoint   Time   Value   Time

Auto Sys Act.   Parameter   Setpoint   Time

TURNING POINTS
Parameter   Extreme Max/Min   Time

VALUE AT ENDPOINT/STABILITY
Parameter   Value   Time

Rev.: 0   Date: 5/4/88   Page: 7.7-1 of 2   NSEM-4.09

Form 7.7

OPERABILITY TEST EVALUATION CRITERIA-TRANSIENTS

PARAMETER RESPONSE CURVE "SHAPE"

List all alarms, auto sys actuations, turning points, and stability/endpoint values whose evaluation criteria are less restrictive than the "shape" criteria (thereby acceptable if the "shape" criteria are satisfied):

Parameter   Deviation

Rev.: 0   Date: 5/4/88   Page: 7.7-2 of 2   NSEM-4.09

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.10

NORMAL OPERATIONS VERIFICATION

Responsible Individual: ________________
Manager, Operator Training Branch

Approved: ________________
Director, Nuclear Training

Revision: 0
Date: 6/29/88
SCCC Meeting No: 88-008


1.0 PURPOSE

The purpose of this procedure is to provide guidance for writing a unit specific Normal Operations Test for each of the four NU simulators. The unit specific Normal Operations Test will verify that the simulator is capable of simulating continuously, in real time, normal operations of the reference plant.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions to support the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5-1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG 1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline for Simulator Training, October 1986.

3.7 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July 1987.

Rev.: 0   Date: 6/29/88   Page: 1 of 9   NSEM-4.10

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified simulator deficiencies between the simulator and reference plant.


4.2 Normal Plant Evolutions - Evolutions that the simulator shall be capable of performing, in real time, that simulate routine reference plant evolutions.

4.3 Plant Startup - The starting conditions shall be cold shutdown temperature and pressure to hot standby temperature and pressure. The Reactor Vessel Head need not be removed for cold shutdown.

4.4 Nuclear Start-Up - From all CEA's fully inserted to going critical at hot standby conditions.

4.5 Turbine Generator Start-Up - Turbine Generator at zero RPM, to rated speed and synchronization to grid.

4.6 Reactor Trip and Recovery - Reactor trip followed by recovery to rated power.

4.7 Hot Standby Operations - Maintaining stable plant conditions at hot standby.

4.8 Load Changes - Increasing and decreasing plant load.

4.9 Plant Operation Less Than Full Reactor Coolant Flow - Startup, shutdown and power operations with less than full reactor coolant flow.

4.10 Plant Shutdown - Shutdown from rated power to hot standby, then cooldown to cold shutdown conditions.

4.11 Core Performance Testing - Plant heat balance, determination of shutdown margin, measurement of reactivity coefficients and control rod worth using permanently installed instrumentation.

4.12 Surveillance Testing - Operation conducted surveillance testing on safety related equipment or systems.

Rev.: 0   Date: 6/29/88   Page: 2 of 9   NSEM-4.10

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning Operator Instructors to write, perform, and document normal operations capability tests.

5.1.2 Responsible for assigning Operator Instructors to perform retests of discrepancies identified during the normal operations capability tests.

5.1.3 Responsible for reviewing and approving unit specific normal operations tests prior to their performance.

5.1.4 Responsible for review and acceptance of each completed normal operations test.

5.1.5 Responsible for scheduling the accomplishment of all normal operation tests on a continuous four year basis.

5.1.6 Responsible for reviewing and approving Figure 7.1 (Normal Plant Evolutions List) and Figure 7.3 (Surveillance Testing).

5.2 Operator Instructor

5.2.1 Responsible for writing, conducting and verifying unit specific normal operations capability tests.

5.2.2 Responsible for writing Deficiency Reports and retests for steps which do not respond as expected during the performance of normal operations tests.

5.2.3 Responsible for documentation of completion of each step in the normal operations tests.

5.2.4 Responsible for determining which normal plant evolutions can be performed and documenting any reasons why there is a conflict on Figure 7.1.

Rev.: 0   Date: 6/29/88   Page: 3 of 9   NSEM-4.10

5.2.5 Responsible for establishing the expected values that should be observed during the performance of the test, e.g., heatup rate under certain pump/flow conditions, and for evaluating the simulator response for acceptability.

5.2.6 Responsible for developing a list of safety equipment surveillance procedures that are or will be used for training on that unit (Figure 7.3).

6.0 INSTRUCTIONS

6.1 Establish List of Normal Plant Evolutions

6.1.1 Figure 7.1 lists the 10 Normal Plant Evolutions required by ANS 3.5. Review Figure 7.1 and determine which of these 10 evolutions will be performed by this procedure. Check "yes" or "no" on Figure 7.1 to document this decision.

6.1.2 If an evolution on Figure 7.1 will not be tested, list the reason why. Valid reasons for not testing an evolution could be:

o Technical Specification Limitations. Example: MP2 Technical Specifications prohibit critical operations with less than full reactor coolant flow.

o Plant procedural limitations.

o Evolution covered by another certification test. The Reactor Core System Test may be used as a substitute for #9 "Core Performance Testing", at the ASOT's discretion. It is expected that "Core Performance Testing" is the only test to fall in this category.

6.1.3 For each evolution that will be covered under this procedure, list on Figure 7.1 those plant operating procedures that will be used to fulfill each required plant evolution.

Rev.: 0   Date: 6/29/88   Page: 4 of 9   NSEM-4.10

Note: It is the intent of this procedure to use plant operating procedures as a basis for normal plant startup, shutdown and surveillance to verify the simulator's ability to perform normal plant evolutions in real time.

6.1.4 Section 6.2 will describe the writing of a unit specific test procedure to test all required evolutions listed on Figure 7.1. At the completion of writing the unit specific test procedure, record on Figure 7.1, for each required evolution, the unit specific test step numbers which perform the test on that evolution. This will show specifically which portions of the unit specific test procedure correspond to the required evolutions on Figure 7.1.

6.2 Writing of Normal Plant Operations Tests

6.2.1 Each unit shall write a normal plant operations test to encompass all those required evolutions listed in Figure 7.1 which have been checked "yes".

6.2.2 The unit specific test shall be written as:
  . Attachment 8.1 for MP1
  . Attachment 8.2 for MP2
  . Attachment 8.3 for MP3
  . Attachment 8.4 for CY

6.2.3 A Figure 7.2 test cover sheet shall be the cover sheet for each unit specific test.

6.2.4 The unit specific test, as per NSEM-1.01, will be under the control of the unit ASOT as far as revision level and date of procedure.

6.2.5 A unit specific test shall have as major sub-sections each required plant evolution listed on Figure 7.1 (example: Plant Heatup, Reactor Startup, etc.).

6.2.6 Each major sub-section of the test (i.e., Plant Heatup) shall use plant operating procedures as the basic method of providing direction to execute the test on the simulator.


Rev.: 0   Date: 6/29/88   Page: 5 of 9   NSEM-4.10

Note: While it is the intent to use plant operating procedures as the basis for performing normal plant evolutions, other means will also be used to verify simulator performance. For example, a plant procedure for heatup may specify starting 2 RCP's. However, the simulator RCS heatup rate needs to be verified as being consistent with the reference plant's 2 RCP heatup rate. Therefore, an approach of using operating procedures in conjunction with verification of actual simulator response is needed.

6.2.7 The order of major subsections in the test shall be logically arranged to verify that the simulator can, in real time, be continuously operated over the range from cold shutdown to 100% power. For example, a logical sequence could be:

           . 1.0   Plant Heatup
           . 2.0   Nuclear Startup
           . 3.0   Plant Sta.rtup
           . 4.0   Load change to 100% power
           . 5.0   Reactor Trip & Recovery to rated power
           . 6.0   Plant Shutdown

. 7.0   Reactor Shutdown
. 8.0   Plant Cooldown

6.2.8 Each major subsection (i.e., Plant Heatup) should contain the following information:

o Initialization requirements to start the test.

o Which plant procedures will be used and a requirement to record which revision level/# of changes of the procedures to be used.

o Which surveillance procedures are required to be completed during this subsection and a statement that completed surveillance forms shall be attached.

o Which plant operating procedure forms are required to be attached.

Rev.: 0   Date: 6/29/88   Page: 6 of 9   NSEM-4.10

o A listing of any simulator responses that need acceptance criteria to be verified during performance of the operating procedure (i.e., when 2 RCP's are running, the RCS heatup rate shall be 20-25 F/hr).

o The completion criteria for the major sub-section, i.e., "Plant Heatup is complete when RCS temperature is at 532 F and RCS pressure is at 2250 psia with all surveillance complete".

6.2.9 The unit specific test procedure shall list all procedure forms or surveillance forms that are to be completed and attached.

6.2.10 All safety related Technical Specification surveillance that is to be available for use in training on the simulator shall be listed on Figure 7.3.

6.2.11 All surveillance listed on Figure 7.3 shall be tested per this procedure. The only exception shall be multiple facility surveillance. For example, if the Service Water Pump operability test - Facility I surveillance will be done, the Service Water Pump operability test - Facility II surveillance can be skipped. If this exception is used, any problems found on one facility shall be checked for a common problem on the untested facility.

6.2.12 Surveillance shall be performed at the required times per plant operating procedures, i.e., needed for the next mode change.

6.2.13 Valve lineup forms referenced by operating procedures need not be attached to the unit specific test; however, they should be reviewed by the instructor performing the test to ensure valves are properly positioned.

6.2.14 Fast time should be avoided where possible. It is the intent of the Normal Operations Test to replicate actual plant evolutions.

6.3 Performance of the Unit Specific Test

Rev.: 0   Date: 6/29/88   Page: 7 of 9   NSEM-4.10

6.3.1 The unit ASOT shall sign the cover sheet to release the unit specific Normal Operations Test for performance.

6.3.2 Assigned instructors shall perform the test in the sequence specified by the test procedures.

6.3.3 Assigned instructors shall fill in any information required by the test and attach required completed forms.

6.3.4 Any deficiencies noted during the performance of the Normal Operations Test shall be brought to the attention of the ASOT to determine if a DR should be written.

6.3.5 The completed procedure and associated paperwork shall be verified by another instructor and his signature shall be recorded on the "Normal Operations Test Cover Sheet". The instructors signing the test cover sheet for "Performed By" and "Verified By" may both have been associated with the test performance, but the signatures for "Verified By" and "Performed By" shall be the signatures of 2 different instructors.

6.3.6 The completed package shall then be forwarded to the ASOT for final acceptance.

6.3.7 If the test is being performed during the 4 year performance cycle, the cover sheet shall contain a listing in the comments section as to what section of the Normal Operations Test is being performed.

7.0 FIGURES

7.1 Normal Plant Evolutions List
7.2 Normal Operations Test Cover Sheet
7.3 Surveillance Listing

8.0 ATTACHMENTS

8.1 Millstone Unit 1 Normal Operations Test Procedure

Rev.: 0   Date: 6/29/88   Page: 8 of 9   NSEM-4.10

8.2 Millstone Unit 2 Normal Operations Test Procedure
8.3 Millstone Unit 3 Normal Operations Test Procedure
8.4 Connecticut Yankee Normal Operations Test Procedure

Rev.: 0   Date: 6/29/88   Page: 9 of 9   NSEM-4.10

Figure 7.1


NORMAL PLANT EVOLUTIONS LIST
UNIT ________

Instructor ________________   Date: ________
ASOT ________________   Date: ________

1. PLANT STARTUP

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

2. NUCLEAR STARTUP

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

3. TURBINE STARTUP AND GENERATOR SYNCHRONIZATION

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

Rev.: 0   Date: 6/29/88   Page: 7.1-1 of 4   NSEM-4.10

Figure 7.1

NORMAL PLANT EVOLUTIONS LIST
UNIT ________
4. REACTOR TRIP AND RECOVERY

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

5. HOT STANDBY OPERATION

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

6. LOAD CHANGES

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

Rev.: 0   Date: 6/29/88   Page: 7.1-2 of 4   NSEM-4.10


Figure 7.1

NORMAL PLANT EVOLUTIONS LIST
UNIT ________

7. PLANT OPERATIONS WITH LESS THAN FULL REACTOR COOLANT FLOW

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

8. PLANT SHUTDOWN

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

9. CORE PERFORMANCE TESTING

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

Rev.: 0   Date: 6/29/88   Page: 7.1-3 of 4   NSEM-4.10

Figure 7.1

NORMAL PLANT EVOLUTIONS LIST
UNIT ________

10. SURVEILLANCE TESTING

o Will be tested by this procedure:   YES [ ]   NO [ ]

o If no, state reason:

o Operating Procedure(s) to be used:

o Unit Specific Test Step numbers which will complete this requirement:

Rev.: 0   Date: 6/29/88   Page: 7.1-4 of 4   NSEM-4.10

Figure 7.2

NORMAL OPERATIONS TEST COVER SHEET

UNIT ________   ATTACHMENT NUMBER ________

Released for Performance By: ________________   ASOT   DATE ________
Performed By: ________________   DATE ________
Verified By: ________________   DATE ________
Accepted By: ________________   ASOT   DATE ________

1. List of Operating Procedure Forms Attached
2. Comments Attached Yes [ ] No [ ]

Rev.: 0   Date: 6/29/88   Page: 7.2-1 of 1   NSEM-4.10

Figure 7.3
Page ____ of ____

SURVEILLANCE LISTING
UNIT ________

Sequential Number   Title   Procedure #   To Be Tested (Yes/No)

Approved: ________________   ASOT

Rev.: 0   Date: 6/29/88   Page: 7.3-1 of 1   NSEM-4.10

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.11

INSTRUCTOR STATION

Responsible Individual: ________________
Manager, Operator Training Branch

Approved: ________________
Director, Nuclear Training

Revision: 0
Date: 8/17/88
SCCC Meeting No: 88-010

1.0 PURPOSE

The purpose of this procedure is to test those features of the Simulator Instructor Station which are important in providing simulator training and may affect an operator's actions during training.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5-1985 with some additional requirements.


3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 NUREG 1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.5 INPO 86-026, Guidelines for Simulator Training, October 1986.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified simulator deficiencies between the simulator and reference plant.

4.2 Input/Output (I/O) - any digital or analog computer inputs/outputs.

Rev.: 0   Date: 8/17/88   Page: 1 of 13   NSEM-4.11


4.3 Snapshot - the recording of the present status of all simulator digital/analog I/O's. After this snapshot is taken, the simulator may be initialized to this condition at some later time.

4.4 Backtrack - the ability to move the simulator back in time to conditions which had previously existed. This is accomplished by the automatic storage (at one minute intervals) of the simulator's I/O's over the past hour.

4.5 Freeze - the stopping of all simulator dynamic modeling. When the simulator is taken out of freeze, the model will continue to run from the time that it was placed in freeze.

4.6 Slow Time - in reality, this is the expansion of real time, which produces the appearance that a transient is occurring at a slower speed. The slow time which can be selected can vary from 5% to 95% of real time (at 5% increments).

4.7 Fast Time - the increase in the speed at which certain parameters (such as Xenon, condenser air evacuation, RCS heatup, RCS cooldown, turbine metal heatup, turbine metal cooldown, and decay heat) are modeled to change.

4.8 Boolean Trigger - an algebraic expression which is used to automatically activate a malfunction when its value becomes true.

4.9 Composite Malfunction - a combination of up to 10 predefined simple malfunctions which can be arranged in a logical sequence. Once built, this composite malfunction is stored and can be used at any time.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for assigning Instructors to write, perform and document instructor station tests.

5.1.2 Responsible for assigning Instructors to perform retests of discrepancies identified during the instructor station test.

5.1.3 Responsible for review and acceptance of the completed instructor station test.

Rev.: 0   Date: 8/17/88   Page: 2 of 13   NSEM-4.11
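Definition 4.4 describes backtrack as automatic storage of the simulator's I/O state at one-minute intervals, retaining only the past hour. One minimal way to picture that mechanism is a fixed-size ring buffer of snapshots. The sketch below is the editor's illustration only; it assumes nothing about the actual simulator software, and all names in it are invented.

```python
from collections import deque


class BacktrackBuffer:
    """Illustrative model of the backtrack store described in definition 4.4:
    one snapshot per minute, retaining only the most recent hour (60 entries)."""

    def __init__(self, capacity_minutes=60):
        # deque with maxlen silently discards the oldest snapshot when full
        self._snapshots = deque(maxlen=capacity_minutes)

    def record(self, minute, io_state):
        """Store a copy of the full I/O state taken at the given minute."""
        self._snapshots.append((minute, dict(io_state)))

    def restore(self, minute):
        """Return the stored I/O state for a minute, or None if it aged out."""
        for stamp, state in self._snapshots:
            if stamp == minute:
                return state
        return None


# After 90 one-minute recordings, only minutes 31-90 remain restorable.
buf = BacktrackBuffer()
for m in range(1, 91):
    buf.record(m, {"rx_power_pct": 100.0})
```

The fixed capacity is what gives backtrack its one-hour reach: older states are not merely hidden, they are overwritten.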


5.1.4 Responsible for scheduling the accomplishment of all instructor station tests on a continuous four year basis.

5.2 Operator Instructors

5.2.1 Responsible for conducting instructor station tests required for simulator certification.

5.2.2 Responsible for writing Deficiency Reports (DR) and retests for instructor station capabilities that do not respond as required during the tests.

5.2.3 Responsible for completing all documentation of instructor station tests.

5.3 STSB 5.3.1 Responsible for resolving any DR's generated by this procedure. 5.3.2 Responsible for providing support to OTB when requested. 6.0 INSTRUCTIONS ,

                             }

General Discussion

This procedure tests those features of the simulator instructor station which could impact an operator training session. If additional instructor station testing is judged necessary by an individual unit ASOT, he has the option (per NSEM-4.01/6.1.2.5) to write a system test for the instructor station. The testing performed by this procedure ensures that all four of NU's simulators meet the basic instructor station requirements necessary to support operator training.

6.1 Backtrack

6.1.1 The backtrack capability of the simulator will be verified by initiating a reactor trip and recording data at specific times. At the completion of the transient, the simulator will be backtracked to these same times and a second set of data recorded. These data points will be compared to verify proper backtrack capability.

6.1.2 Use Figure 7.1 (Simulator Backtrack Verification) to document the testing of the backtrack capability of the simulator.

6.1.3 Identify the Unit, IC number and Date on Figure 7.1.

6.1.4 Fill in the instrument number on Figure 7.1 for one analog point which will be used to evaluate the backtrack capability of the simulator. The point chosen should be one that changes value rapidly during the upcoming reactor trip. A level instrument (0-100% level) is recommended.

Note: A level instrument is recommended for consistency with the acceptance criteria described later. Level instruments have a consistent span of 0-100%; temperature, pressure or flow instruments have highly variable spans, making acceptance criteria difficult to apply. For example, a 1% error on a 600°F temperature instrument is 6°F, while a 1% error on a 300°F instrument is 3°F. Use of a 1% error criterion on a level instrument always implies a 1% level error.

6.1.5 Reset to a full power IC and keep the simulator in freeze. For the instrument chosen to be recorded, verify the instrument value on the instructor station is consistent with the control board value. When directed to do so in the following steps, record on Figure 7.1 the reading from the instructor station to as many decimal places as the instructor station gives.

6.1.6 Place the simulator in run to stabilize and, at exercise time 1 minute, trip the reactor.

6.1.7 Starting at exactly 1 minute exercise time and at exactly 1-minute intervals, manually record the value of the selected analog point on Figure 7.1 under the column "Real Time Values".

6.1.8 Continue recording the data until 7 minutes exercise time has elapsed, then freeze the simulator.

6.1.9 Backtrack the simulator and record the exercise time and analog value of the selected point on Figure 7.1 for each backtrack file that exists during the 7-minute interval. The backtrack files are at 1-minute intervals. Compare backtrack and real time values against each other.
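The ±1% level comparison applied to these recorded points (the Section 6.1.10 acceptance criterion) can be sketched as follows. This is an illustrative sketch only; the function name and the sample readings are hypothetical, not the simulator's actual software or data:

```python
# Hypothetical sketch of the Section 6.1.10 acceptance check: each backtrack
# level reading (in % of span) must agree with the corresponding real-time
# reading to within an absolute difference of 1%.
def backtrack_acceptable(real_time_levels, backtrack_levels, tolerance=1.0):
    """True if every |real-time - backtrack| difference is below tolerance."""
    return all(abs(rt - bt) < tolerance
               for rt, bt in zip(real_time_levels, backtrack_levels))

# Seven one-minute samples of a 0-100% level instrument (made-up values):
real_time = [55.2, 48.7, 41.3, 36.9, 33.4, 31.0, 29.5]
backtrack = [55.1, 48.5, 41.6, 36.8, 33.5, 31.2, 29.4]
print(backtrack_acceptable(real_time, backtrack))  # every delta < 1% -> True
```

If any pair of readings differed by 1% or more, the check would fail and, per the procedure, a DR would be submitted.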

6.1.10 Agreement should be within ±1% of the absolute value between the analog point taken in real time vs. that in backtrack time. The 1% criterion is applied by subtracting the backtrack value of level (in %) from the real time value of level (in %). The absolute value of this difference shall be less than 1%.

6.1.11 If all data points are within 1%, the instructor will sign Figure 7.1 and submit it to the ASOT for review. Otherwise a DR shall be submitted, referencing this certification test/step number.

6.1.12 Whenever this test is performed, the same level instrument (selected in 6.1.4) shall be used for this test, unless extenuating circumstances exist, as determined by the ASOT.

6.2 Freeze

6.2.1 This capability will be tested by placing the simulator in freeze and observing both the instructor station and the simulator to verify that the computer model is not active.

6.2.2 Use Figure 7.5 (Simulator Freeze Verification) to document the accomplishment of the freeze capability of the simulator.

6.2.3 Place the simulator in freeze and observe the indications listed on Figure 7.5, Step 1.

6.2.4 If the conditions of Step 1 are satisfied, sign Figure 7.5 and submit to the ASOT.

6.2.5 If the conditions are not satisfied, a DR will be written to have this discrepancy corrected.

6.3 Snapshot

6.3.1 Reset to any IC, run for 1 minute, then freeze.

6.3.2 Record on Figure 7.6 the following information:

6.3.2.1 For any four control board meters, record the instrument ID, control board value, corresponding PPC ID and PPC value.
6.3.2.2 For any four simulated annunciators, record the annunciator panel location and condition (on/off).

6.3.2.3 For any two analog remote functions, record the ID and value. For any two digital remote functions, record the ID and value.

6.3.2.4 For any four control board switches, record the ID and condition (open, closed, etc.).

6.3.3 Take a snapshot of the current conditions and store it in a temporary IC.

6.3.4 By whatever means practical, place all analog and digital points selected above to a substantially different value or state so that, when resetting to the snapshot of Step 6.3.3, any problems will be more easily detected.

6.3.5 Reset to the snapshot taken in Step 6.3.3 and leave in freeze.

6.3.6 Compare all parameters on Figure 7.6 to the snapshot conditions and verify they are the same.

6.3.7 If conditions of all points on Figure 7.6 are the same, sign off Figure 7.6 and submit it to the ASOT.

6.3.8 If conditions of all points on Figure 7.6 are not the same, submit a DR.

6.3.9 Whenever this test is performed, the same meters, annunciators, remote functions and switches selected in Step 6.3.2 shall be used for this test, unless extenuating circumstances exist, as determined by the ASOT.

6.4 Fast Time

6.4.1 All fast time parameters modeled (i.e., heatup rate, xenon concentration, etc.) will be checked.

6.4.2 Complete a Figure 7.2 for each fast time parameter verified.

6.4.3 Record on Figure 7.2 the parameter to be evaluated (i.e., RCS heatup rate) and the IC used.

6.4.4 Identify the analog data point that will be tracked to quantify the fast time rate (for example, for fast time RCS heatup rate, TE115). Preferably the analog data point selected should be from the control boards; however, in some cases the instructor station or Hazeltine terminal may need to be used. Ensure that the location (Control Board, Instructor Station or Hazeltine) and analog point ID are identified on Figure 7.2 under the section "Analog Data Point to be Used for Evaluation".

6.4.5 Reset to the selected IC. Record on Figure 7.2 any simulator control board manipulations that are necessary. This information shall be recorded under the heading that reads "Brief Description of Simulator Test Steps". Any manipulations so indicated shall be written and performed to maximize repeatability of results. After any simulator control board manipulations are performed, run for one minute to stabilize and then freeze the simulator.

6.4.6 Take a temporary snapshot and record the value of the selected analog point on Figure 7.2.

6.4.7 Return to run, allow the simulator to run for exactly 5 minutes, then freeze the simulator and record the final value of the selected analog point on Figure 7.2.

6.4.8 Reset to the snapshot taken in 6.4.6 and record the initial value of the analog point and the fast time rate that will be selected on Figure 7.2. It is recommended that the highest fast time value available be used.

6.4.9 Go to fast time and return to run.

6.4.10 Allow the simulator to run for exactly 5 minutes, then freeze.

6.4.11 Record the final value of the selected analog point on Figure 7.2.

6.4.12 Compare the difference between normal time and fast time.
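The comparison made in Step 6.4.12 amounts to checking that the parameter moved farther during the 5-minute fast time run than during the identical normal time run. A minimal sketch, using hypothetical temperature values (not actual plant data):

```python
# Hypothetical sketch of the fast time pass criterion described in this
# section: fast time is verified for a parameter when its change over a
# 5-minute run is larger in fast time than over the same run in normal time.
def fast_time_verified(normal_initial, normal_final, fast_initial, fast_final):
    """True if |fast-time difference| exceeds |normal-time difference|."""
    return abs(fast_final - fast_initial) > abs(normal_final - normal_initial)

# Made-up RCS heatup example: 4 degrees in normal time vs. 20 in fast time.
print(fast_time_verified(350.0, 354.0, 350.0, 370.0))  # 20 > 4 -> True
```

Note that, consistent with the procedure's own caution, this is a directional check only; no numerical acceptance band is applied to fast time.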

Note: Numerical acceptance criteria are not appropriate for fast time. If the parameter can be sped up using fast time, that is acceptable, since a student does not know the correlation of the fast time rate at the Instructor Station to the control board response. This test should be considered documentation of simulator response to a selected fast time rate (preferably the maximum Instructor Station fast time rate).

6.4.13 If the difference is larger for fast time than normal time, fast time is verified for that parameter. If fast time is verified, the instructor shall sign Figure 7.2 and forward it to the ASOT for his review/signature. If fast time is not verified, then submit a DR or reconsider this parameter for having a fast time option on the Instructor Station.

6.4.14 Repeat Steps 6.4.2 through 6.4.13 for each parameter that has a fast time capability.

6.4.15 Whenever this test is repeated, the same fast time rate, analog evaluation point and simulator test sequence shall be followed to maximize reproducibility of results.

6.5 Slow Time

6.5.1 The slow time capability of the simulator will be tested by comparing two reactor trips starting from the same conditions. One trip will be in normal time for 1 minute, the other in slow time (1/10) for 10 minutes. Values of 3 selected analog points will then be compared.

6.5.2 Data will be recorded on Figure 7.3.

6.5.3 Reset to a full power IC and record the IC number and date on Figure 7.3.

6.5.4 Allow the simulator to run for a minute to stabilize.

6.5.5 Freeze the simulator; record on Figure 7.3 the ID and values of 3 selected analog points which will change rapidly during a reactor trip. Level instruments are recommended to be consistent with the acceptance criteria described later. Also record the simulator exercise time. Ensure the simulator stays in freeze.
6.5.6 Take a temporary snapshot and maintain the simulator in freeze.

6.5.7 Trip the reactor as soon as the simulator comes out of freeze; take no actions. Allow this transient to continue for exactly 1 additional minute of exercise time and then freeze the simulator.

6.5.8 Record the values of the 3 points and the exercise time on Figure 7.3.

6.5.9 Reset the simulator to the snapshot taken in 6.5.6 and record the 3 parameters and exercise time on Figure 7.3. Keep the simulator in freeze.

6.5.10 Select slow time (1/10 normal time) and trip the reactor as soon as the simulator is taken to run.

6.5.11 Allow the simulator to run for exactly 10 minutes real time (exactly 1 minute exercise time) and then freeze.

6.5.12 Record the data for the 3 parameters and exercise time on Figure 7.3.
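The timing in Steps 6.5.10 and 6.5.11 follows directly from the slow-time factor: at 1/10 normal time, ten minutes of real time advance the exercise clock by one minute. A worked sketch of that relationship (illustrative only; the function name is an assumption, not instructor station software):

```python
# Hypothetical sketch of the slow-time relationship used in Section 6.5:
# at a slow-time factor f, each minute of real time advances the simulator
# exercise clock by f minutes.
def exercise_minutes(real_minutes, slow_time_factor):
    """Exercise time elapsed over a real-time span at a given slow-time factor."""
    return real_minutes * slow_time_factor

# Step 6.5.11: 10 real minutes at 1/10 slow time = 1 exercise minute.
print(exercise_minutes(10, 1 / 10))  # -> 1.0
```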

6.5.13 If the values of the 3 points taken in Steps 6.5.8 and 6.5.12 are within 1%, then slow time accuracy is verified. The 1% accuracy requirement is applied to the difference (absolute value) between real time level values (in %) and slow time level values (in %). The instructor will sign Figure 7.3 and forward it to the ASOT for his approval.

6.5.14 If the 1% accuracy acceptance criterion is not met, then a DR shall be submitted.

6.5.15 Whenever this test is repeated, the same analog points shall be used to maximize reproducibility of results.

6.6 Boolean Trigger

6.6.1 This capability will be tested by building a Boolean and verifying that a malfunction will be activated when the Boolean condition exists.

6.6.2 Figure 7.4 will be used to document the Boolean Trigger test.

6.6.3 Build a Boolean Trigger and describe it on Figure 7.4.

6.6.4 Enter a malfunction to be actuated by this Boolean Trigger.

6.6.5 Establish simulator conditions which will trigger the Boolean and activate the malfunction.
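The mechanism being exercised here can be sketched as follows. This is an illustrative sketch only; the state variable, trigger expression and malfunction ID are hypothetical and do not represent the actual instructor station software:

```python
# Hypothetical sketch of a Boolean trigger (Section 6.6): an expression is
# evaluated against current simulator conditions, and the attached
# malfunction is activated when the expression becomes true.
def scan_trigger(state, expression, activate):
    """Evaluate the trigger; fire the malfunction callback if it is true."""
    if expression(state):
        activate()
        return True
    return False

fired = []
# Made-up trigger: activate malfunction "MAL-01" when pressurizer
# pressure drops below 2000 psia.
trigger = lambda s: s["pzr_pressure"] < 2000.0
scan_trigger({"pzr_pressure": 2150.0}, trigger, lambda: fired.append("MAL-01"))
scan_trigger({"pzr_pressure": 1980.0}, trigger, lambda: fired.append("MAL-01"))
print(fired)  # the malfunction fires only after the condition becomes true
```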

6.6.6 If the Boolean Trigger activates the malfunction when the Boolean becomes true, the test is successful and the instructor will sign/date Figure 7.4.

6.6.7 If the Boolean Trigger does not properly activate the malfunction, a DR will be submitted.

6.7 Composite Malfunction

6.7.1 Figure 7.4 will be used to document the Composite Malfunction test.

6.7.2 Build a Composite Malfunction and describe it on Figure 7.4. The composite shall test the maximum number of malfunctions available by inserting them at various pre-programmed exercise times.

6.7.3 Enter the Composite Malfunction.

6.7.4 If all the individual malfunctions actuate at the correct exercise times, the test is successful, and the instructor should sign/date Figure 7.4.

6.8 Variable Parameter Control

6.8.1 Select an analog point to be used to demonstrate the Variable Parameter Control function of the Instructor Station. An analog point should be selected which, when changed using the Variable Parameter Control of the Instructor Station, will have a discernible effect on plant operation (i.e., RCS boron concentration at power).

6.8.2 Figure 7.4 will be used to document the test of the Variable Parameter Control function.

6.8.3 Reset to a 100% power initial condition.

6.8.4 Set up the selected analog point on the Variable Parameter pushbutton and record on Figure 7.4 the parameter used and the increment used.

6.8.5 Verify that when the "increase" button is pushed, the selected analog value increments properly and a discernible effect can be seen at the control panels.

6.8.6 Verify that when the "decrease" button is pushed, the selected analog value decrements properly and a discernible effect can be seen at the control panels.

6.8.7 If the Variable Parameter Control function operates in accordance with Steps 6.8.5 and 6.8.6, sign and date Figure 7.4.

6.8.8 If the Variable Parameter Control function does not meet the above acceptance criteria, submit a DR.

6.9 I/O Override

Note: The I/O override function provides the ability to override any Analog Input (AI) or Output (AO) or Digital Input (DI) or Output (DO), turn any annunciator off or on, or use the annunciator "crywolf" feature. The number of AIs, AOs, DIs, DOs and annunciators is very large. It is the purpose of this test to verify the capability of these I/O override features by testing a single example of each. Any specific I/O override feature used in a specific training session would be verified prior to that training session.

6.9.1 Reset to any full power IC and record the date of this test on Figure 7.7.

6.9.2 Select an annunciator which is off and record its position/name on Figure 7.7.

6.9.3 Using I/O Override, turn the annunciator on and verify that it flashes initially and, when acknowledged, goes solid and stays in. Record the successful completion on Figure 7.7.

6.9.4 Select an annunciator which is on and record its position/name on Figure 7.7.

6.9.5 Using I/O Override, turn the annunciator off and verify that it flashes initially and, when acknowledged/reset, goes off. Record the successful completion (yes or no) on Figure 7.7.

6.9.6 Test the Crywolf I/O Override feature by selecting an annunciator initially off, and record the annunciator selected on Figure 7.7, Section 2(A).

6.9.7 Use the I/O Override feature for Crywolf and observe that the selected annunciator is initially off, flashes on and, when acknowledged, clears. Record the successful completion (yes or no) on Figure 7.7, Section 2(B).

6.9.8 Test the Digital Input (DI) I/O Override feature by selecting a DI from a control board handswitch, and record its ID on Figure 7.7, Section 3(A).

6.9.9 Use I/O Override to fix the selected DI in one position. Move the handswitch and verify that the selected DI does not change. Record the successful completion (yes or no) on Figure 7.7, Section 3(B).

6.9.10 Test the Digital Output (DO) I/O Override feature by selecting a DO from a control board light which is associated with a handswitch, and record its ID on Figure 7.7, Section 4(A).

6.9.11 Use I/O Override to fix the DO in one position. Move the handswitch and verify that the selected DO does not change. Record the successful completion (yes or no) on Figure 7.7, Section 4(B).

6.9.12 Test the Analog Output (AO) I/O Override feature by selecting an AO associated with a meter output, and record its ID on Figure 7.7, Section 5(A).

6.9.13 Use I/O Override to fix the analog output to a substantially different value relative to its current value. Verify the meter changes from the current value to the I/O override value. Verify that no control board action will change the analog output value. Record the successful completion (yes or no) on Figure 7.7, Section 5(B).

6.9.14 Test the Analog Input (AI) I/O Override feature by selecting an AI, and record its ID on Figure 7.7, Section 6(A).

6.9.15 Use I/O Override to fix the analog input to a substantially different value relative to its current value. Verify that no control board action will change the analog input value. Record the successful completion (yes or no) on Figure 7.7, Section 6(B).

6.9.16 If all of Section 6.9 is successful, the instructor should sign Figure 7.7 and forward it to the ASOT for signature.

6.9.17 If any portions of Section 6.9 are unsuccessful, a DR shall be written referencing this test.

6.9.18 Whenever this test (Section 6.9) is reperformed, the same AI, AO, DI, DO and annunciators shall be used to ensure reproducibility of results.

6.10 The entire instructor station test shall be repeated once every four years.

7.0 FIGURES

7.1 Simulator Backtrack Verification
7.2 Fast Time Verification
7.3 Slow Time Verification
7.4 Boolean Trigger, Composite Malfunction
7.5 Simulator Freeze Verification
7.6 Simulator Snapshot Verification
7.7 I/O Override Test

8.0 ATTACHMENTS

None

Figure 7.1
SIMULATOR BACKTRACK VERIFICATION

UNIT ________

1. IC ________    DATE RECORDED ________

2. DATA

   INSTRUMENT # ________

   Real Time         Real Time    Backtrack        Backtrack
   Exercise Time     Value        Exercise Time    Value
   1 minute          ________     ________         ________
   2 minutes         ________     ________         ________
   3 minutes         ________     ________         ________
   4 minutes         ________     ________         ________
   5 minutes         ________     ________         ________
   6 minutes         ________     ________         ________
   7 minutes         ________     ________         ________

3. VERIFICATION OF ACCEPTABLE DATA

   Response Acceptable per Section 6.1.10 Criteria (YES OR NO) ________
   OR
   DR Submitted # ________

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

Figure 7.2
FAST TIME VERIFICATION

UNIT ________    IC ________    DATE ________

FAST TIME PARAMETER TO BE EVALUATED: ________

ANALOG DATA POINT TO BE USED FOR EVALUATION: ________

BRIEF DESCRIPTION OF SIMULATOR TEST STEPS: ________

NORMAL TIME
1) INITIAL VALUE (Step 6.4.6) ________
2) FINAL VALUE (Step 6.4.7) ________
3) DIFFERENCE (Final Value - Initial Value) ________

FAST TIME
FAST TIME RATE USED (Step 6.4.8) ________
1) INITIAL VALUE (Step 6.4.8) ________
2) FINAL VALUE (Step 6.4.11) ________
3) DIFFERENCE (Final Value - Initial Value) ________

If acceptable per Step 6.4.13, sign and forward to ASOT.

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

Figure 7.3
SLOW TIME VERIFICATION

UNIT ________

1. IC ________    DATE RECORDED ________

2. DATA

                      Step 6.5.5                  Step 6.5.8
   Analog ID          Exercise Time   Value       Exercise Time   Value
   Analog Point #1    ________        ________    ________        ________
   Analog Point #2    ________        ________    ________        ________
   Analog Point #3    ________        ________    ________        ________

                      Step 6.5.9                  Step 6.5.12
   Analog ID          Exercise Time   Value       Exercise Time   Value
   Analog Point #1    ________        ________    ________        ________
   Analog Point #2    ________        ________    ________        ________
   Analog Point #3    ________        ________    ________        ________

3. ACCEPTANCE CRITERIA

   If the value recorded in Step 6.5.8 for each of the 3 analog points is within 1% of the value recorded in Step 6.5.12, the acceptance criterion is met.

   ACCEPTANCE CRITERIA MET (YES OR NO) ________

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

Figure 7.4
BOOLEAN TRIGGER, COMPOSITE MALFUNCTION VERIFICATION

UNIT ________

1. BOOLEAN TRIGGER

   A. Description of Boolean Trigger:

   ______________________________    __________
   Instructor                        Date

2. COMPOSITE MALFUNCTION

   A. Description of Composite Malfunction:

   ______________________________    __________
   Instructor                        Date

3. VARIABLE PARAMETER CONTROL

   A. Parameter Controlled ________    Increment Used ________

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

Figure 7.5
SIMULATOR FREEZE VERIFICATION

UNIT ________

1. Simulator placed in freeze and the following items responded properly:

   A. Instructor Console Freeze Indicating Light - ON
   B. Computer Run/Stop Status Indicators - NOT FLASHING (or as appropriate to indicate computer not running)
   C. Control Board Meters - CONSTANT
   D. Hazeltine Terminal - ALL VALUES CONSTANT, BASED ON RANDOM CHECKS OF HAZELTINE VALUES

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

Figure 7.6
SNAPSHOT VERIFICATION

UNIT ________    DATE ________    IC ________

A. Meters (A/Os)

   Inst Number   Cntl Board Value   PPC Point ID   PPC Value
   (1)           ________           ________       ________
   (2)           ________           ________       ________
   (3)           ________           ________       ________
   (4)           ________           ________       ________

B. Annunciators (D/Os)

   Panel Number   Condition (On/Off)
   (1)            ________
   (2)            ________
   (3)            ________
   (4)            ________

C. Remote Functions

   Number (ID)   Value
   (1)           ________
   (2)           ________
   (3)           ________
   (4)           ________

D. Control Board Switches

   Switch Number (ID)   Condition
   (1)                  ________
   (2)                  ________
   (3)                  ________
   (4)                  ________

If acceptable per Step 6.3.6, sign and forward to ASOT. Otherwise, submit a DR. (DR # ________)

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

Figure 7.7
I/O OVERRIDE TEST

DATE: ________

1) ANNUNCIATOR OVERRIDE CAPABILITY
   A) Annunciator Selected ________
   B) Annunciator selected above, initially off; I/O override successfully turns annunciator on (YES/NO) ________
   C) Annunciator Selected ________
   D) Annunciator selected in (C), initially on; I/O override successfully turns annunciator off (YES/NO) ________

2) CRYWOLF
   A) Annunciator Selected ________
   B) Annunciator initially off, flashes on; when acknowledged, the annunciator clears (YES/NO) ________

3) DIGITAL INPUT (DI) OVERRIDE CAPABILITY
   A) DI Selected ________
   B) DI does not change when I/O override is used (YES/NO) ________

4) DIGITAL OUTPUT (DO) OVERRIDE CAPABILITY
   A) DO Selected ________
   B) DO does not change when I/O override is used (YES/NO) ________

5) ANALOG OUTPUT (AO) OVERRIDE CAPABILITY
   A) AO Selected ________
   B) AO changes to selected value when I/O override is used (YES/NO) ________

6) ANALOG INPUT (AI) OVERRIDE CAPABILITY
   A) AI Selected ________
   B) AI changes to selected value when I/O override is used (YES/NO) ________

   ______________________________    __________
   Instructor                        Date

   ______________________________    __________
   Assistant Supervisor              Date
   Operator Training

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-4.12
SIMULATOR PHYSICAL FIDELITY / HUMAN FACTORS EVALUATION

Approved: ______________________  Simulator Technical Support
Approved: ______________________  Nuclear Training
Revision: 0
Date: 4/13/89
SCCC Meeting No: 89-003

1.0 PURPOSE

1.1 Identify the physical differences between Northeast Utilities' plant specific simulators and their appropriate reference plants.

1.2 Assess the training impact of the differences identified.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149 Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG-1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline for Simulator Training, October 1986.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - A form (STS-BI-FIA) used by OTB and STSB to record all identified simulator deficiencies between the simulator and reference plant.

Rev.: 0
Date: 4/13/89
Page 1 of 8
NSEM-4.12

5.0 RESPONSIBILITIES

5.1 Manager, STSB

5.1.1 Overall responsibility for the development and implementation of the Simulator Physical Fidelity / Human Factors Evaluation Procedure.

5.2 Manager, OTB

5.2.1 Responsible for review and concurrent approval of the Simulator Physical Fidelity / Human Factors Reports.

5.3 Unit Superintendents (or Designee)

5.3.1 Responsible for review and concurrent approval of the Simulator Physical Fidelity / Human Factors Reports for their applicable unit.

5.4 Supervisor, Simulator Hardware Maintenance

5.4.1 Responsible for assigning SHM personnel to support the data collection requirements of this procedure.

5.5 Assistant Supervisor Operator Training (ASOT)

5.5.1 Responsible for assigning OTB personnel to identify differences between the plant specific simulator and the reference plant.

5.5.2 Responsible for assigning OTB personnel to assess the training impact of any differences identified.

5.6 Supervisor Operator Training (SOT)

5.6.1 Responsible for conducting a detailed review of the exceptions to determine the aggregate effect upon the training curriculum, and for approval of the Simulator Physical Fidelity / Human Factors Reports for their applicable unit.
6.0 INSTRUCTIONS

6.1 Panel Simulation Evaluation

6.1.1 Control Room Layout

6.1.1.1 SHM shall be responsible for providing an updated set of control room photographs of the reference plant for implementation of this section. The photographs should provide an overview of the control room, including panels and furnishings, plus any remote panels determined by OTB to be required for training.

6.1.1.2 OTB shall determine if the simulator contains sufficient operational panels to provide the controls, instrumentation, alarms and other man-machine interfaces necessary to support the training curriculum. This shall include panels not in the main operating area, e.g., back panels, and any remote shutdown panels needed for abnormal or emergency operating procedures. A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.1.

6.1.1.3 OTB shall compare the furnishings in the simulator control room (e.g., desks, operator's console, drawing/procedure storage facilities, key lockers, status boards, etc.) with the reference plant. A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.1.

6.1.1.4 OTB shall compare the relative locations of panels in the simulator control room to the reference plant control room and determine if the differences (if any) detract from the ability to conduct the training curriculum.

NOTE: Plant drawings should suffice for plant dimensions/locations, but actual plant measurements may be needed in some cases.

A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.1.

6.1.2 Panel Layout

6.1.2.1 SHM shall be responsible for providing an updated set of panel photographs of the reference plant for implementation of this section. The photographs should provide sufficient detail on system location, component layout and operator cuing/information aids.

6.1.2.2 OTB shall compare the locations of systems on panels in the simulator to determine:

(a) If systems are on the same panels as in the reference plant,

(b) If systems are in the same relative locations to each other within and across panels as they are in the reference plant.

OTB shall determine if the differences (if any) detract from the ability to conduct the training curriculum. A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.2.

("s . 6.1.2.3 OTB shall compare the general layout of (~) components within a system or on a panel to the reference plant. OTB shall determine if the differences (if an;) detract from the ability to conduct the training curriculum. Attachment 8.1 lists those characteristics which have no training impact and for which no further documentation is required. A DE shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.2. 6.1.2.4 OTB shall compare the. application of operator cuing and information aids (e.g., background shading, mimic, demarcations, coding schemes, labeling schemes) on the simulator against those of the reference plant. Attachment 8.1 , lists those characteristics which have no training impact and for which no further documentation is required.

A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.2.

6.2 Component Fidelity Evaluation

6.2.1 SHM shall be responsible for providing an updated set of panel photographs of the reference plant for implementation of this section. The photographs should have sufficient resolution to be able to discern details such as display units, scale graduations, label wording, device type, etc.

6.2.2 OTB shall compare each component on the simulator (annunciator windows, meters, recorders, status lights, switches, controllers) to the reference plant. All physical differences greater than those listed in Attachment 8.1, Minor Differences Guideline, will be documented on the appropriate form and evaluated for training impact. For those determined to have impact on the training curriculum, a DR shall be written. Those differences which have no training impact shall be documented on exception Form 7.3.

6.3 Ambient Environment Evaluation

6.3.1 Normal and Emergency Control Room Lighting

6.3.1.1 SHM shall be responsible for providing an updated drawing of the reference plant control room annotated to show the location of the emergency lighting fixtures. Illumination levels will be provided for normal lighting conditions, via light meter, at locations pre-selected by OTB.

NOTE: In the case of emergency lighting, as much information as practical will be collected, e.g., make and model of fixture, wattage, etc.

6.3.1.2 OTB shall compare the lighting in the simulator control room to the reference plant. Any differences in the illumination levels or location of the lighting fixtures will be evaluated to determine their effect upon the readability of displays.

6.3.1.3 A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.4.

6.3.2 Alarms, Signals and Incidental Noise

6.3.2.1 OTB shall be responsible for providing updated recordings of those reference plant alarms, signals and incidental noises determined to be required for training.

6.3.2.2 OTB shall compare the sounds of audible alarms to the reference plant. Any differences in the volume level or tone shall be evaluated to determine their effect upon the training curriculum.

NOTE: If auditory coding is used (e.g., panel identification, alarm/ring-back), it should be identical for the reference plant and the simulator.

6.3.2.3 OTB shall review the collection of simulated reference plant noises for completeness and realism. The review should include such things as control rod step counters, turbine/steam noises, relay noises, etc.

6.3.2.4 A DR shall be written for any difference determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.4.

6.3.3 Communications Systems

6.3.3.1 SHM shall be responsible for providing updated photographs of the reference plant communications equipment.

6.3.3.2 OTB shall compare the simulator communications systems to the reference plant.

NOTE: All communications systems that are expected to be used for communicating with auxiliary operators (instructors) should be available and operational.

6.3.3.3 A DR shall be written for any differences determined to have an impact on the training curriculum. Those differences which have no training impact shall be documented on exception Form 7.4.

6.4 Simulator Physical Fidelity/Human Factors Report

6.4.1 Any forms (7.1 through 7.4) used in completing steps 6.1 through 6.3 shall be grouped together with Form 7.5, Simulator Physical Fidelity/Human Factors Report.

6.4.2 The applicable Supervisor, Operator Training (SOT) shall review the completed package to determine the aggregate effect upon the training curriculum. A DR shall be written for any difference which the SOT determines has training impact. The SOT will forward his final version to the Manager, Operator Training for concurrent approval.

6.4.3 The MOT will forward the report to the applicable Unit Superintendent for concurrent approval.

6.4.4 The SCCC will make final acceptance of the Simulator Physical Fidelity/Human Factors Report.

6.5 Implementation

6.5.1 A Simulator Physical Fidelity/Human Factors Evaluation shall be conducted on an annual basis in those areas where the reference plant has undergone a change since the last evaluation. The previous Simulator Physical Fidelity/Human Factors Report shall then be updated to reflect the new evaluation.

NOTE: For initial submittal, this procedure shall be performed in its entirety in order to officially disposition the existing physical scope of simulation.

6.5.2 The updated Simulator Physical Fidelity/Human Factors Report shall be forwarded to the Manager, STSB for inclusion in the Simulator Information/Operating Guide.

7.0 Forms

7.1 Exceptions - Control Room Layout
7.2 Exceptions - Panel Layout
7.3 Exceptions - Components
7.4 Exceptions - Ambient Environment
7.5 Simulator Physical Fidelity/Human Factors Report

8.0 Attachments

8.1 Minor Differences Guideline

Form 7.1
EXCEPTIONS - CONTROL ROOM LAYOUT

UNIT:

Completed by:                    Date:

Reviewed by:                     Date:
            ASOT


Form 7.2
EXCEPTIONS - PANEL LAYOUT

UNIT:

Completed by:                    Date:

Reviewed by:                     Date:
            ASOT

Form 7.3
EXCEPTIONS - COMPONENTS

UNIT:

Completed by:                    Date:

Reviewed by:                     Date:
            ASOT


Form 7.4
EXCEPTIONS - AMBIENT ENVIRONMENT

UNIT:

Completed by:                    Date:

Reviewed by:                     Date:
            ASOT


Form 7.5
SIMULATOR PHYSICAL FIDELITY/HUMAN FACTORS REPORT

UNIT:                REVISION:

Approved By:                     Date:
            SOT

Concurrence:                     Date:
            MOT

Concurrence:                     Date:
            Unit Superintendent

SCCC Mtg. No.


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 4.13
REAL-TIME SIMULATION VERIFICATION

Responsible Individual:


Simulator Technical Support

Approved:            Nuclear Training

Revision: 1
Date: 5/11/89

SCCC Meeting No:

1.0 PURPOSE

The purpose of this procedure is to verify that the simulation models are running in real time for the Millstone 1, 2, 3 and Connecticut Yankee simulators.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Simulator Technical Support Branch (STSB) and the Operator Training Branch (OTB).

3.0 REFERENCES

3.1 NSEM-1.02: Simulator Certification Program Overview

3.2 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on Design Data and Simulator Performance and Operability Testing.

3.3 NRC RG 1.149-Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5-1985 with some additional requirements.

3.4 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - A form (STS-BI-F1A) used by the OTB and the STSB to record all identified simulator deficiencies between the simulator and the reference plant.

4.2 Simulator Design Change (SDC) - A documentation package consisting of relevant DR's and all forms indicated on STS-BI-F1E which is designed to track the resolution of DR's and ensure that ANSI/ANS 3.5-1985 and NRC RG 1.149 requirements are satisfied.

4.3 OTB - Operator Training Branch of the Nuclear Training Department.

4.4 SCE - Simulation Computer Engineering Section of the Simulator Technical Support Branch.

4.5 SHM - Simulator Hardware Maintenance Section of the Simulator Technical Support Branch.

Rev.: 1  Date: 5/11/89  Page: 1 of 4  NSEM-4.13

5.0 RESPONSIBILITIES

5.1 Supervisor, Simulation Computer Engineering (SCE)

Overall responsibility for coordinating the verification of Real-Time Simulation.

5.1.1 Responsible for assigning engineering resources for software support of this verification process.

5.1.2 Responsible for reviewing and approving the output of this procedure.

5.2 SCE Personnel

5.2.1 Responsible for conducting the real-time simulation verification procedure with SHM and OTB personnel.

5.2.2 Responsible for conducting Steps 6.2 and 6.3 of this procedure.

5.2.3 Responsible for documenting and resolving any software deficiencies identified by this procedure.

5.3 Supervisor, Simulator Hardware Maintenance (SHM)

5.3.1 Responsible for assigning SHM personnel to perform and support this procedure.

5.3.2 Responsible for assigning SHM personnel to resolve any hardware deficiencies identified by this procedure.

5.4 SHM Personnel

5.4.1 Responsible for conducting Step 6.1 of this procedure.

5.4.2 Responsible for resolving any hardware deficiencies identified by this procedure.

5.5 Assistant Supervisor Operator Training (ASOT)

5.5.1 Responsible for assigning an Operator Instructor to perform Step 6.3.3 of this procedure.

5.6 OTB Instructor

5.6.1 Responsible for performing Step 6.3.3 of this procedure.

6.0 INSTRUCTIONS

6.1 Computer Clock and Timing Verification

Designated SHM personnel shall verify that the SEL Computer Clock and Interrupt Timer match both the SEL specification and Sysgen values.

6.1.1 Install the RTOM board on the SEL BUS of the simulation computer and verify that the SEL BUS refreshing time is 150 n-sec at PIB-85.

6.1.2 Verify that the IOP Interrupt rate is 120 Hz as defined by Sysgen at PIB-108.

6.1.3 Log the test results from 6.1.1 and 6.1.2 into Form 7.1.

6.2 Computer Spare Time Verification

Designated SCE personnel shall perform the following:

6.2.1 Restart the operating system image with RTOM.


6.2.2 Start the simulator and verify that only the "real-time" system tasks are running.

6.2.3 Activate task 'IDLETIME' from the console and get the output from the printer.

6.2.4 Confirm that the spare time reported is higher than 10% for each scenario identified in Section 6.3.3.

6.2.5 Log the test results from 6.2.3 on Form 7.2.

6.3 Modeling Software Real-Time Verification

Designated SCE personnel shall perform the following:

6.3.1 Verify that there are no outstanding incidents on the "Simulator Overtime Log".

6.3.2 Install software counters in the core, RCS and feedwater models. Reset the counters to zero before starting the test.
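The spare-time acceptance criterion of steps 6.2.3 and 6.2.4 can be sketched in a few lines. This is an illustration only: the function names and the sample percentages are hypothetical, and the actual format of the 'IDLETIME' printer output is not specified by this procedure.

```python
# Hypothetical illustration of the step 6.2.4 check: the simulator
# passes only if the spare (idle) time reported by the 'IDLETIME'
# task exceeds 10% for every scenario exercised in step 6.3.3.

def spare_time_percent(idle_seconds: float, elapsed_seconds: float) -> float:
    """Spare time as a percentage of total elapsed run time."""
    return 100.0 * idle_seconds / elapsed_seconds

def meets_criterion(scenario_results: dict) -> bool:
    """scenario_results maps scenario name to spare-time percent."""
    return all(pct > 10.0 for pct in scenario_results.values())

results = {
    "Turbine runback/trip": 18.2,       # sample values, not measurements
    "Steam line break": 12.5,
    "Hot leg double-ended LOCA": 11.0,
    "RCP locked rotor": 14.7,
}
print(meets_criterion(results))  # prints True: every scenario exceeds 10%
```

Note that the criterion is per scenario; one scenario at or below 10% fails the whole test even if the average is comfortably above it.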

6.3.3 Have OTB run each of the following scenarios on the simulator for approximately 10 minutes:

o Turbine runback/trip
o Steam line breaks
o Hot leg double-ended LOCA
o RCP locked rotor

Record details on Form 7.3 as to the malfunction/severity used to ensure repeatability.

6.3.4 Verify that, for each scenario, no module is bumped because of an overtime condition (an error message will be displayed on the console if an overtime does occur).

6.3.5 For each scenario, verify that the counter readings are within the limits of the expected values (Form 7.3).

6.4 Disposition of Identified Discrepancies

6.4.1 Forward the completed originals of NSEM-4.13, Forms 7.1 through 7.4 to the Supervisor, Simulation Computer Engineering (SCE) for review and approval.

6.4.2 The Supervisor, SCE will forward approved originals to ASRMS for retention with the Simulator Certification records.

6.5 Test Performance

6.5.1 This test shall be performed at a four-year interval to comply with References 3.3 and 3.4.

6.5.2 This test shall also be performed at any time significant configuration or modeling changes occur, as determined by the Supervisor, SCE.

7.0 FIGURES

7.1 SEL Computer Clock and Timing Verification Form
7.2 SEL Computer Spare Time Test Form
7.3 Modeling Software Real-Time Verification Form
7.4 Real-Time Verification Discrepancies

8.0 ATTACHMENTS

8.1 Marginal Notes Directory
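The counter limits of step 6.3.5 and Form 7.3 imply that each model is expected to execute about four times per second of run time, so a scenario run of T seconds should leave each counter near 4xT. A minimal sketch of that tolerance check follows; the function name is hypothetical, and the 10-count tolerance band is an assumption read from the partially legible Form 7.3 entries.

```python
# Sketch of the step 6.3.5 limit check: after a run of run_seconds,
# each model counter should read approximately 4 counts per second.
# The +/-10 tolerance is an assumption based on Form 7.3.

def counter_in_limits(counts: int, run_seconds: float, tol: float = 10.0) -> bool:
    expected = 4.0 * run_seconds
    return abs(counts - expected) <= tol

# A 10-minute (600 s) scenario run should leave roughly 2400 counts.
print(counter_in_limits(2395, 600))  # prints True: within 10 of 2400
print(counter_in_limits(2350, 600))  # prints False: 50 counts low
```

The same check would be applied independently to the core, RCS and feedwater model counters for each scenario recorded on Form 7.3.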


Form 7.1
SEL COMPUTER CLOCK AND TIMING VERIFICATION FORM

SIMULATOR UNIT:          (MP1, MP2, MP3 OR CY)

DESCRIPTION                EXPECTED VALUE    ACTUAL VALUE
SEL Bus Refreshing Time    150 n-sec
IOP Interrupt Frequency    120 Hz

PERFORMED BY:                    DATE:

APPROVED BY:                     DATE:
            Supervisor, Simulation Computer Engineering

Form 7.2
SEL COMPUTER SPARE TIME TEST FORM

SIMULATOR UNIT:          (MP1, MP2, MP3 OR CY)

DESCRIPTION               EXPECTED VALUE    ACTUAL VALUE
SEL Computer Spare Time   >10%

PERFORMED BY:                    DATE:

APPROVED BY:                     DATE:
            Supervisor, Simulation Computer Engineering


Form 7.3
MODELING SOFTWARE REAL-TIME VERIFICATION FORM

SIMULATOR UNIT:          (MP1, MP2, MP3 OR CY)

SCENARIO USED -
MALFUNCTION:
SEVERITY:
EXERCISE TIME:

TEST RESULTS:

DESCRIPTION               EXPECTED VALUE    ACTUAL VALUE
Core Model Counter        4xT(sec) ±10
RCS Model Counter         4xT(sec) ±10
Feedwater Model Counter   4xT(sec) ±10

PERFORMED BY:                    DATE:

APPROVED BY:                     DATE:
            Supervisor, Simulation Computer Engineering

Form 7.4
REAL TIME VERIFICATION DISCREPANCIES

SIMULATOR UNIT:          (MP1, MP2, MP3 OR CY)

DESCRIPTION                    RESOLUTION

PERFORMED BY:                    DATE:

APPROVED BY:                     DATE:
            Supervisor, Simulation Computer Engineering

Rev.: 1  Date: 11/14/88  Page: 7.4-1 of 1  NSEM-4.13


Attachment 8.1
MARGINAL NOTES DIRECTORY

1. Grammatical changes.

2. Specified the simulation computer as the processor to be tested for real-time operation.

3. Added a step to ensure only real-time tasks are running.

4. Section 6.3 has been rewritten for clarity and to allow real-time testing on simulators with or without Vector processors.


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM - 5.01
SIMULATOR MODIFICATION CONTROL PROCEDURE

Simulator Technical Support

Approved:            Nuclear Training

Revision: 3
Date: 5/26/88

SCCC Meeting No: 88-007

1.0 PURPOSE

1.1 To establish controls for the coordination, resolution, and documentation of identified differences between each simulator and its reference plant.

1.2 To maintain the integrity of the simulators' hardware, software, and design databases.

2.0 APPLICABILITY

2.1 This procedure applies to persons that are required to perform comprehensive simulator status reviews, identify simulator deficiencies and/or required simulator changes.

2.2 A Simulator Design Change (SDC) shall be required whenever making modifications to the simulator that affect its fidelity to the reference plant or its operation as a simulator.

2.2.1 All software modifications shall require an SDC whether they fall under modeling, operating system, 53, PPC, or instructor station.

2.2.2 All hardware modifications made to operator interface devices which result in observable differences, functional or visual, shall require an SDC. This includes main control boards, panels, control room CRT's, printers, operator's console, and the communication system; but excludes furnishings, e.g., desks, file cabinets, chairs, etc.

NOTE: Hardware Only DR's - If a Deficiency Report identifies a maintenance problem with existing equipment, it will be converted to a Maintenance Request (MR) by the Unit Operations Consultant and transmitted to the Supervisor, Hardware Maintenance. A copy of the DR will be returned to the originator stating that the DR has been converted to an MR.

2.2.3 Expeditious Installation of Control Board Labels and Name Tags - To keep simulator upgrades timely and cost effective, the Simulator Configuration Control Committee has established the following guidelines for installation of control board labels and name tags.

Rev.: 3  Date: 5/26/88  Page: 1 of 23  NSEM-5.01

2.2.3.1 Labels/name tags and other miscellaneous items that are not permanently attached to the control boards will be considered training aids and can be added by OTB at any time without a DR.

2.2.3.2 Labels/name tags that are permanently attached to the control boards (glue, tape, screws, etc.) will be considered changes to the hardware configuration and must be done via the DR process. The Unit Operations Consultant will identify any such DR's to STSB together with any parts and installation details that may have been sent by the plant. STSB will implement these changes in an expeditious manner with Hardware Maintenance having the responsibility for installation.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149-Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5-1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practices TQ-505 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG 1258 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 NTDD-17, Simulator Certification and Configuration Management Control.

3.7 INPO 86-026, Guideline For Simulator Training, October, 1986.

3.8 INPO 87-006, Report on Configuration Management in the Nuclear Utility Industry, July, 1987.

4.0 DEFINITIONS

4.1 Reference Plant - The specific nuclear power plant from which the simulator control room configuration, system control arrangement and data base are derived.

4.2 Simulator - A sophisticated, computer controlled machine that replicates systems from the reference plant specifically required for operator training. The simulator produces expected plant responses for various operator inputs during normal, transient and incident conditions.

4.3 Fidelity - The degree of similarity between the simulator and the equipment which is simulated. It is a measurement of the physical characteristics of the simulator (physical fidelity) and the information or stimulus and response options of the equipment (functional fidelity).

4.4 Simulator Update Requirements - ANSI/ANS-3.5-1985 requires that design changes made to the reference plant be reviewed annually, their impact on training/fidelity assessed, and those deemed applicable implemented in a timely manner.

Paragraph 5.2 states that the first such update review take place within eighteen months following the simulator Ready-For-Training (RFT) date or the plant commercial operation date, whichever occurs last. Paragraph 5.3 further states that those design changes identified as having relevancy to the training curriculum/fidelity be implemented on the simulator within one year of the annual update review date.

Unit    First Annual Update Review Date    Implementation Due Date
MP2     11/30/86                           11/30/87
MP1     12/12/87                           12/12/88
CY      9/24/87                            9/24/88
MP3     10/23/87                           10/23/88

Those design changes implemented in the reference plants after the first annual review date which have been determined to be relevant to the training program shall be implemented on the simulator within 18 months of the plant in-service date.

4.5 Configuration Management System (CMS) Tracking System - A computer based DR and SDC tracking system.

4.6 Deficiency - An identified difference in a simulator quality or element (hardware and/or software) that requires review and resolution.

4.7 Deficiency Report (DR) - A form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Group (STS) to record all identified deficiencies between the simulator and reference plant.

4.8 Simulator Deficiency Report Test (SDRT) - A form (STS-BI-F1B) used to document the testing required for compliance to ANSI/ANS 3.5-1985, and NRC RG 1.149. This form is required for close out of the DR and SDC.

4.9 Simulator Design Change (SDC) - A documentation package consisting of relevant DR's and all forms indicated on STS-BI-F1E which is designed to track the resolution of DR's and ensure that ANSI/ANS 3.5-1985, and NRC RG 1.149 requirements are satisfied.

4.10 Simulator Configuration Control Committee (SCCC) - The committee responsible for overall simulator design control and management of NTD resources involved in the simulator modification effort. The committee shall include as permanent members the Director of NTD, the Managers of OTB and STS, Supervisor-SCE, Supervisor-HW Maintenance, and the four Unit Operations Consultants. The Director-NTD shall chair the committee and the Manager-STS shall function as the secretary. The minimum permanent members required to constitute a quorum are two management representatives (Director/Managers) and one Unit Operations Consultant. The Manager-STS shall act as chairman in the Director's absence. An ASOT can serve as an alternate for the Unit Operations Consultant for his particular unit. Business for a particular unit shall be conducted only in the presence of the appropriate Unit Operations Consultant/ASOT.

4.11 Originator - The person who notes the deficiency and writes the Deficiency Report (DR).

4.12 Source Document - The data referenced and used for the design change.

4.13 Maintenance Request (MR) - A form used to identify problems with existing equipment that needs repair, adjustment, or calibration and to document all work performed by the Hardware Maintenance Branch.

4.14 STSB - The Simulator Technical Support Branch.

4.15 STS Supervisor - Either the Simulator Computer Engineering (SCE) Supervisor or the Supervisor, Hardware Maintenance (SHM).

4.16 Hardware (HW) - Any reference to mechanical/electrical parts of the simulator.

4.17 Software (SW) - Any reference to simulator computer code.

4.18 Technical Coordinator - An individual assigned by the Supervisor-SCE to provide Quality Assurance to the SDC analysis and implementation process.

5.0 RESPONSIBILITIES

5.1 Simulator Technical Support Manager

Overall responsibility for the resolution of all Deficiency Reports and Simulator Design Changes.

5.1.1 Simulator Configuration Control Committee (SCCC) member.

5.2 Simulator Configuration Control Committee (SCCC)

5.2.1 Responsible for performing a cost benefit analysis for each proposed modification brought before the committee.

5.2.2 Responsible for setting DR/SDC due dates if required after reviewing OTB needs and ANSI/ANS 3.5-1985 requirements.

5.2.3 Responsible for resolving priority conflicts among the four simulators for NTD resources.

5.2.4 Responsible for approving/rejecting all SDC's with a resource effort exceeding the following levels:

Hardware Costs in Excess of $500
or Hardware Technician Time in Excess of 8 Hours
or Software Engineering Time in Excess of 8 Hours

5.2.5 Responsible for providing meeting minutes to the following individuals within five (5) working days: the Director NTD, the Managers of OTB and STS, the Supervisor of Operator Training (SOT) of each unit, the Supervisors of Hardware Maintenance and Simulation Computer Engineering, and the four Unit Operations Consultants.

5.3 Simulator Computer Engineering (SCE) Supervisor

Overall responsibility for coordination of the Simulator Design Change (SDC) process.

5.3.1 Responsible for assigning software resources.

5.3.2 Responsible for closing out Simulator Design Changes (SDC's).

5.3.3 Responsible for keeping the SCCC Meeting Minutes.

5.4 Simulator Hardware Maintenance (SHM) Supervisor

Overall responsibility for the resolution of "Hardware Only" Deficiency Reports and the Hardware portion of Hardware/Software Deficiency Reports.

5.4.1 Responsible for assigning hardware resources.

5.4.2 Responsible for closing out Simulator Design Changes (SDC's).

5.5 Unit Operations Consultants

Responsible for the review, research and coordination of DR's from the time of submittal to closeout.

5.5.1 Responsible for reviewing all DR's submitted by OTB together with their associated retests for completeness, accuracy and duplication.

5.5.2 Responsible for interfacing with ASRMS to place DR's in the Configuration Management System.

5.5.3 Responsible for presenting SDC's for OTB generated DR's before the SCCC and providing the justification of need.

5.5.4 Responsible for collecting or coordinating the collection of the plant design documents required for resolution of the DR.

NOTE: It is not intended that the Unit Operations Consultant be the only person to write DR's or to have sole responsibility for collecting plant information for DR's written by other instructors. The Unit Operations Consultant is responsible for ensuring that all DR's submitted by a unit are of sufficient accuracy and level of detail.

5.5.5 Responsible for providing operational expertise when requested by STSB.

5.5.6 Responsible for coordinating, but not necessarily performing, the retest to close out OTB generated DR's.

5.6 Administrative Services and Records Management Section (ASRMS)

Overall responsibility for maintaining an up-to-date active design data base for each simulator.

5.6.1 Responsible for maintaining a file in Controlled Document Storage (CDS) for all completed SDC's.

5.6.2 Responsible for the entry of all required data into the CMS Tracking System in a timely manner.

5.7 Assistant Supervisor Operator Training (ASOT)

Overall responsibility for identifying differences between the simulator and the reference plant and reporting those fidelity differences using a Deficiency Report form.

5.7.1 Responsible for approval of the DR and its associated retest and final acceptance of Operator Training Branch identified design changes.

5.7.2 Responsible for providing simulator time to the Simulator Technical Support group (STS) for the resolution of Simulator Design Changes.

5.7.3 Responsible for providing operational expertise when requested by SCE.

5.8 Simulator Computer Engineering (SCE) Personnel

Specific responsibility for coordinating/implementing all modifications to the simulators involving software (S and B types) with OTB and Hardware Maintenance.

5.8.1 Responsible for conducting an analysis of the SDC prior to its presentation before the SCCC to identify software manpower requirements.

5.8.2 Responsible for completing the SDC Details form after the modification is complete.

NOTE: The actual time expended on each DR included in the SDC shall be noted on the Details form.

5.8.3 Responsible to ensure all software data necessary to complete the modification is included in the SDC.

5.8.4 Responsible for completing the "Software Data/Documentation Update Requirements Form" and providing the Supervisor, ASRMS with the as-built documents for updating the Simulator Design Data Base.

5.9 Simulator Hardware Maintenance Personnel

Specific responsibility for coordinating/implementing all modifications to the simulators involving hardware (H and B types) with OTB and SCE.

5.9.1 Responsible for conducting a detailed analysis


of the SDC prior to the SCCC review to identify hardware scope and resource requirements.

5.9.2 Responsible for ensuring that all hardware data necessary to complete the modification is included in the SDC.

5.9.3 Responsible for completing the SDC Details Form after the modification is complete.

NOTE: The actual time expended on each SDC shall be noted on the Details Form.

5.10 OTB Instructors

Responsible for identifying and documenting any observed differences between the simulator and its reference plant which have training impact.

5.10.1 Responsible for writing and researching DR's and retests.

5.10.2 Responsible for forwarding DR's/retests to the Unit Operations Consultants for review.

5.10.3 Responsible for the retest of the DR unless the Unit Operations Consultant (or his designee) agrees to retest the DR.

5.10.4 Responsible for providing operational expertise when so requested by STSB.

6.0 INSTRUCTIONS

6.1 Deficiency Report Form Instructions (STS-BI-F1A)

6.1.1 SIMULATOR - Indicates specific simulator, MP1, MP2, MP3, or CY. (To be filled in by the Originator)

6.1.2 ORIGINATOR - The person writing the DR. (To be filled in by the Originator)

6.1.3 DATE WRITTEN - The date the Unit Record Tech enters the DR# into the CMS. (To be filled in by the Unit Records Technician)

6.1.4 PLANT SYSTEM - The actual reference plant system referred to by the DR. (To be filled in by the Originator)

6.1.5 LSSD SYSTEM - The LSSD computer model system's abbreviation that includes the reference plant's system. (To be filled in by the Originator, if known; otherwise the Unit Operations Consultant will complete it)

6.1.6 DR# - The unique number assigned to the Deficiency Report. (To be assigned by the Unit Records Technician)

6.1.7 DUE DATE (Optional) - A date assigned by the SCCC for resolution of the DR(s)/SDC which takes into consideration any specific OTB needs and/or certification requirements. For those design changes implemented in the reference plants after the first annual review date (Step 4.4), a Due Date of 18 months from the plant in-service date shall be assigned.

6.1.8 PRIORITY - The relative importance of the DR. The Priority is used to determine when the DR will be resolved. (The Operator Training Branch assigns the Priority.) There are four Priority levels:

Priority 1 - The deficiency has "significant impact" on the quality of Operator Training and cannot be trained around. This category shall be reserved for those situations where required Operator Training cannot be performed until the DR(s) are resolved. These DR's will receive the immediate attention of STS.

Priority 2 - The deficiency "has impact" on the quality of Operator Training, but does not render the training feature unusable. The feature can be trained around with some difficulty. Based on training needs, the DR will be scheduled for resolution.

Priority 3 - The deficiency "has minimum impact" on the quality of Operator Training and can be trained around routinely. The DR will be resolved at some time to ensure simulator technical accuracy and/or comply with certification requirements.

Priority 4 - The training feature is not required to support the current training curriculum. Work will proceed as resources permit.

6.1.9 PANEL - The panel location for the hardware component identified in the DR. (To be filled in by the Originator)

6.1.10 COMPONENT - The number of the component identified in the DR. This is used for DR's requiring hardware changes/modifications. (To be filled in by the Originator)

6.1.11 DISCIPLINE - "H = Hardware Only", "S = Software Only" and "B = Both Hardware and Software". (This may be filled in by the Originator, the Operator Training Branch, or the Unit Operations Consultant. It should be filled in before the Unit Records Technician assigns the DR number.)

6.1.12 PDCR # - If the DR was written as the result of a reference plant PDCR, then the PDCR number shall be written here using the format YR-UNIT-UNIQUENESS NUMBER, e.g. 88-3-000. (To be filled in by the Originator)

6.1.13 TYPE - This can be used for searching for particular groups of DR's using the CMS Tracking System. Based on the Title, write the first letter of the word that best describes the DR Type in the blank. (Can be filled in by the Originator but is not required)

6.1.14 STATE OF SIMULATOR - A brief statement describing the simulator state when the problem occurred. Example: Malfunction EG01, Main Generator Trip. (To be filled in by the Originator)

6.1.15 TEMPORARY SNAPSHOT (IC) - Indicate the IC in which the problem occurred. If a temporary Snapshot IC was saved, indicate that IC number. (To be filled in by the Originator)

6.1.16 TITLE - A condensed (60 characters, including spaces) version of the Description used to facilitate entry into the CMS Tracking System. (To be filled in by the Originator)
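The DR form fields above carry a few machine-checkable conventions: a four-level Priority (6.1.8), a PDCR number in YR-UNIT-UNIQUENESS NUMBER format, e.g. 88-3-000 (6.1.12), and a Title capped at 60 characters including spaces (6.1.16). As a minimal illustrative sketch only — the 1989 process was paper-based, and the function and names below are hypothetical, not part of this procedure — those checks could be expressed as:

```python
import re
from typing import List, Optional

# Hypothetical sketch; constraints mirror 6.1.8, 6.1.12, and 6.1.16 above.
PDCR_FORMAT = re.compile(r"^\d{2}-\d-\d{3}$")  # YR-UNIT-NUMBER, e.g. 88-3-000
VALID_PRIORITIES = {1, 2, 3, 4}
TITLE_LIMIT = 60  # characters, including spaces

def check_dr_fields(title: str, priority: int,
                    pdcr: Optional[str] = None) -> List[str]:
    """Return a list of problems found in the proposed DR field values."""
    problems = []
    if len(title) > TITLE_LIMIT:
        problems.append("title exceeds 60 characters")
    if priority not in VALID_PRIORITIES:
        problems.append("priority must be 1, 2, 3, or 4")
    if pdcr is not None and not PDCR_FORMAT.match(pdcr):
        problems.append("PDCR number does not match YR-UNIT-NUMBER (e.g. 88-3-000)")
    return problems
```

A DR with a 37-character title, Priority 1, and PDCR number 88-3-000 would pass all three checks; a Priority of 5 or a PDCR number such as 1988-3-0 would be flagged.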

6.1.17 DESCRIPTION - A complete, concise narrative that provides a thorough explanation of the deficiency. To facilitate expedient resolution, all pertinent information should be included, i.e. correct/desired results and required hardware (types and quantities of switches, indicators, recorders, lights, etc.). If it is critical for the DR to be resolved within a specific time frame, furnish the required date and provide justification for it, e.g. required to support specific training commitments.

NOTE: If the DR is written against a PDCR, its status in the reference plant, including the in-service date, shall be provided.

6.1.18 REFERENCES - All references used to justify the DR: prints, tech manuals, photos, PDCR's, etc. (To be filled in by the Originator)

6.1.19 REVIEWED BY TRAINING (OTB) - DR's submitted through OTB shall be reviewed and signed by the appropriate ASOT.

6.1.20 REVIEWED BY ENG (STS) - All DR's shall be reviewed and signed by the appropriate STSB Supervisor(s) or his designee.

6.1.21 DATE CLOSED - This is the date on which the Deficiency Report was reviewed and closed. (To be filled in by the Unit Operations Consultant)

NOTE: The Unit Records Technician will record this date to close out the DR in CMS.

6.2 Simulator Design Change Form Instructions (STS-BI-F1E)

6.2.1 SIMULATOR - Indicates the specific simulator. (To be filled in by the Unit Records Technician)

6.2.2 LSSD SYSTEM - The LSSD computer model system's abbreviation that includes the reference plant's system for the DR's included in the SDC. (To be filled in by the Unit Records Technician)

6.2.3 DISCIPLINE - Indicates the type of DR's included in the SDC: "H" = Hardware, "S" = Software, "B" = Both HW and SW. (This shall be determined by STSB personnel. It must be filled in before the SDC is brought to the SCCC.)

6.2.4 SDC TITLE - A description of the design change, based on the DR descriptions, which should include the PDCR number, if applicable. It should be no longer than 60 characters and spaces. (To be filled in by the Unit Records Technician)

6.2.5 SDC NO. - The unique number assigned to the SDC by the Unit Records Technician. (To be filled in by the Unit Records Technician)

6.2.6 DUE DATE (Optional) - A date assigned by the SCCC for resolution of the DR(s)/SDC which takes into consideration any specific OTB needs and/or certification requirements.

6.2.7 DR NO'S - A list of the DR's included in the SDC, not to exceed ten (10). (To be filled in by the Unit Records Technician)

6.2.8 HARDWARE DESIGN CHANGE ANALYSIS ATTACHED (STS-BI-F1C) - This form shall be filled out by Simulator Hardware Maintenance and included in the SDC folder before it goes to the SCCC if the SDC's discipline is "H" or "B". (To be initialed by the Supervisor, Hardware Maintenance)

6.2.9 SOFTWARE DESIGN CHANGE ANALYSIS ATTACHED (STS-BI-F1C) - This form shall be filled out by the Unit Software Coordinator and included in the SDC folder before it goes to the SCCC if the SDC's discipline is "S" or "B". (To be initialed by the Supervisor, SCE)

6.2.10 APPROVED/NOT APPROVED/SCCC MEMBER/DATE - This section will be completed by a Simulator Configuration Control Committee (SCCC) member.

NOTE: SDC's requiring a resource effort of $500 or less for hardware and 16 hours or less of total software engineering/hardware engineering time can be approved by either STSB supervisor. Details on these SDC's will be provided in the next SCCC meeting minutes by the Supervisor, SCE.

6.2.11 COMMENTS - This field is used to state the reason(s) why an SDC has been rejected or to set specific conditions governing its implementation. These comments should be carried over to the SCCC Meeting Minutes. (To be filled in by an SCCC member)
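The approval threshold in the NOTE under 6.2.10 amounts to a simple routing rule: $500 or less of hardware cost and 16 hours or less of total engineering time permits single-supervisor approval, and anything larger goes to the full committee. A minimal sketch, with hypothetical names, assuming both limits must hold simultaneously as the NOTE states:

```python
# Hypothetical sketch of the 6.2.10 NOTE; names are illustrative only.
def approval_route(hardware_cost_dollars: float, engineering_hours: float) -> str:
    """Return which body may approve the SDC under the 6.2.10 NOTE."""
    if hardware_cost_dollars <= 500 and engineering_hours <= 16:
        return "STSB supervisor"  # details reported in the next SCCC minutes
    return "SCCC"
```

Exceeding either limit alone is enough to require full SCCC review.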

6.2.12 HARDWARE DESIGN CHANGE DETAILS ATTACHED (STS-BI-F1D) - This form shall be filled out by Simulator Hardware Maintenance and included in the SDC folder (if the SDC discipline is "H" or "B") before the DR's that are in the SDC can be closed out.

NOTE: Actual hours required to resolve each SDC shall be noted here. (To be initialed by the Supervisor, Hardware Maintenance)

6.2.13 SOFTWARE DESIGN CHANGE DETAILS ATTACHED (STS-BI-F1D) - This form shall be filled out by the Unit Software Coordinator and included in the SDC folder (if the SDC discipline is "S" or "B") before the DR's that are in the SDC can be closed out.

NOTE: Actual hours required to resolve each DR shall be noted here. (To be initialed by the Supervisor, SCE or his designee)

6.2.14 COMPLETED SIMULATOR DR TEST ATTACHED (STS-BI-F1B) -

FOR OTB GENERATED DR'S - This form shall be completed and signed by the Originator and/or the Unit Operations Consultant before the DR's included in the SDC can be closed out.

FOR STSB GENERATED DR'S - This form shall be completed and signed by the Originator and/or the appropriate STSB Supervisor.

6.2.15 DR(s) CLOSED OUT: DATE - This is the date that the Deficiency Report was reviewed, accepted and closed. (To be filled in by the Unit Operations Consultant)

6.2.16 CMS TRACKING SYSTEM UPDATED - This indicates that the CMS Tracking System has been updated to reflect the closed DR's. (To be initialed by the Unit Records Technician)

6.2.17 SW DATA/DOCU UPDATE REQUIREMENTS ATTACHED (STS-BI-F1F) - This form shall be completed and included in the SDC folder (if the SDC discipline is "S" or "B") before it can be transferred to the Supervisor, ASRMS for updating the design data base. (To be initialed by the Supervisor, SCE)

6.2.18 COMMENTS - This may be used by the Supervisor, ASRMS, but is not required to be filled in.

6.2.19 SDC COMPLETED - This, when signed, indicates that all requirements of the Simulator Design Change have been completed and the SDC can be closed out. (To be signed and dated by an STSB Supervisor)

6.2.20 CMS TRACKING SYSTEM UPDATED - SDC PACKAGE IN CDS - The SDC Form is initialed by the Unit Records Technician, indicating that the CMS Tracking System has been updated, that the SDC is closed, and that it is filed in CDS.

6.3 Simulator Deficiency Report Test Form Instructions (STS-BI-F1B)

NOTE: A Simulator Deficiency Report Test (STS-BI-F1B) should be completed at the time of DR submittal if adequate information is available to write the necessary steps for test acceptance. Exceptions to this may be granted with SCCC approval. The test should be written by the Originator or a responsible OTB/STSB staff member.

6.3.1 TITLE - This is the same title that appears on the Deficiency Report. (To be filled out by the Originator of the Deficiency Report)

6.3.2 SYSTEM - This is the reference plant system specified in the Deficiency Report. (To be filled in by the Originator)

6.3.3 DR NUMBER - The number of the specific DR that the test was written for. (To be filled out by the Unit Records Technician)

6.3.4 TEST APPROVED OTB - All tests for DR's written by OTB must be reviewed by the Unit Operations Consultant and approved by the appropriate ASOT.

6.3.5 TEST APPROVED STSB - All tests written by STSB must be approved by the appropriate STSB Supervisor.

6.3.6 TESTED SAT - All tests shall be performed and accepted by the Operator Training Branch EXCEPT those tests written by STSB.

6.3.7 DATE - The date that the test was accepted by Operator Training. (To be filled in by the person accepting the test)

6.3.8 STEP - The steps will be sequentially numbered by the person writing the test.

6.3.9 DESCRIPTION/DESIRED RESULT - This will specify how to perform the step and also the desired result. (To be filled in by the Originator)

6.3.10 COMPLETED - The person performing the test will initial each step if the desired result is achieved. If the desired result is not achieved, the SDC will be returned to the Supervisor, Hardware Maintenance and/or the Unit Software Coordinator for resolution.

6.3.11 PAGE ... OF ... - More than one form may be necessary to write the test. If only one form is used, the person writing the test will write "1 of 1" to indicate only one form was used. If multiple forms are used, the number following "OF ..." will indicate the total number of forms used.

6.4 Design Change Analysis Form Instructions (STS-BI-F1C)

6.4.1 DISCIPLINE H S - The proper letter is circled to indicate the type of change, Hardware or Software. (To be circled by the person performing the analysis)

6.4.2 SDC NO - This is the SDC referenced in the analysis. (To be filled out by the person performing the analysis)

6.4.3 PROPOSED CHANGE - This is a short description of the work to be performed. Also, any materials necessary to complete the HW section of the DR should be listed here. (To be filled out by the person performing the analysis)

6.4.4 ESTIMATED COST - This is only an estimate. It is used by the SCCC as a guide for costs associated with the SDC and may impact the approval of the SDC due to budget considerations. If no cost is associated with the SDC, then write "NA" in the space. (To be filled out by the person performing the analysis)

6.4.5 SIMULATOR DOWNTIME REQUIRED - This is the estimated simulator downtime required to install the design change. It is used for coordination between OTB and STSB. (To be filled out by the person performing the analysis)

6.4.6 MANPOWER REQUIREMENTS - This is an estimate of the manhours required to perform the design change. This is used by the STSB supervisors to project the department work load. (To be filled out by the person performing the analysis)

6.4.7 ANALYSIS PERFORMED BY NAME/DATE - The person performing the analysis should print his name and the date in this space. It may be necessary for the SCCC to clarify certain items on the analysis.

6.5 Design Change Details Form Instructions (STS-BI-F1D)

6.5.1 DISCIPLINE H S - "H" should be circled if the form concerns hardware details and "S" should be circled if the form concerns software details. (To be filled out by the person performing the design change)

6.5.2 SDC NO - The number of the referenced SDC. (To be filled out by the person performing the design change)

6.5.3 DATA USED - This should list all data that was used to perform the design change. It will be useful when filling out the Software Documentation Update Requirements form. (To be filled out by the person performing the design change)

6.5.4 SPECIFIC TASK PERFORMED - The actual work should be described here. Hardware/Software personnel shall fill in the actual time it took to resolve the SDC package. This will serve as a guideline should future work be required in a specific area. (To be filled in by the person performing the design change)

NOTE: Optional for HW if Work Order Numbers are listed.

6.5.5 WORK ORDER NUMBER(S) - This is used by HW to indicate "work performed" in lieu of filling in "Specific Task Performed" above.

6.5.6 DESIGN CHANGE ON DISK - This indicates the location of the software modification. (To be filled in by the Unit Software Coordinator)

6.5.7 LOAD PATH - This indicates the load path for the installation of the software modification. (To be filled in by the Unit Software Coordinator)

6.5.8 HW MAINTENANCE/SW ENG EXECUTING CHANGE - The person that performed the design change should print his name and the date in this space.

6.6 Software Data/Documentation Update Requirements (STS-BI-F1F1, 2, 3, 4)

6.6.1 Discipline - Denotes the discipline under which the Data/Documentation is filed. (To be filled in by the responsible Software Engineer)

6.6.2 SDC No. - The number that denotes the specific SDC the Data/Documentation was compiled for. (To be filled in by the responsible Software Engineer)

6.6.3 Simulator System - All Data/Documentation pertinent to a Simulator System shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.4 Control Panel Instrumentation - All Data/Documentation pertinent to Control Panel Instrumentation shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.5 Process Computer Monitored Parameters - All Data/Documentation pertinent to PPC points shall be noted if applicable. (To be filled in by the responsible Software Engineer)

NOTE: OTB/SCE shall review the Certification Performance Test to ensure applicable data is included.

6.6.6 Instructor Interfaces - All Data/Documentation pertinent to Instructor Interfaces shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.7 Component Information - All Data/Documentation pertinent to Component Information shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.8 Simulation Diagrams - All Data/Documentation pertinent to Simulation Diagrams shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.9 Miscellaneous - All Data/Documentation pertinent to the miscellaneous section shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.10 Simulator Reference Plant Data - All Data/Documentation pertinent to the Simulator Reference Plant shall be noted if applicable. (To be filled in by the responsible Software Engineer)

6.6.11 SW ENGINEER/TECHNICAL COORDINATOR - Initials of the responsible Software Engineer/Technical Coordinator. (Initialed by the Software Engineer/Technical Coordinator)

6.6.12 Date - The date on which the check sheet was signed off by the responsible Software Engineer. (To be filled in by the responsible Software Engineer)

6.8 Simulator Reference Plant Data Instructions (STS-BI-F1H)

NOTE: This form shall be used by OTB to provide data necessary for simulator modifications when no other data is available, i.e. assumptions, special test, empirical.

6.8.1 Simulator/Reference Plant - Lists the specific Simulator/Reference Plant which the data was collected for/from. (To be filled in by the person collecting data)

6.8.2 DR# - The reference DR# for which the data was collected. (To be filled in by the person collecting data)

6.8.3 System - This is the reference plant system for which the data was collected. (To be filled in by the person collecting data)

6.8.4 Data Collected -

6.8.4.1 Date - The date on which the data was collected.

6.8.4.2 Time - The time the data was taken (use Navy time). (These items filled in by the person collecting data)

NOTE: If collected over an extended period of time, write from/to as a reference guide.

6.8.5 Data Type -

6.8.5.1 Empirical - Data collected based on observation or experience.

6.8.5.2 Special Test - Data collected as the result of a special test or evolution in the reference plant.

6.8.5.3 Assumption - May be used, if justified, when data is unavailable. (Appropriate type checked by the person collecting data)

6.8.6 Data - Specific data collected for a DR which denoted a system problem. If applicable, attachments shall be included. (To be filled in by the person collecting data)

6.8.7 Assumption, Reason and Basis - Specific data collected to verify that the assumption is correct. The reason and basis shall be explained here. (To be filled in by the person collecting data)

6.8.8 Data Collected By and Date - The person who collected the data and the date it was accomplished. (To be filled in by the person accomplishing the task)

6.8.9 Data Approved By and Date - An authorized person in OTB who is qualified to review the collected data. (To be filled in by the person reviewing the collected data)

NOTE: A comprehensive review of all data pertinent to the referenced deficiency (DR) shall be conducted prior to sign off, i.e. print revisions, coding updates, reference tables, etc. After approval, the updated data shall be placed in the applicable SDC/Data Base File by the Unit Records Technician.

6.8.10 Superseded Data - This space is used to denote if the original data has been updated. (To be filled in by the Unit Operations Consultant reviewing the collected data)

6.9 Approval Of New Load Path

6.9.1 The Unit Software Coordinator who creates the new load path (Section 6.5.7, Form STS-BI-F1D) shall enter the following information on the LOAD PATH LOG located in the simulator computer facility room.

DATE - The date that the load path is created.

TIME - The time that the load path is created.

DISK NAME - The name of the disk pack on which the created load path resides.

LOAD PATH - The name of the load path which contains the software modification resulting from DR(s) resolution (format LD.XXXXXX).

SW ENGINEER - The name of the software engineer who creates the load path.

APPROVAL - The name of the Unit Operations Consultant, OTB Instructor or ASOT who approves the load path. (To be filled in by OTB)

6.9.2 OTB shall ensure that only the approved load path will be used for training.

NOTE: The Unit HW Technician (or individual who brings up the load) shall only bring up the newest approved load path recorded on the LOAD PATH LOG.

7.0 Figures

None
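The LOAD PATH LOG fields described in 6.9.1 can be sketched as a simple record. This is illustrative only — the actual log was a paper sheet in the computer facility room — and all names here, including the sample load path value, are hypothetical; only the LD.XXXXXX name format comes from the procedure.

```python
import re
from dataclasses import dataclass

# Hypothetical sketch of one LOAD PATH LOG entry per 6.9.1.
LOAD_PATH_FORMAT = re.compile(r"^LD\.[A-Z0-9]+$")  # format LD.XXXXXX

@dataclass
class LoadPathLogEntry:
    date: str          # date the load path was created
    time: str          # time the load path was created
    disk_name: str     # disk pack on which the load path resides
    load_path: str     # load path containing the DR(s) resolution
    sw_engineer: str   # software engineer who created the load path
    approval: str      # Unit Operations Consultant, OTB Instructor, or ASOT

    def is_well_formed(self) -> bool:
        """True if the load path name matches the LD.XXXXXX convention."""
        return bool(LOAD_PATH_FORMAT.match(self.load_path))
```

An entry whose load path field reads, say, LD.MP2A01 (a made-up name) would satisfy the format check; one missing the LD. prefix would not.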

8.0 Attachments

8.1 STS-BI-F1E "Simulator Design Change Form" (Rev. 0)
8.2 STS-BI-F1A "Deficiency Report Form" (Rev. 1)
8.3 STS-BI-F1B "Simulator Deficiency Report Test Form" (Rev. 0)
8.4 STS-BI-F1C "Design Change Analysis Form" (Rev. 0)
8.5 STS-BI-F1D "Design Change Details Form" (Rev. 0)
8.6 STS-BI-F1F1, STS-BI-F1F2, STS-BI-F1F3, STS-BI-F1F4 "Software Data/Documentation Update Requirements Form" (MP1, MP2, MP3, CY)
8.7 STS-BI-F1H "Simulator Reference Plant Data" (Rev. 0)

Attachment 8.1

MARGINAL NOTE DIRECTORY

- Replace "will" with "shall"
- Assigns responsibility for specific functions to the Unit Operations Consultants
- Correction of name or title
- Changed for clarity
- New responsibility
- Change in guideline
- Added for control of data entry
- Added to address QA of "Training Loads"

_._--_y-_-.,--- LT O O FORM APPROVED BY DIRECTOR CLEAR TRA EF F ECTIVE DATE e '1-1-87 \ I.

       ~V'                                                                 SIMULATOR DESIGN CHANGE MP5535 1-07 SIMULATOR                                                         LSSO SYSTEM                          iDG NO.

DISCIPLINE: DUE DAT E HARDWARE b SOFTWARE BOTH SDC TITLE OR NO(S). HARDWARE DESIGN CHANGE ANALYSIS ATTACHED (STS-BI-FIC) i SOFTWARE DESIGN CHANGE ANALYSIS ATTACHED (STS-Bi-FIC) APPROVED- 0 NOT APPROVED COMMENTS: j p

       ~

SCCC MEMBER DATE

      '\)1 HARDWARE DESIGN CHANGE DETAILS ATTACHED (STS-Bl FID)

SOFTWARE DESIGN CHANGE DETAILS ATTACHED (STS-Bi-FID) COMPLETED SIMUL ATOR D R. TEST ATTACHED (STS-Bi-FIB) I DAT E 40psCon) D.R.(S) CLOSED OUT SDC TRACKING SYSTEM UPDATED HW DATA /DOCU UPDATE REQUIREMENTS ATTACHED (STS-BI-FIG) SW DATA /DOCU UPDATE REQUIREMENTS ATTACHED (STS-Bi-FIF) COMMENTS, DESIGN DAl A DASE UPDATED (Sup , Strn Rec 8 Serv S'Onature) DATE SDC COMPL ETED (STS Supervasor Signature) DATE I by SDC TRACKING SYSTEM UPDAT ED- SDC PACK AGE IN CDS (Urut Rec lech) i STS RI-FIE REV.0 ORIGIN AL: SDC FOLDER C AN AR Y: ORIGIN AT OR - _ _ _ _ _ _ - - . _ _ _ - ___ - - ---------O

ex g

                                                                         /                                         i    )
                                                                     -v                                              v i

FOt<M APPROVED BY DIRECTOR, N L E AR TRAININ EF F ECilVE DAT E

               - ~

s 1-1 87 J DESIGN CHANGE ANALYSIS I MP5534 REV 3-87 HARDWARE SOFTWARE PROPOSED CHANGE j I l

             's i

i l ESTIM ATED COST. $ A SIMULATOR DOWNTIME REQUIRED HOURS MANPOWER REQUIREMENTS (including Downterne) __ MANHOURS ANALYSIS PEMF ORMED BY (Pont Nume) DATE j ORIGINAL: SDC FOLDER C AN ARY: ORIGIN ATOR STS-DI-FIC j REV0

                                                                                             =  ,                                                         .. . _ ,

k _ _ _ _ _ _ _ _ _ _ - . - - a

O O FORM APPROVED BY DIRECTOR, N CLEAR TRAININ EFFECTIVE DATE

  .i g)  -
    'V ;.

V 1-1-87 DESIGN CHANGE DETAILS MP5533 REV. 3-87 DISCIPLINE. SDC NO HARDWARE O SOFTWARE DATA USED: SPECIFIC T ASK PERFORMED (include listings and drawinpt. If possible ): s

                             % HARDWARE ONLY WORK ORDER NUMBER (S)

{ l

                              > SOFTWARE ONLY DESIGN CHANGE ON DISK LOAD PATH ALL DOCUMENTS USED ARE ENCLOSED FOR INCLUSION INTO THE DATA BASE.
     ,                                                                          NOTE: A LIST OF DOCUMENTS MAY SUFFICE                                   i
     \      '

HW TECH /SW ENG. EXECUTING CHANGE (Pnnt Name) DATE I STS-Bi-FID REV.O ORIGINAL: SDC FOLDER CANARY: ORIGINATOR

                                                                                                                                                   ~
                                                                                                                                                     ~

a . i k_______.. _ _ _ _ _ k

rx FORM AP Y IF4E . UCL R TRAINING EFF ECTivE DATE g3 , 8-29-86  !

       ).

PAGE OF SIMULATOR DEFICIENCY REPORT TEST MP5499 9-86 TITLE SYSTEM TEST APPROVED OT: STS: DR NUMBER TESTED SAT. lDATE STEP DESCRIPTION / DESIRED RESULT COMPLETED i l l l l l l l l l . i 1 4 f i 1 l \ l l -DO NOT COPY- STS-BI-F1 B l REV.0 m (

r o U o\

                                                                                                                                                       ;q     i FORM APPROVED B " PFCTOR NU        AR TRAIN   G                                                 EF F ECTIVE DATE                              )

g- W M 7 87 y' .. '+ SIMULATOR REFERENCE PLANT DATA MP5606 5 87 DH# SIMUL ATORJREFERENCE PLANT SYST EM h DATE TIME s ' DATA COLLECTED (NA IF ASSUMPTION) D AT A T YPE O EMPIRICAL SPECI AL TEST O ASSUMPTION DATA lNO TE:ANY AT TACHMENTS MUS T BE REFERENCED IN 1H'S SE CIION AND 1WO I?) COPIE S INCLUDED) { V , j. ASSUMPTION (If ABOVE DAlA IS AN ASSUMP1 ION. THE RE ASON AND DAde% MVS T BE WRillE N HL REI 1l D AT E DA1 A COLLECl ED BY DATA APPHOVE D BY (OTO) DATE t> 0 NO YES THIS DATA SUPERCEDES DATA ON PREVIOUS STS-BI-F1H. DR# (STS) ___ STS-DI-F 1H l REV O ORIGINAL: SDC F OLDER CANARY: DATA BASE FIL E MW'" b%%.

        - - - - - - - .         _ _ _ _ _ - - - - -       --           - - _ _ = -         - - - . - - _ _ _ _ _ - - - _ - - - - _ - - _ _ _ . . - . - _ _ - _ - _ _ - _ _ . - - - - _ - - -                                             - ---_-_- -_.- _

b & (J~ - ' FORM APPROVED BY DIRECT NU LEAP EFFECTIVE DATE

                                                    %Q.s*                           { RAINING 4                                                                                                                                                   8-29-86 DEFICIENCY REPORT MP5498 'REV 1-87 SIMULATOR                                                  DAT E WRITTEN                                                                                             DATE DUE                         DR#

ORIGINATOR PLANT SYS. LSSD SYS. PRIORITY PANEL COMPONENT DISCIPLINE PDCR# H Os e TYPE A ADD DELETE C CHANGE M MODIFY REPLACE TITLE (60 Characters and Spaces) STATE OF SIMULATOR TEMPORARY SNAPSHOT IC DESCRIPTION REFERENCES j REVIEWED BY TRAINING (OT). DATE CLOSED REVIEWED BY ENG (STS):

                                                                                                                                                   -DO NOT COPY-                                                                                            STS-el-F1 A REV,1 ORIGINAL: SDC FOLDER              CANARY: CDS                                                            PINK: OPERATOR TRAINING                                  GOLDENROD: OPS CONSULTANT                        -


[Form OP5067, Rev. 2-88: SOFTWARE DATA/DOCUMENTATION UPDATE REQUIREMENTS (STS-BI-F1F1 through STS-BI-F1F4, Rev. 0), approved by Director - Nuclear Training, effective 3-1-88. Header fields: discipline and SDC No. The software engineer/technical coordinator checks Y or N/A, whichever is applicable, for each item: simulator system (simulator description, design references, assumptions, simplifications); control panel instrumentation (meters, recorders, controllers, switches, lights, miscellaneous, annunciators); instructor interfaces (malfunctions, remote functions); component information (air-operated valves, solenoid/motor valves, pump motors, transmitters for meters, recorders and controllers); simulation diagrams (latest revision); simulator reference plant data (STS-BI-F1H); process computer monitored parameters (PPC points); miscellaneous (I/O override, annunciator override); and comments. The form certifies that all pertinent documents are enclosed for inclusion into the design data base, signed and dated by the software engineer/technical coordinator. Distribution: Original - SDC folder; Canary - data base file.]

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-5.02
RETEST GUIDELINES

Responsible Individual: Manager, Operator Training
Approved: Director, Nuclear Training
Revision: 0
Date: 4/13/89
SCCC Meeting No:

1.0 PURPOSE

1.1 To ensure that adequate testing is performed whenever a modification is made to the simulator that affects its fidelity relative to the reference plant or its functional operation as a simulator.

1.2 To ensure that Simulator Certification Documentation and Simulator Initial Conditions remain up to date as Deficiency Reports (DR's) are closed out.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), the Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR 55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 NUREG-1258, December 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - A form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record identified deficiencies between the simulator and reference plant.

4.2 Simulator Design Change (SDC) - A documentation package consisting of relevant DR's and all forms indicated on STS-BI-F1E which is designed to track the resolution of DR's and ensure that ANSI/ANS 3.5-1985 and NRC RG 1.149 requirements are satisfied.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 1 of 12


4.3 Normal Plant Evolutions - Evolutions that the simulator shall be capable of performing, in real time, that simulate routine reference plant evolutions. See NSEM 4.10 for a detailed list of these evolutions.

4.4 Surveillance Testing - Operational testing on safety related equipment or systems. See NSEM 4.10 Figure 7.3 for a detailed list of surveillances that the simulator is capable of performing.

4.5 Simulator Operating Limit - A given simulator condition beyond which simulation is unrealistic or inaccurate and negative training may be provided. Simulator operating limits may be imposed due to plant design limits, computer code model limits, or observed anomalous response. Refer to NSEM 4.08 for additional information.

4.6 Simulator Instructor Guide (SIG) - A training document outlining the sequence of events for a simulator training session. SIG's also contain additional information for the instructor conducting the session.

4.7 Malfunction - A specific equipment failure which produces discernible indications in the Control Room that replicate the same equipment failure should it occur in the reference plant. Specific preprogrammed malfunctions are available at the simulator instructor station.

4.8 Major Malfunction - Those malfunctions which produce extensive integrated effects in a number of plant systems and which require complicated analysis to verify acceptable response.

4.9 Initial Conditions (IC's) - A set of analog/digital points that are stored on the simulator's computers so that a starting point is available for a simulator session. Physical components (handswitches, relays, etc.) must also be manipulated to match the analog/digital initialization points (switchcheck).

4.10 Certified IC - An IC which has been reviewed by an SRO qualified instructor and verified to have control board and remote function conditions consistent with those the reference plant would have under the same conditions.

4.11 System Test - A test developed for each modeled plant system that ensures proper response of all control board instrumentation, controls, annunciators, PPC points, remote functions, flowpaths and components that are associated with an individual plant system.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 2 of 12

4.12 Remote Function - An instructor initiated input to the simulator model which will provide the same discernible effects as the corresponding manual operation in the reference plant.

4.13 Certified Remote Function - Those remote functions which have been tested to work correctly and may be used in simulator training and exams.

4.14 "Cause & Effect" Document - A description of the simulator response (effect) to the insertion of a specific malfunction or malfunctions. Each malfunction description also contains the physical "cause" of the malfunction as well as a description of the significant effects on plant operation due to the malfunction.

4.15 Performance Test - A defined group of tests conducted to verify a simulation facility's performance as compared to actual or predicted reference plant performance. A performance test is required for initial certification and for every subsequent four year period in order to maintain certification. Performance testing for certification maintenance is intended to be an on-going process with approximately 25% of the testing performed during each year of the four year cycle.
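The rolling four-year performance test cycle (roughly 25% of the testing performed each year) can be pictured as a round-robin assignment of tests to cycle years. The sketch below is purely illustrative; the test identifiers and the assignment rule are hypothetical and not part of this procedure.

```python
# Illustrative sketch: spread a set of performance tests evenly across a
# four-year certification cycle (~25% per year), per definition 4.15.
# Test identifiers are hypothetical placeholders.

def schedule_performance_tests(tests, cycle_years=4):
    """Assign each test to one year of the cycle, round-robin."""
    schedule = {year: [] for year in range(1, cycle_years + 1)}
    for i, test in enumerate(tests):
        schedule[(i % cycle_years) + 1].append(test)
    return schedule

tests = [f"SYS-TEST-{n:02d}" for n in range(1, 9)]  # 8 hypothetical tests
plan = schedule_performance_tests(tests)
for year, group in plan.items():
    print(year, group)  # each year carries 2 of 8 tests = 25%
```

With eight tests, each year of the cycle carries two, so the whole set is exercised once per four year period.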

4.16 Operability Testing - A defined group of tests conducted to verify:
1. The overall completeness and integration of the simulator model,
2. Steady state performance of the simulator to that of the reference plant,
3. Simulator performance for a benchmark set of transients against established criteria.

Operability testing is a subset of the Performance Test and is required annually for maintenance of certification.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 3 of 12
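The steady state comparison in item 2 of the operability test definition can be illustrated with a short sketch that flags any simulator parameter falling outside an instrument range/error tolerance relative to the reference plant. The parameter names, values, and tolerances below are hypothetical, not taken from NSEM-4.09.

```python
# Illustrative sketch of a steady-state operability comparison (4.16, item 2):
# each simulator parameter is compared to the reference plant value and
# flagged if the difference exceeds the instrument range/error tolerance.
# All names, values, and tolerances are hypothetical.

def steady_state_check(results):
    """results: list of (name, simulator_value, reference_value, tolerance)."""
    failures = []
    for name, sim, ref, tol in results:
        if abs(sim - ref) > tol:
            failures.append(name)
    return failures

data = [
    ("RCS pressure (psia)", 2248.0, 2250.0, 15.0),
    ("SG level (%)",          64.8,   65.0,  2.0),
    ("Feed flow (Mlb/hr)",     3.9,    3.5,  0.2),  # outside tolerance
]
print(steady_state_check(data))  # ['Feed flow (Mlb/hr)']
```

Any flagged parameter would then be dispositioned through the DR process rather than handled ad hoc.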


4.17 "Pen and Ink" Change - A temporary handwritten update to a Simulator Certification document as opposed to a typed revision of that document. When the word "revision" is used in this document, a "pen and ink" change is acceptable. As discussed under ASOT responsibilities, the ASOT shall determine the requirements for timing of formal typed revisions due to "pen and ink" changes.

4.18 Design Limits - Extreme values for specified plant parameters. Design limits are obtained from engineering design and accident analysis documents, e.g., maximum RCS pressure, peak containment pressure, etc.

4.19 Model Limits - Physical conditions which cannot be simulated by the model coding, e.g., critical pressure and temperature, core melt, clad melt, etc.

4.20 Anomalous Response - Simulator response which violates the physical laws of nature or differs greatly from expected response. Expected response may be based on plant data, accident analysis, or best estimate evaluation.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor Operator Training (ASOT)

5.1.1 Responsible for approving DR retests of OTB initiated DR's.

5.1.2 Responsible for assigning the Operations Consultant or other operator instructor(s) to review NSEM-5.02 Form 7.1 after each DR/SDC is retested but prior to closeout of the DR/SDC.

5.1.3 Responsible for approving the completed NSEM-5.02 Form 7.1.

5.1.4 Responsible for determining the frequency of formal typed revisions of various performance test documents as a result of "pen and ink" changes.

5.1.5 Overall responsibility for ensuring Simulator Certification Documentation remains up-to-date.

5.1.6 The ASOT is responsible for the final decision on whether a simulator deficiency is severe enough to constitute a Simulator Operating Limit requiring implementation of NSEM 4.08 Sections 6.4, 6.5, 6.6 and 6.7.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 4 of 12

5.1.7 Responsible for maintaining a location for working copies of simulator certification performance tests prior to transmittal to Controlled Document Storage.

5.2 Operations Consultant/Operator Instructors (As Assigned)

5.2.1 Responsible for writing and performing OTB initiated DR retests.

5.2.2 Responsible for implementing this procedure after DR's/SDC's are retested but prior to closeout of the DR/SDC, and for signing Form 7.1 when complete.

5.2.3 Responsible for performing "pen and ink" changes or revisions to performance test documents and indexes and updating/reshooting certified and non-certified IC's, as required by this procedure.

5.2.4 Responsible for running performance tests on the simulator to investigate simulator performance.

5.2.5 Responsible for placing the completed, signed and approved NSEM-5.02 Form 7.1 in the SDC package for closeout.

5.2.6 Responsible for recommending at the end of each year reference plant control boards that will need to be photographed.

5.2.7 Responsible for updating all controlled copies of Malfunction Cause & Effect Descriptions and Indexes as changes are made.

5.3 STSB

5.3.1 Responsible for providing assistance as necessary when issues of Simulator Operating Limits are under review.

5.3.2 Responsible for taking control room pictures at the reference plant on a yearly basis.

5.3.3 Responsible for performing Real Time tests, when needed.

5.3.4 Responsible for making recommendations to OTB on the scope and complexity of a DR retest.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 5 of 12


6.0 INSTRUCTIONS

6.1 General Discussion

6.1.1 The retest associated with a DR shall be written such that the extent of the retest is commensurate with the magnitude and complexity of the changes being contemplated. ANSI/ANS 3.5 (1985) requires: "Testing shall be conducted and a report prepared ... If simulator design changes result in significant configuration or performance variations. When a limited change is made, a specific performance test on the affected systems and components shall be performed."

The guidance contained in this procedure for keeping simulator certification performance tests valid may also be used to provide guidance for writing the DR retest. Keeping Simulator Certification Performance Tests valid may be viewed as the minimum scope of the retest. The ASOT has final approval over the scope and content of a retest.

6.1.2 After a Deficiency Report (DR) is implemented on a simulator and retested as acceptable, simulator certification documentation and simulator Initial Conditions (IC's) may require updating. After the completion of the DR retest but prior to closeout of the SDC package, Form 7.1 shall be reviewed for any impact to simulator certification documentation or Simulator Initial Conditions. Form 7.1 lists 11 specific areas requiring review. Each of the 11 areas is discussed in detail in Sections 6.2 through 6.12 below. The completed Form 7.1 shall be signed by the individual(s) who have completed the checklist, approved by the ASOT, and placed in the SDC package as documentation of the review.

6.1.3 A "yes" or "no" determination needs to be checked for each of the 11 items listed on Form 7.1. A "no" requires no further action. A "yes" will require the action that is listed in the applicable sections below.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 6 of 12
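The eleven-item Form 7.1 review reduces to a lookup: each "yes" answer directs the reviewer to the corresponding section of this procedure. The sketch below is purely illustrative; the abbreviated item wording and the data structure are hypothetical, not part of the procedure.

```python
# Illustrative sketch of the Form 7.1 determination in 6.1.2/6.1.3: eleven
# yes/no items, where each "yes" points to a section of this procedure.
# Item wording is abbreviated; the mapping mirrors Sections 6.2 - 6.12.

FORM_7_1_ITEMS = {
    "System Test update required":            "6.2",
    "IC's need reshooting":                   "6.3",
    "Cause & Effect descriptions/indexes":    "6.4",
    "Malfunction tests need updating":        "6.5",
    "Certified remote functions affected":    "6.6",
    "Simulator Operating Limits affected":    "6.7",
    "Annual operability testing affected":    "6.8",
    "Normal ops/surveillance affected":       "6.9",
    "Instructor station tests affected":      "6.10",
    "Physical fidelity affected":             "6.11",
    "Real time performance affected":         "6.12",
}

def sections_requiring_action(answers):
    """answers: dict of item -> True ('yes') / False ('no')."""
    return [FORM_7_1_ITEMS[item] for item, yes in answers.items() if yes]

answers = {item: False for item in FORM_7_1_ITEMS}
answers["System Test update required"] = True
answers["IC's need reshooting"] = True
print(sections_requiring_action(answers))  # ['6.2', '6.3']
```

A "no" on every item would return an empty list, i.e., no further action before SDC closeout.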

6.2 System Test Update Required

6.2.1 Changing an existing System Test

6.2.1.1 Not all DR's will result in a need to change a system test. DR's that invalidate existing wording or system test results shall require changing the system test. The addition of new annunciators, hardware, or flowpaths shall result in changing or adding to a system test.

6.2.1.2 For the purpose of completion of Form 7.1, the SDC may be closed as soon as a "pen and ink" change is made to the respective system test. For System Tests, another acceptable approach is to attach a copy of the completed DR and/or retest to the system test for incorporation at the next typed revision. Actual timing of typed revisions of a system test is at ASOT discretion.

6.2.2 Adding or deleting an entire System Test

Should it be necessary to add or delete an entire System Test, revision will be necessary to NSEM-4.01 Attachment 8.1, 8.2, 8.3 or 8.4 (as appropriate), NSEM-4.07 Form 7.1, and NSEM-4.07 Attachment 8.1, 8.2, 8.3 or 8.4 (as appropriate).

6.2.3 If a system test is performed in its entirety for a DR retest, credit may be taken for it in fulfilling the once per four year performance requirement of NSEM-4.07.

6.3 Reshoot Certified (and/or Non-Certified) Initial Conditions (IC's)

6.3.1 If a DR results in the need for reshooting of certified IC's, the SDC shall not be closed until all certified IC's have been reshot. Conduct of training prior to reshooting of certified IC's is addressed in NSEM-4.02 Section 6.11. Updating of all or some non-certified IC's is at the discretion of the Operations Consultant/assigned Operator Instructor or ASOT.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 7 of 12

6.4 Malfunction Cause & Effect Descriptions and Index

6.4.1 Changing an existing Malfunction Cause and Effect Description

6.4.1.1 If a DR on a malfunction results in a change to an existing Malfunction Cause and Effects description, it shall be revised. The Malfunction Cause and Effects index shall also be reviewed and revised as necessary. Refer to NSEM-4.01 Section 6.3 for details on content required for a Malfunction Cause and Effects description. Also review the malfunction index on NSEM-4.07 Form 7.1 and Attachment 8.1, 8.2, 8.3 or 8.4 (as appropriate) and revise as necessary. The malfunction test shall also be reviewed, since a change to a Malfunction Cause and Effects description may also affect its respective test procedure.

6.4.2 Adding or deleting a Malfunction Cause and Effect Description

If a malfunction is added or deleted, and therefore a Malfunction Cause and Effect Description is added or deleted, the malfunction index shall be revised, NSEM-4.07 Form 7.1 shall be revised, NSEM-4.07 Attachment 8.1, 8.2, 8.3 or 8.4 (as appropriate) shall be revised, and the specific Malfunction Cause and Effect description shall be added or deleted as appropriate. Refer to NSEM-4.01 Section 6.3 for a description of what information is required for Cause and Effects description content if a new Malfunction Cause and Effects description needs to be written.

6.4.3 Revised Malfunction Cause and Effects descriptions and/or indexes shall be placed in all other copies of the specific unit's Malfunction Cause and Effects Descriptions Book. The ASOT and Operations Consultant shall agree on the number and location of copies of the unit specific Malfunction Cause and Effect descriptions. The Operations Consultant shall provide a copy of all Malfunction Cause and Effect changes to the Unit Software Engineer.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 8 of 12

6.5 Certified Malfunction Tests

6.5.1 Changing an existing malfunction test

If a DR on a malfunction results in a change to an existing malfunction test, it shall be revised. The Malfunction Cause and Effects description for that malfunction shall also be reviewed for changes and revised as necessary.

6.5.2 Adding or Deleting a Malfunction

If a DR results in adding or deleting a certified malfunction, then a test procedure must be written (if adding a new malfunction) or deleted (if deleting a currently certified malfunction). The Cause and Effects index shall be updated, as well as NSEM-4.07 Form 7.1 and NSEM-4.07 Attachment 8.1, 8.2, 8.3 or 8.4 (depending on unit). A Malfunction Cause and Effects description will also need to be written or deleted (as appropriate). Refer to NSEM-4.04 or NSEM-4.05 for how to write a new malfunction test procedure. NSEM-4.04 will be used if the new malfunction is categorized as a "major malfunction" as defined in NSEM-4.04; otherwise NSEM-4.05 will describe how to write the new malfunction test procedure.

6.5.3 For the purpose of completion of Form 7.1, the SDC may be closed as soon as a "pen and ink" change or revision is made to the respective malfunction test and Cause and Effect description.

6.5.4 Should a malfunction test be performed in its entirety as part of the retest, credit may be taken for it in fulfilling the once per four year performance requirement of NSEM-4.07.

6.6 Certified Remote Functions

6.6.1 Changing an existing certified remote function

If a DR results in a change to an existing certified remote function, the test procedure for that remote function, which is contained in an NSEM-4.01 system test or in NSEM-4.03 tests, shall be updated. NSEM-4.03 Figure 7.2 shall also be reviewed and revised as necessary.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 9 of 12

6.6.2 Adding or deleting a certified remote function

If a DR results in adding or deleting a certified remote function, NSEM-4.03 Figure 7.2 shall be revised. If deleting a certified remote function, any reference to it shall be removed from its NSEM-4.01 System Test or NSEM-4.03 test, as appropriate. If adding a new certified remote function, a test for the remote function shall be written into the appropriate NSEM-4.01 System Test or NSEM-4.03 test procedure.

6.6.3 For the purposes of completion of Form 7.1, the SDC may be closed as soon as a "pen and ink" change or revision is made to the necessary test documents or forms.

6.7 Simulator Operating Limits

6.7.1 It is possible that the successful retest or unsuccessful resolution of a DR could indicate that the simulator has a deficiency which is serious enough to implement NSEM-4.08, Simulator Operating Limits. Simulator Operating Limits are made known to the simulator instructor either by freezing the simulator when design or model limits are reached or by administratively documenting the deficiency in Simulator Guides where the problem may occur. It is unlikely that new design or model limits will need to be added to those that already freeze the simulator (NSEM-4.08 Sections 6.1 and 6.2). It is possible that deficiencies in safety related systems (NSEM-4.08 Section 6.1 and Form 7.1) could be significant enough to warrant administrative action to ensure negative training does not occur. A deficiency will be considered significant if it meets the criteria of NSEM-4.08 Section 6.3.7, 6.3.8 or 6.3.9. The ASOT may exercise judgement in this decision. A decision to implement (or investigate) shall result in performing NSEM-4.08 Section 6.7.

Note: Simulator Operating Limits should also be considered when DR's are submitted, since a deficiency may be serious enough to justify administrative warnings to instructors on simulator limitations.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 10 of 12

6.7.2 Closeout of NSEM-5.02 Form 7.1 and the SDC package may occur at the discretion of the ASOT even if a "yes" is determined for "Simulator Operating Limits".

6.8 Annual Operability Testing

6.8.1 DR's which change, add to, or delete any of the steady state parameters, instrument range/error, or simulator steady state results as shown on NSEM-4.09 Figure 7.1 shall result in a revision to that test.

6.8.2 DR's which affect any of the test procedures for the transient tests of NSEM-4.09 (Attachment 8.1, 8.2, 8.3 or 8.4 as appropriate) shall result in a revision to that test.

6.8.3 DR's which could affect the results of the yearly operability testing transient test procedure shall be considered by the ASOT, at his discretion, for running one or more of the transients to verify results are still within acceptance criteria, or to consider changing transient acceptance criteria.

6.9 Normal Operations and Surveillance

6.9.1 If a DR results in the possibility that a normal operating procedure (see NSEM-4.10, Figure 7.1) cannot be used on the simulator, or that one of the surveillances on NSEM-4.10 Figure 7.3 cannot be used on the simulator, then test that portion of the normal operating procedure or surveillance to ensure it can be used on the simulator.

6.9.2 NSEM-4.10 Attachment 8.1, 8.2, 8.3 or 8.4 (as appropriate) should also be reviewed to ensure that no changes are necessary to the normal operations and surveillance test procedure and/or sequence.

6.10 Instructor Station Testing

6.10.1 If a DR affects any of the following features of the instructor station, then that portion of the instructor station test shall be reviewed for impact: Backtrack, Fastime, Slowtime, Boolean Triggers, Composite Malfunctions, Simulator Freeze, Snapshot ability, Annunciator Override, Crywolf, DI/DO/AI/AO Override capability.

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 11 of 12

6.10.2 If the DR is the result of a new hardware addition, ensure that the new hardware AI's/DI's are contained in switchcheck and that the I/O override screens have been updated for any new AI's, AO's, DI's or DO's.

6.10.3 If the entire instructor station test is run as part of a DR retest, it will meet the NSEM-4.07 requirement for running the test once every 4 years.

6.11 Physical Fidelity

6.11.1 As discussed in NSEM 4.12, pictures will be taken at the reference plant once per year of those control panels in the reference plant that have changed. STSB personnel will take these pictures. The Operations Consultant/Assigned Operator Instructor shall informally maintain a list, as DR's are closed, to determine which reference plant control boards will need pictures taken at the end of each year. Open DR's and a walkdown of the reference plant control boards shall also be an input to this decision.

6.12 Real Time Test Verification

6.12.1 If a DR results in any doubt as to whether the simulator is running in real time, a real time test per NSEM 4.13 shall be performed by assigned STSB personnel.

6.12.2 If an entire real time test is performed per NSEM-4.13, it may be used to fulfill the once per four year requirement of NSEM-4.07.

7.0 FORMS

7.1 DR Retest Checklist

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 12 of 12
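The real time question in Section 6.12 amounts to checking that simulated elapsed time tracks wall-clock elapsed time over a test run. A minimal sketch follows; the 2% tolerance and the example durations are hypothetical illustrations only, and any actual acceptance criterion would come from NSEM 4.13.

```python
# Illustrative sketch of a real time verification in the spirit of 6.12:
# over a test run, simulated elapsed time should track wall-clock elapsed
# time within some tolerance. The 2% tolerance here is hypothetical.

def runs_in_real_time(sim_seconds, wall_seconds, tolerance=0.02):
    """True if simulated time tracks wall-clock time within `tolerance`."""
    return abs(sim_seconds - wall_seconds) <= tolerance * wall_seconds

print(runs_in_real_time(598.0, 600.0))  # small drift over a 10-minute run
print(runs_in_real_time(540.0, 600.0))  # simulator running 10% slow
```

A result outside tolerance would be documented on a DR and routed to STSB per 6.12.1.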

FORM 7.1

DR CLOSEOUT CHECKLIST

When retest of a DR is complete and prior to closeout of its associated SDC package, ensure the following is considered:

YES ___  NO ___  Is a System Test Update Required? (See Section 6.2 of this procedure for additional information)

YES ___  NO ___  Do Certified or Non-certified IC's need reshooting? (See Section 6.3 of this procedure for additional information)

YES ___  NO ___  Do Malfunction Cause and Effect(s) Description(s) or Index(es) need updating? (See Section 6.4 of this procedure for additional information)

YES ___  NO ___  Do any Malfunction Tests need updating? (See Section 6.5 of this procedure for additional information)

YES ___  NO ___  Does this affect Certified Remote Functions? (See Section 6.6 of this procedure for additional information)

YES ___  NO ___  Does this affect Simulator Operating Limits? (See Section 6.7 of this procedure for additional information)

YES ___  NO ___  Does this affect Annual Operability Testing? (See Section 6.8 of this procedure for additional information)

YES ___  NO ___  Does this affect Normal Operations or Surveillance Capabilities? (See Section 6.9 of this procedure for additional information)

YES ___  NO ___  Does this affect any Instructor Station Tests? (See Section 6.10 of this procedure for additional information)

YES ___  NO ___  Does this affect Physical Fidelity? (See Section 6.11 of this procedure for additional information)

YES ___  NO ___  Could this affect Real Time Simulator Performance? (See Section 6.12 of this procedure for additional information)

When complete, add this checklist to the SDC Package.

Checklist Completed by: ______________________________
                        (Ops Consultant or Operator Instructor)

Approved by: ______________________________
             (ASOT)

Rev.: 0
Date: 4/13/89
NSEM-5.02    Page: 7.1-1 of 1

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-6.01
STUDENT FEEDBACK

Responsible Individual: Manager, Operator Training Branch
Approved: Director, Nuclear Training
Revision: 0
Date: January 12, 1989
SCCC Meeting No: 89-001

1.0 PURPOSE

The purpose of this procedure is to define the methodology used to obtain student feedback regarding simulator fidelity on the Northeast Utilities simulators for Millstone 1, 2, 3 and Connecticut Yankee.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), the Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 NUREG-1258 - Describes the procedures and techniques which are employed to audit certified facilities.

3.4 INPO 86-026, Guidelines for Simulator Training, October 1986.

3.5 10 CFR 55.45, Operating Tests.

4.0 DEFINITIONS

4.1 Reference Plant - The specific nuclear power plant from which the simulator control room configuration, system control arrangement and data base are derived.

4.2 Fidelity - The degree of similarity between the simulator and the equipment which is simulated. It is a measurement of the physical characteristics of the simulator (hardware fidelity) and response of the equipment (functional fidelity).

4.3 Deficiency Report (DR) - A form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record deficiencies between the simulator and reference plant.

Rev.: 0
Date: 1/12/89
NSEM-6.01    Page: 1 of 5

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor, Operator Training (ASOT)

Overall responsibility for identifying differences between the simulator and the reference plant.

5.1.1 Responsible for determining the frequency of distributing student assessment forms during simulator training, after considering NTM and TPIP requirements.

5.1.2 Responsible for assigning Operator Instructors/Operations Consultant to research and document simulator deficiencies based on student feedback comments.

5.1.3 Responsible for ensuring an annual survey is conducted to solicit student comments on simulator fidelity, or assigning Operator Instructors to do so.

5.2 Operator Instructors/Operations Consultants

5.2.1 Responsible for identifying and documenting any observed differences between the simulator and its reference plant which have training impact.

5.2.2 Responsible for explaining to students their role and responsibilities in the feedback process.

5.2.3 Collect Student Assessment Forms at the conclusion of training.

5.2.4 Responsible for researching student feedback comments.

5.2.5 Assigned Operator Instructors/Operations Consultants are responsible for collating results of the annual survey of student feedback.

6.0 INSTRUCTIONS

6.1 Obtaining student feedback on simulator fidelity via NTM-2.05, Student Assessment, Figure 7.2.

Rev.: 0
Date: 1/12/89
NSEM-6.01    Page: 2 of 5

6.1.1 At the completion of each "block" of simulator training, all students shall be provided an opportunity to comment on simulator fidelity using the NTM-2.05, Student Assessment, Figure 7.2. The definition of a "block" of training shall be determined by the ASOT, based on NTM requirements, TPIP requirements, and his overall judgement of the appropriate frequency for student feedback forms.

6.1.2 Instructors responsible for distributing the student assessment forms to the trainees shall encourage comments on simulator response and hardware fidelity with the reference plant.

6.1.3 Student assessment forms containing comments on simulator response and/or hardware fidelity shall be forwarded to the applicable Unit Assistant Supervisor, Operator Training (ASOT), for review.

6.1.4 The ASOT shall review trainee comments on simulator fidelity. He shall assign the Operations Consultant/Operator Instructor to review/research those deficiencies which could result in simulator modifications.

6.1.5 A determination of whether to make a change to simulator hardware or software shall be made by the ASOT, Operations Consultant and assigned Instructors by considering the following:

1) If a licensee comment is incorrect, no further consideration need be given.

2) Assuming the licensee comment to be correct, consider the following:

o The impact of the discrepancy on the operators', or teams', ability to use normal, abnormal and emergency procedures.
o The impact of the discrepancy on protecting plant personnel.
o The impact of the discrepancy on the possibility of tripping the plant.


o The impact of the discrepancy on the potential for damaging plant equipment.
o The impact on cost and schedule for the simulator.
o The impact on the overall physical fidelity of the simulator control room.
o The possibility of negative training.

3) If unable to resolve discrepancies based on the preceding criteria, the ASOT shall present the discrepancy to the TPCC for final resolution.

6.2 Obtaining Student Feedback on Simulator Fidelity via Annual Simulator Fidelity Evaluation Survey

Note: The annual Simulator Fidelity Evaluation Survey shall be performed prior to initial certification of the simulator and at least once every 2 years thereafter. Annual performance will be at the discretion of the ASOT.

6.2.1 On an annual basis, or at least once every 2 years, all licensees shall be provided an opportunity to comment on simulator response and hardware fidelity with the reference plant by responding to specific questions contained in an Annual Simulator Fidelity Evaluation Survey questionnaire, Form 7.1.

6.2.2 The Unit ASOT shall assign an Operator Instructor to distribute the Annual Simulator Fidelity Evaluation Survey to all Unit licensees.

6.2.3 Licensees' responses to the Simulator Fidelity Evaluation Survey questions shall be collected and researched by assigned Operator Instructors.

6.2.4 The ASOT, Operations Consultant and Operator Instructors shall make a determination on each licensee comment to implement, or not implement, a change to the simulator.


6.2.5 The criteria used to determine whether a change shall be made shall be the same as those specified in item 6.1.5 of this procedure.

6.2.6 All licensee responses to the annual survey questionnaire, their associated rating, and frequency of similar responses shall be listed in a formal summary letter, and shall be distributed to all licensees.

6.2.7 The disposition of feedback comments shall be included in the formal summary letter to inform the licensees of the survey results.

6.3 Disposition of Student Feedback Forms

6.3.1 NTM-2.05, Figure 7.2, student feedback forms shall not be required to be retained for any simulator certification purpose. Any DR's resulting from a student feedback form should indicate that the source of the DR was from student feedback.

6.3.2 Annual survey forms returned and the summary letter shall be part of simulator certification records and shall be forwarded to Controlled Document Storage.

7.0 FORMS

7.1 Simulator Fidelity Evaluation

FORM 7.1

TO: All Operating Licensees - ________ Unit

FROM: Assistant Supervisor, Operator Training

SUBJECT: Accuracy of Simulator - ________ Unit

Please take a few minutes to fill out and return the attached questionnaire concerning accuracy of the (Unit) Simulator. We are interested in knowing of any deficiencies you may have observed between the simulator and actual plant Control Rooms/Control Boards. Including your name on the survey form is optional.

Results of the survey will be distributed to all licensees for your information. All items brought to our attention will be addressed in the results of the survey, with a stated disposition.

Please direct your comments toward differences between the simulator and the (Unit) plant. Do not address training topics or issues.

Please send your response to (Instructor's Name) by (Date). If you have any questions, please call (Name) on (Extension).

FORM 7.1

INSTRUCTIONS

For the following questions please circle the evaluation point which BEST applies to each question.

N = No observable difference between simulator and actual plant.

1 = The difference between simulator and actual plant is observable but has LITTLE OR NO EFFECT on the operator's actions or diagnostic ability.

2 = The difference between simulator and actual plant may cause confusion or impair the operator's ability to diagnose or take the required actions PROMPTLY.

3 = The difference between simulator and actual plant causes confusion. The difference may cause an INCORRECT DIAGNOSIS and/or cause the operator to take INCORRECT ACTIONS.

S = The difference between simulator and actual plant may not affect the operator's actions or diagnostic ability, but I FEEL STRONGLY that the difference should be corrected.

Please provide the specific differences in the comment section for any question not evaluated as "N". Providing your name is not mandatory.
FORM 7.1

SIMULATOR FIDELITY EVALUATION

A. PANELS, INDICATIONS AND CONTROLS

1. Did you observe any differences between the plant and the simulator regarding panels, meters, switches, lights, scales, ranges, locations, etc.?    N 1 2 3 S

2. Did you observe any differences between the plant and the simulator regarding mimic, back shading, tags, labels, etc.?    N 1 2 3 S

B. INFORMATIONAL AIDS

1. Did you observe any differences between the plant and the simulator regarding the availability of aids and reference materials such as procedures, forms, prints, drawings, operator aids, etc.?    N 1 2 3 S

NAME (Optional): ________

FORM 7.1

C. AUDIBLE

1. Did you observe any differences between the plant and the simulator regarding the types and level of noise such as annunciators, printers, background, turbine, steam or incidental sounds?    N 1 2 3 S

D. COMMUNICATION

1. Did you observe any differences between the plant and the simulator regarding the amount and type of communication devices available?    N 1 2 3 S

NAME (Optional): ________

FORM 7.1

E. ENVIRONMENT

1. Did you observe any difference between the plant and the simulator regarding the amount and type of normal and/or emergency lighting?    N 1 2 3 S

2. Did you observe any difference between the amount, type, and arrangement of the furniture?    N 1 2 3 S

F. PLANT COMPUTER

1. Did you observe any difference between the plant and the simulator regarding the PPC input and output devices (CRT's, keyboards, printers, etc.)?    N 1 2 3 S

2. Did you observe any difference between the plant and the simulator regarding the PPC functions, capabilities and responses?    N 1 2 3 S

NAME (Optional): ________


FORM 7.1

G. SIMULATOR RESPONSE

1. Did you observe any simulator response which you believe to be incorrect?    N 1 2 3 S

2. Was there any procedural section, step or operation you were unable to perform because of limitations of the simulator?    N 1 2 3 S

NAME (Optional): ________
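The N/1/2/3/S scale defined in the Form 7.1 instructions is categorical data, and item 6.2.6 of NSEM-6.01 requires the summary letter to list each response with its rating and the frequency of similar responses. As a purely illustrative sketch (not part of the procedure; the function names and the transcription format are assumptions), tallying returned questionnaires might look like:

```python
from collections import Counter

# Rating scale from the Form 7.1 instructions (categorical, not numeric):
#   N - no observable difference
#   1 - observable, little or no effect on operator actions
#   2 - may cause confusion or impair prompt action
#   3 - may cause incorrect diagnosis and/or incorrect actions
#   S - no operational effect, but respondent feels strongly
VALID_RATINGS = ("N", "1", "2", "3", "S")

def tally_question(responses):
    """Count how often each rating was circled for one survey question.

    `responses` is a list of rating strings transcribed from returned
    questionnaires; unrecognized entries are returned separately so the
    reviewer can resolve them rather than silently dropping them.
    """
    counts = Counter()
    invalid = []
    for raw in responses:
        rating = raw.strip().upper()
        if rating in VALID_RATINGS:
            counts[rating] += 1
        else:
            invalid.append(raw)
    return counts, invalid

def summary_line(question_id, counts):
    """One line of the summary letter: question ID plus rating frequencies."""
    freqs = "  ".join(f"{r}:{counts.get(r, 0)}" for r in VALID_RATINGS)
    return f"{question_id}  {freqs}"
```

For example, `summary_line("A.1", counts)` after tallying question A.1 yields one frequency line per question, which is the shape of data item 6.2.6 asks the summary letter to report.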


NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-6.02

DEVELOPMENT OF NEW SIMULATOR GUIDES

Responsible Individual: Manager, Operator Training Branch
Approved: ____________, Nuclear Training
Revision: 0
Date: January 12, 1989
SCCC Meeting No: 89-001

1.0 PURPOSE

This procedure provides guidance for ensuring that new simulator guides, developed for training, use only certified remote functions, malfunctions, initial conditions, software systems, and installed hardware. The procedure also provides guidance when a guide requires certifying a previously uncertified item.

2.0 APPLICABILITY This procedure applies to the Nuclear Training Department (NTD), including Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-B1-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified deficiencies between the simulator and reference plant.

4.2 Simulator Operating Limit - A given simulator condition beyond which simulation is unrealistic or inaccurate. Limits may be imposed due to plant design limits, computer code model limits, or observed anomalous response.

4.3 Design Limits - Extreme values for specified plant parameters. Design limits are obtained from engineering design and accident analysis documents, e.g., maximum RCS pressure, peak containment pressure, etc.

4.4 Model Limits - Physical conditions which cannot be simulated by the model coding, e.g., critical pressure and temperature, core melt, clad melt, etc.

4.5 Anomalous Response - Simulator response which violates the physical laws of nature or differs greatly from expected response. Expected response may be based on plant data, accident analysis, or best estimate evaluation.

4.6 Simulator Instructor Guide (SIG) - A training document outlining the sequence of events for a simulator training session. SIG's also contain additional information for the instructor conducting the session.

4.7 Initial Condition (IC) - An operational status at which the simulator can be initialized. Included are time in core life, xenon, decay heat, power level, system and component operational status.

4.8 Remote Function (REM) - An instructor initiated input to the simulator model which will provide the same discernible effects as the corresponding manual operation in the reference plant.

4.9 Malfunction (MALF) - An instructor initiated input to the simulator model which will provide the trainees with similar discernible effects (initial indications and response to corrective actions) as those of a corresponding equipment malfunction in the reference plant.

4.10 Major Malfunction - Those malfunctions which produce extensive integrated effects in a number of plant systems and which require complicated analysis to verify acceptable response.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor, Operator Training (ASOT)

5.1.1 Responsible for approving completed Simulator Guide Development Check Lists, NSEM-6.02 Form 7.1.

5.2 Operator Instructors

5.2.1 Responsible for implementing the controls of this procedure when developing new simulator guides.

5.2.2 Responsible for completing NSEM-6.02 Form 7.1 for each new simulator guide developed.

5.2.3 Responsible for initiating revision of certification lists, tests, and schedules identified in this procedure when a new simulator guide requires that a previously uncertified remote function, malfunction, or IC be certified.

6.0 INSTRUCTIONS

6.1 Ensuring Remote Functions Used in New Simulator Guides Are Certified

6.1.1 Remote functions used in developing a new simulator guide should be selected from those specified on NSEM-4.03 Figure 7.1 as being certified.

6.1.2 If the scenario being developed requires using an uncertified remote function, refer to Section 6.6 of this procedure.

6.1.3 If the scenario being developed requires a remote function which does not presently exist, a DR must be submitted. Testing requirements and revision to appropriate certification lists and testing schedules will be in accordance with NSEM-5.02.

6.1.4 When all remote functions used in the completed guide have been verified to be certified, the developer shall initial the appropriate check on NSEM-6.02 Form 7.1.


6.2 Ensuring Malfunctions Used in New Simulator Guides Are Certified

6.2.1 Malfunctions used in developing a new simulator guide should be selected from those specified in the unit specific attachment to NSEM-4.07 as being certified.

6.2.2 If the scenario being developed requires using an uncertified malfunction, refer to Section 6.7 of this procedure.

6.2.3 If the scenario being developed requires a malfunction which does not presently exist, a DR must be submitted. Testing requirements and revision to appropriate certification lists and testing schedules will be in accordance with NSEM-5.02.

6.2.4 When all malfunctions used in the completed guide have been verified to be certified, the developer shall initial the appropriate check on NSEM-6.02 Form 7.1.

6.3 Ensuring New Simulator Guides Use Certified Initial Conditions

6.3.1 Instructors developing new simulator guides shall compare initialization requirements against those specified for the certified IC's (as designated on the instructor station IC tableau).

6.3.2 Select the initial condition which fulfills the initialization requirements for the guide.

Note: The guide may specify that the IC be modified, at the start of each presentation, using certified remote functions and/or reference plant procedures.

6.3.3 If initialization requirements for the scenario do not match well with any certified initial conditions, or extensive modifications are required, refer to NSEM-4.02 for development of new certified IC's.

6.3.4 When the initial condition and any modifications, per the above note, have been verified as certified, the developer shall initial the appropriate check on NSEM-6.02 Form 7.1.


6.4 Ensuring That New Guides Contain Appropriate Cautions Concerning Simulator Operating Limits

6.4.1 Instructors shall review the list of plant design and simulator model limits to determine if conditions in the scenario may cause a limit to be exceeded.

6.4.2 If any limit(s) may be exceeded, a note shall be included in the body of the guide, just prior to the initiating conditions. The note should contain the following:

6.4.2.1 The conditions/trainee responses which could cause the limit to be exceeded.

6.4.2.2 The parameter name and the numerical value of the limit.

6.4.2.3 The response of the simulator if the limit is exceeded (freeze, alarm, etc.).

6.4.2.4 Instructions regarding whether or not training should proceed and any information to be provided to trainees. Example: "SG tube primary to secondary ΔP has exceeded the design limit of 2000 psid; tube ruptures may occur if this condition was to occur."

6.4.3 Instructors shall review the list of simulator "anomalous responses" listed on NSEM-4.08 Form 7.2 to determine if events in the scenario may cause an anomalous response.

6.4.4 If required, a caution shall be included. The caution should be located in the body of the guide, just preceding the directions which could lead to an anomalous response, and contain the following:

6.4.4.1 A bold heading, e.g., CAUTION.

6.4.4.2 A brief description of the anomalous response.

6.4.4.3 The actions which would cause the anomalous response, if appropriate.

6.4.4.4 Directions to inform trainees that specific indications will not provide accurate information due to simulator modeling limitations.

6.4.5 When the entire scenario has been evaluated for simulator operating limits and any required notes and/or cautions have been included, the developer shall initial the appropriate check on NSEM-6.02 Form 7.1.

6.5 Testing Newly Developed Simulator Scenarios

6.5.1 Each newly developed simulator guide shall be test run to verify proper response from simulator hardware and software systems required by the scenario.

6.5.2 All initial conditions, remote functions, malfunctions, I/O overrides, and manipulations specified in the guide shall be checked in the same sequence and conditions as specified.

6.5.3 Where trainee actions will significantly affect simulator response, additional testing will be required. Example: simulator response to an ATWS with and without operator action to manually trip the reactor.

6.5.4 The developer shall evaluate the simulator response to ensure that the response is reasonable for existing conditions. This opportunity should also be used to ensure that the simulator response will meet the goals of the session.

6.5.5 If the test run produces unexpected/unexplainable results, perform the following:

6.5.5.1 Refer to the respective Simulation System Diagrams to verify that required flowpaths, functions, and controls are modeled.

6.5.5.2 If required items are not modeled, the guide should be modified to obtain desired results using existing modeling; or a DR could be submitted to expand the scope of simulation.

6.5.5.3 Submit a DR for unexpected/unexplainable response.

6.5.6 If the simulator guide needs to be modified due to unexpected response or inability to meet session goals, refer to the appropriate section(s) of this procedure to ensure that all items remain certified.

6.5.7 After the completed simulator guide has been successfully test run, the developer shall initial the appropriate check on NSEM-6.02 Form 7.1.

6.5.8 The completed NSEM-6.02 Form 7.1 check list shall accompany the new simulator guide during the review and approval process.

6.6 Certifying An Uncertified Remote Function

Note: A remote function is certified by development and successful completion of a testing procedure. The test procedure can be incorporated as part of the respective system test or as a separate remote function test. The first option is preferable since performance test scheduling is automatically covered.

6.6.1 Refer to NSEM-4.01 for development and execution of a remote function test as part of its respective system test, or:

6.6.2 Refer to NSEM-4.03 for development and execution of a separate test procedure for the remote function.

6.6.3 After the remote function has been successfully tested, initiate revisions to include the remote function and/or its test procedure in the following:

6.6.3.1 NSEM-4.03 Figure 7.1, Certified Remote Functions List.

6.6.3.2 NSEM-4.07 Unit Specific Appendix A, Performance Test Schedule (only if the remote function is certified per NSEM-4.03).

6.6.1 Refer'to NSEM-4.01 for development and execution of a remote function test as part Hof its respective system test, or: 6.6.2 -Refer to NSEM-4.03 for development and execution of a-separate test procedure for the remote function. 6.6.3 After the remote function has been successfully tested, initiate revisions to include the remote function and/or its test l procedure in the following: j 6.6.3.1 NSEM-4.03 Figure 7.1, Certified Remote Functions List. 6.6.3.2 NSEM-4.07 Unit Specific Appendix A, Performance Test Schedule (only if the remote function is certified per NSEM-4.03). Rev.: 0 ,4 N .. Date: 1/12/89 NSEM-6.02 Page: 8 of 9 i

6.7 Certifying An Uncertified Malfunction

6.7.1 With the concurrence of the ASOT, categorize the malfunction(s) as major or minor.

6.7.2 For major malfunctions, refer to NSEM-4.04 for development and execution of a test procedure.

6.7.3 For minor malfunctions, refer to NSEM-4.05 for development and execution of a test procedure.

6.7.4 After the malfunction has been successfully tested, initiate revisions to include the malfunction in the following:

6.7.4.1 NSEM-4.07 Unit Specific Appendix A, Performance Test Schedule.

6.7.5 Review/revise the Cause and Effect document for the malfunction per NSEM-4.04.

6.8 Disposition Of Forms Generated

6.8.1 The simulator guide development checklist, NSEM-6.02 Form 7.1, is reviewed and approved by the ASOT at the same time as the newly developed simulator guide.

6.8.2 After the ASOT releases the simulator guide for use, the simulator guide development checklist will remain with the original copy of the guide.

7.0 FORMS

7.1 Simulator Guide Development Checklist

8.0 ATTACHMENTS

None

FORM 7.1

SIMULATOR GUIDE DEVELOPMENT CHECKLIST

Guide #: ________    Developer: ________

                                                                Initials

Remote Functions:
All remote functions contained in the guide or likely to be
requested by trainees have been verified to be certified.       ________

Malfunctions:
All malfunctions contained in the guide have been verified
to be certified.                                                ________

Initial Conditions:
The initial condition(s) contained in the guide have been
verified to be certified or have been developed from
certified IC's in accordance with NSEM-4.02.                    ________

Simulator Operating Limits:
The simulator guide has been evaluated for operating limits
and/or anomalous response. Appropriate notes and/or cautions
have been included, if required.                                ________

Test Run:
The scenario contained in the guide has been test run on the
simulator. Simulator response is reasonable and expected.       ________

Completed: ____________________    Date: ________
           Developer

Approved:  ____________________    Date: ________
           ASOT
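The gating logic behind Sections 6.1 through 6.3 and this checklist is a set comparison: a guide may only use items already on the certified lists, and anything else must first go through Sections 6.6/6.7 or NSEM-4.02. A minimal sketch of that check, with invented item names standing in for the real NSEM-4.03 and NSEM-4.07 lists (this is an illustration, not part of the procedure):

```python
def uncertified_items(guide_items, certified_items):
    """Return the items a guide uses that are not on the certified list.

    An empty result for every category means the developer can initial the
    corresponding checks on the Simulator Guide Development Checklist;
    anything left over must first be certified (Sections 6.6/6.7) or the
    guide must be reworked to use certified items only.
    """
    return sorted(set(guide_items) - set(certified_items))

# Hypothetical certified lists (stand-ins for NSEM-4.03 Figure 7.1 and the
# NSEM-4.07 unit specific attachment):
certified_remotes = {"REM-001", "REM-002", "REM-010"}
certified_malfs = {"MALF-RCS-01", "MALF-FW-02"}

# Items referenced by a draft guide:
guide_remotes = ["REM-001", "REM-099"]
guide_malfs = ["MALF-FW-02"]

# needs_cert lists what must be certified (or removed) before approval.
needs_cert = uncertified_items(guide_remotes, certified_remotes)
```

Here `needs_cert` would flag the one remote function the draft guide uses that is not on the certified list, while the malfunction check comes back empty and could be initialed.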

NORTHEAST UTILITIES
NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-6.03

COLLECTION OF PLANT PERFORMANCE DATA

Responsible Individual: Manager, Operator Training Branch
Approved: ____________, Nuclear Training
Revision: 1
Date: June 27, 1989
SCCC Meeting No: ________

1.0 PURPOSE

This procedure provides guidance for the collection and maintenance of reference plant performance data.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC RG 1.149, Rev. 1, April, 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 10CFR55.45(b) - Mandates a timetable for simulator facility certification and specifies additional testing requirements.

3.4 INPO Good Practice TQ-504 - Describes techniques for effectively controlling simulator configuration.

3.5 NUREG 1258, December, 1987 - Describes the procedures and techniques which will be employed to audit certified facilities.

3.6 INPO 86-026, Guideline for Simulator Training, October, 1986.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - Form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified deficiencies between the simulator and reference plant.

4.2 Reference Plant Data Book (PDB) - A compilation of reference plant data for specific plant transients/evolutions. The data defines plant parameter response to specific initiating events or evolutions. Reference plant data may be used to verify simulator response for certification testing, for training development, or as supporting data for DR submittal.

5.0 RESPONSIBILITIES

5.1 Assistant Supervisor, Operator Training (ASOT)

5.1.1 Responsible for assigning individuals to collect and collate data for selected plant events.

5.1.2 Responsible for reviewing new PDB entries and assigning personnel to perform testing or develop training, as appropriate.

5.1.3 Responsible for ensuring that the Operations Consultant and instructors associated with the unit are made aware of new PDB entries.

5.2 Operator Instructors

5.2.1 Responsible, as assigned, for collecting and collating data for selected plant events.

5.2.2 Responsible, as assigned, for performing simulator testing based on PDB entries (certification and routine testing).

5.2.3 Responsible for reviewing new PDB entries.

5.2.4 Responsible, as assigned, for developing training based on PDB entries.

5.3 Operations Consultant

In addition to the responsibilities of 5.2.1, 5.2.2, and 5.2.3 above, the Operations Consultant is responsible for maintaining and updating the Reference Plant Data Book.

6.0 INSTRUCTIONS

6.1 Maintaining the Reference Plant Data Book (PDB)

6.1.1 The Reference Plant Data Book contains plant data for specific transients and evolutions which have occurred at the reference plant.
6.1.2 The PDB may also contain non-event miscellaneous data such as computer display formats, printouts of various computer programs, sketches, and sundry forms. Such data may be grouped under a common "miscellaneous" heading. This group shall have its own tabbed divider and index entry. Other guidance for identifying data collected should be used only as appropriate.

6.1.3 Each transient or evolution shall be maintained as a separate section containing all of the appropriate data for that event.

6.1.4 Each event shall be sequentially numbered and be preceded by a tabbed divider.

6.1.5 An index shall be maintained, listing each event by its sequential number and descriptive title.

6.1.6 The first page for each event shall be a brief description of the event. To the extent possible, the following should be included:

6.1.6.1 Time and date of event.

6.1.6.2 Initial conditions: power, MWTH, MWE, mode, burnup, major equipment status, abnormal alignments, control rod position, equipment out of service, on-going evolutions.

6.1.6.3 Initiating event, with component specific identifier if multiple similar components exist. Example: "A" RCP tripped.

6.1.6.4 Operator actions affecting plant response. Timing should be included, if known.

6.1.6.5 Failure of equipment which should have responded to the event and thereby affects plant response.

6.1.7 Copies of plant chart recorders shall be labeled to indicate the parameter name and instrument ID number.
6.1.6.4 operator actions affecting plant response. Timing should be included, if known. 6.1.6.5 Failure of equipment which should have responded to the event and thereby affects plant response. 6.1.7 Copies of plant chart recorders shall be j labeled to indicate the parameter name and  ; instrument ID number. l Rev.: 1 Date: 6/27/09 Page: 3 of 8  ! NSEM-6.03  :

    =-___--____-___-___--____________________.

L i

    ,-~g

( ) 6.1.8 To the extent possible, copies of plant chart  ;

    \/                       recorders should be marked to indicate the time of the initiating event. Add!.tional time marks or a notation of chart speed should be included. This will aid in correlating parameter responses.

Note: The above may not be possible for plant data collected prior to implementation of this  ; procedure, but should be included for subsequent data collection. 6.1.9 Where known, the time of operator actions should be indicated on plant data which is affected by the action. 6.1.10 Each page of data for a specific event shall be marked with the sequential number for the event. 6.1.11 The PDB shall be maintained in the location designated in NSEM 3.01, Figure 7.1. 6.2 Identifying Events for Incorporation in the Reference Plant Data Book [} (_ ,f 6.2.1 Potential events may be identified by one or more of the fs ' lowing: 6.2.1.1 Revie. ring SS logs, PIR's, LER's, or morning meeting notes. 6.2.1.2 Reviewing plant operating history summarica i 6.2.1.3 Verbal reports from plant I management, operators, or others. i 6.2.1.4 Reviewing plant scheduling documents for upcoming evolutions, tests, or surveillance. 6.2.2 As a minimum, the following events should be considered as potential candidates for data collection: 4 6.2.2.1 Plant trips with complications. 6.2.2.2 Major equipment malfunctions which initiate transients. l 4 n i f


6.2.2.3 Power escalation data for post-refueling plant startups.

6.2.2.4 Special tests.

6.2.2.5 Non-routine surveillance testing.

6.2.2.6 Transients initiated by operator error in control board manipulations.

6.2.2.7 Other events with good potential for training.

6.2.3 Events selected as potential candidates for data collection should be compared to existing events contained in the PDB.

6.2.3.1 Events similar to pre-existing PDB events may be deleted from consideration. Examples: plant trip due to loss of "A" RCP vs. "B" RCP; low SG level trip due to loss of "A" feed pump vs. "B" feed pump (effects are similar/symmetrical).

6.2.3.2 The ASOT should be consulted when uncertainties exist concerning the value of data collection for a given event.

6.2.3.3 The ASOT shall assign one or more individuals to collect plant data for events selected.
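The screening logic of steps 6.2.3 and 6.2.3.1 (drop a candidate when a symmetrical counterpart already exists in the PDB) can be sketched as follows. This is a minimal illustration only; the `symmetry_key` helper and the string-based event descriptions are assumptions made for the sketch, not part of this procedure.

```python
# Sketch of the candidate-screening step (6.2.3): an event whose
# description, ignoring the train/component letter, matches an
# existing PDB entry may be dropped from consideration, since the
# effects are similar/symmetrical (6.2.3.1).
import re

def symmetry_key(description: str) -> str:
    # Strip quoted train identifiers like "A" or "B" so that
    # symmetrical events (loss of "A" RCP vs. "B" RCP) compare equal.
    return re.sub(r'"[A-D]"\s*', '', description).strip().lower()

def screen_candidates(candidates, pdb_index):
    # Partition candidates into those kept for collection and those
    # dropped as duplicates of existing PDB events.
    existing = {symmetry_key(entry) for entry in pdb_index}
    keep, dropped = [], []
    for event in candidates:
        (dropped if symmetry_key(event) in existing else keep).append(event)
    return keep, dropped

pdb = ['Plant trip due to loss of "A" RCP']
new = ['Plant trip due to loss of "B" RCP',
       'Low SG level trip due to loss of "A" feed pump']
keep, dropped = screen_candidates(new, pdb)
```

Per 6.2.3.2, any case this mechanical comparison leaves uncertain is referred to the ASOT.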

6.3 Collecting Plant Data for Events Selected for the PDB

6.3.1 Data collection should be performed in a timely manner while data is readily retrievable and specifics are fresh in the minds of those involved.

6.3.2 The assigned individual shall make arrangements with the Operations Department to copy chart recorder traces when the pertinent sections are contained on chart rolls installed in recorders.

Note: Due to plant evolutions and the critical nature of certain chart recorders, it may be necessary to wait until the chart is changed out before copying.


6.3.3 Plant typer data relevant to the event should be copied.

6.3.4 The SS log should be reviewed for the event time period and pertinent facts extracted.

6.3.5 Information from PIR's, or copies of PIR's, should be collected.

6.3.6 Operations personnel involved in the event should be interviewed to determine operator actions and timing.

6.3.7 Surveillance/test data forms (if appropriate) should be copied.

6.3.8 Any other sources of pertinent data should be investigated and pertinent data extracted or copied.

6.4 Collating Data Collected

6.4.1 Using the PDB index, select the next sequential number for identifying the event.

6.4.2 Enter a brief descriptive title in the index, next to the number used. Handwritten entries are acceptable.

6.4.3 Mark all data sheets with the number for the event.

6.4.4 Mark each chart recorder trace with the parameter name and instrument ID number.

6.4.5 Mark each chart recorder trace for the time of the initiating event. Additional time marks or a chart speed notation should be added to aid in correlating parameter response.

6.4.6 Where known, indicate the time of operator action where parameter response is affected by the action.

6.4.7 Where known, indicate the time of major equipment actuations which affect parameter response.

6.4.8 Develop a brief description of the event as a cover page. Refer to step 6.1.6 of this procedure for guidance on content.

6.4.9 Data sheets for the event should be arranged in logical order, with the first page being the description developed per 6.4.8.

Note: "Logical order" may vary from event to event based on the nature of the initiating event.

6.4.10 The completed package shall be entered in the PDB with a tabbed divider identified with the event number from the PDB index.
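The indexing bookkeeping of steps 6.4.1 through 6.4.3 amounts to: take the next sequential number from the PDB index, record a brief title against it, and mark every data sheet with that number. A minimal sketch follows; the in-memory list representation is an illustrative assumption (the actual PDB is a physical book maintained per NSEM 3.01).

```python
# Sketch of the PDB index bookkeeping in steps 6.4.1 through 6.4.3.
def add_event(index, title, data_sheets):
    number = len(index) + 1           # 6.4.1: next sequential number
    index.append((number, title))     # 6.4.2: brief descriptive title
    # 6.4.3: mark all data sheets with the number for the event
    marked = [f"Event {number}: {sheet}" for sheet in data_sheets]
    return number, marked

index = [(1, "Plant trip, loss of 'A' RCP")]
num, sheets = add_event(index, "Turbine trip with complications",
                        ["chart trace TR-401", "SS log extract"])
```

The sequential number then carries through steps 6.4.4 onward and onto the tabbed divider of 6.4.10.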

6.5 Using the Reference Plant Data Book

6.5.1 New PDB entries shall be reviewed by the ASOT to determine the following:

6.5.1.1 If the event data could and should be used as a basis for simulator certification testing.

6.5.1.2 If the event should be used as a basis for developing training.

6.5.1.3 If the event should be tested for simulator response due to its unique nature.

6.5.1.4 If the event does not fall in one of the above categories, but may be useful as a historical record. Example: support future DR submittal.

6.5.2 Based on the determination of 6.5.1, the ASOT shall assign personnel (as appropriate) to perform the following:

6.5.2.1 Incorporate the event data and perform certification testing as required, per the appropriate NSEM procedure.

6.5.2.2 Develop simulator and/or classroom training based on the event.

6.5.2.3 Perform simulator response testing, simulator response evaluation, and DR submittal as required.

6.5.3 The Operations Consultant, Unit Software Coordinator, and all instructors associated with the unit should be made aware of the existing event data. This may be handled in an informal manner and documentation is not required.

7.0 FORMS

None

8.0 ATTACHMENTS

8.1 Marginal Note Directory

ATTACHMENT 8.1

MARGINAL NOTE DIRECTORY

1. Clarified Name of Procedure

NORTHEAST UTILITIES

NUCLEAR SIMULATOR ENGINEERING MANUAL

NSEM-6.04

MAJOR PLANT MODIFICATIONS

Responsible Individual: Manager, Operator Training Branch

Approved: Director, Nuclear Training

Revision: 0
Date: January 12, 1989
SCCC Meeting No: 89-001

1.0 PURPOSE

This procedure defines the process to be used to expedite simulator modifications when changes to the reference plant are of such an extensive scope as to seriously challenge the ability of the simulator to function as a plant-referenced simulator.

2.0 APPLICABILITY

This procedure applies to the Nuclear Training Department (NTD), including the Operator Training Branch (OTB), Simulator Technical Support Branch (STSB), and other Northeast Utilities (NU) organizations performing functions in support of the NU Simulator Certification Program.

3.0 REFERENCES

3.1 ANSI/ANS 3.5-1985 - This standard states the minimal functional requirements on design data and simulator performance and operability testing.

3.2 NRC Regulatory Guide 1.149, Rev. 1, April 1987 - This guide describes an acceptable methodology for certification by endorsing ANSI/ANS-3.5, 1985 with some additional requirements.

3.3 NUREG-1258, December 1987 - Describes the procedure and techniques which will be employed to audit certified facilities.

3.4 NSEM-5.01 - This procedure details the Simulator Modification Control process.

3.5 NEO 5.18 - This procedure defines the requirements for the preparation, review, approval, and control of Project Descriptions.

4.0 DEFINITIONS

4.1 Deficiency Report (DR) - form (STS-BI-F1A) used by the Operator Training Branch (OTB) and the Simulator Technical Support Branch (STSB) to record all identified simulator deficiencies between the simulator and reference plant.

Rev.: 0
Date: 1/12/89
Page: 1 of 6
NSEM-6.04

4.2 Major Plant Modification - a significant change made to the reference plant which cannot be trained around on the simulator and would result in negative training. Major plant modifications, such as the extensive component relocations/changeouts associated with a Control Room Design Review, seriously challenge the ability of the simulator to function as a plant-referenced training/examining tool.

5.0 RESPONSIBILITIES

5.1 Simulator Configuration Control Committee (SCCC)

Responsible for approval of NSEM-6.04 Form 7.1, designating a design change as a Major Plant Modification.

5.2 Manager, Operator Training (MOT)

Responsible for concurrence of NSEM-6.04 Form 7.1, designating a design change as a Major Plant Modification.

5.3 Manager, Simulator Technical Support Branch (MSTSB)

Responsible for allocation of hardware and software resources to support timely implementation of Major Plant Modifications.

5.4 Supervisor, Operator Training (SOT)

5.4.1 Responsible for allocation of unit OTB resources to support timely implementation of Major Plant Modifications.

5.4.2 Responsible, with the ASOT, for recommendation of NSEM-6.04 Form 7.1, designating a design change as a Major Plant Modification.

5.5 Assistant Supervisor, Operator Training (ASOT)

5.5.1 Responsible, with the SOT, for recommendation of NSEM-6.04 Form 7.1, designating a design change as a Major Plant Modification.


5.5.2 Responsible for scheduling simulator availability to support timely implementation of Major Plant Modifications.

5.5.3 Responsible for ensuring appropriate administrative controls are instituted if simulator training is to be delivered prior to the successful completion of a Major Plant Modification.

5.6 Operations Consultant/Operator Instructors

Responsible, as assigned, for performing activities supporting the timely implementation of Major Plant Modifications.

6.0 INSTRUCTIONS

6.1 Identifying Plant Design Changes with the Potential for Designation as Major Plant Modifications

Note: Due to the impact on NTD resources and simulator availability, it is essential that Major Plant Modifications be identified in their early stages. Awareness of forthcoming plant design changes with major impact, via normal contact with plant staff, generally precedes formal notification by a significant margin. Early awareness may also come from the project engineer fulfilling the NEO 5.18 requirement to include simulator hardware in the Project Description. At the earliest opportunity, OTB should research all major plant design changes to determine the need to implement this process.

6.1.1 Upon awareness of a forthcoming plant design change with major impact, the ASOT shall assign the Operations Consultant or an Operator Instructor to research and follow the project.

6.1.2 The individual assigned shall schedule meetings with engineering and operations personnel involved in the project.

6.1.3 To the extent possible, the assigned individual shall obtain project schedules, copies of drawings and prints, parts lists, purchase orders, and any other documents necessary to determine the scope of the project.
         '(                                                                                                                                                                                                                  Page:        3 of 6 NSEM-6.04 l

6.1.4 When the necessary documents have been obtained, the assigned individual, the ASOT, and the SOT shall meet to review the project and make the determination as to whether or not to recommend that the project be designated a Major Plant Modification. The following impacts of later implementation shall be considered:

6.1.4.1 The ability to "train around" the change and the limits that would be imposed on the scope of simulator training.

6.1.4.2 The ability of the simulator to function as a plant referenced simulator.

6.1.4.3 The potential for negative training.

6.1.5 Proceed to section 6.2 for plant design changes where implementation of this process is recommended. Otherwise the design change will be handled per NSEM-5.01.

6.2 Recommending Implementation of the Major Plant Modification Process

6.2.1 Fill out an NSEM-6.04 Form 7.1 for the plant design change. The following information shall be included:

6.2.1.1 A description of the change, including panels, systems, and controls affected.

6.2.1.2 Justification for implementing this process (training limitation, simulator fidelity, negative training potential, etc.).

6.2.1.3 Expected plant in-service date.

6.2.2 The ASOT and SOT shall review the information contained on NSEM-6.04 Form 7.1 for completeness and accuracy and sign the form.

6.2.3 The signed NSEM-6.04 Form 7.1 shall be presented to the MOT for concurrence.


6.2.4 The ASOT and SOT should brief the MOT on the change, discuss possible alternatives, and provide any additional information requested.

6.2.5 When the MOT signs NSEM-6.04 Form 7.1 indicating concurrence, proceed to section 6.3. Otherwise the plant design change will be handled per NSEM-5.01.

6.3 Approving Implementation of the Major Plant Modification Process

6.3.1 Copies of NSEM-6.04 Form 7.1 shall be distributed to the Director, NTD and the Manager, STSB for review prior to discussion at an SCCC meeting.

6.3.2 The design change shall be brought up for discussion at the next available SCCC meeting by the appropriate Operations Consultant or ASOT.

6.3.3 The SCCC shall establish an implementing date based upon training requirements, simulator availability, the plant implementation schedule, STSB resources, and the priority of other scheduled work.

6.3.4 When the Chairman signs NSEM-6.04 Form 7.1 indicating SCCC approval in designating the design change as a Major Plant Modification, proceed to section 6.4. Design changes disapproved by the SCCC will be handled per NSEM-5.01.
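Sections 6.2 and 6.3 define a fixed sign-off sequence for Form 7.1: ASOT and SOT recommendation, MOT concurrence, then SCCC approval, with a refusal at any stage routing the change back to the normal NSEM-5.01 process. That decision flow can be sketched as follows; the dictionary of signatures is an illustrative assumption, not a prescribed data format.

```python
# Sketch of the Form 7.1 sign-off chain (steps 6.2.2 through 6.3.4).
# Signatures are checked in order; a missing signature at any stage
# routes the design change to the normal NSEM-5.01 process.
SIGNOFF_ORDER = ["ASOT", "SOT", "MOT", "SCCC"]

def disposition(signatures):
    for role in SIGNOFF_ORDER:
        if not signatures.get(role):
            return "handle per NSEM-5.01"
    return "Major Plant Modification: proceed per section 6.4"
```

Note the asymmetry the procedure builds in: a disapproval does not kill the design change, it only returns it to the routine modification-control path.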

6.4 Implementing Major Plant Modifications

6.4.1 For an approved Major Plant Modification, submit a DR. Include the date at which the DR becomes priority 1 (the date when training must stop).

6.4.2 The completed NSEM-6.04 Form 7.1 shall be submitted with the DR. The form will then become a permanent entry in the SDC which addresses the DR.

6.4.3 The design change will be handled per the process specified in NSEM-5.01, with the following additional requirements.

6.4.3.1 The Manager, STSB is assigned responsibility for allocating hardware and software resources to ensure timely implementation of the Major Plant Modification.

6.4.3.2 The respective SOT is assigned responsibility for allocating unit OTB resources to support timely implementation of Major Plant Modifications.

6.4.3.3 Preparation and development work may begin immediately. However, due to the extensive impact on NTD resources and simulator availability, hardware and software changes shall not be made to the simulator until the respective modifications to the reference plant are installed in final form.

6.4.3.4 Simulator training shall not be delivered after the plant in-service date until the respective simulator modifications have been completed and successfully tested. As an alternative, administrative controls may be instituted to limit the simulator curriculum to prevent negative training.

7.0 FORMS

7.1 Major Plant Modification Designation and Justification

FORM 7.1

MAJOR PLANT MODIFICATION DESIGNATION AND JUSTIFICATION

Unit:                    Expected Plant In-Service Date:

Plant Design Change Description:


FORM 7.1

MAJOR PLANT MODIFICATION DESIGNATION AND JUSTIFICATION

Justification:

RECOMMENDATION:                         DATE:
                 ASOT

RECOMMENDATION:                         DATE:
                 SOT

CONCURRENCE:                            DATE:
                 MOT

APPROVAL:                               DATE:
                 SCCC

SCCC MEETING NO.:
