ML20076L491

Forwards Detailed Control Room Design Review Per Requirements of Generic Ltr 82-33, Suppl 1 to NUREG-0737 Requirements for Emergency Response Capability. Reg Guide 1.97 Implementation Rept Will Be Submitted by 831101
Person / Time
Site: Marble Hill
Issue date: 07/15/1983
From: Shields S
PSI ENERGY, INC. A/K/A PUBLIC SERVICE CO. OF INDIANA
To: Eisenhut D
Office of Nuclear Reactor Regulation
References
RTR-NUREG-0737, RTR-NUREG-737 GL-82-33, SVP-0108-83, SVP-108-83, NUDOCS 8307190209


Text

PUBLIC SERVICE INDIANA
S. W. Shields, Senior Vice President, Nuclear Division

July 15, 1983
SVP-0108-83

Mr. D. G. Eisenhut
Director, Division of Licensing
U. S. Nuclear Regulatory Commission
Washington, D. C. 20555

Docket Nos.: STN 50-546, STN 50-547
Construction Permit Nos.: CPPR-170, CPPR-171

Marble Hill Nuclear Generating Station - Units 1 and 2
Generic Letter No. 82-33

Dear Mr. Eisenhut:

On April 15, 1983, Public Service Company of Indiana, Inc. (PSI) provided a response to Generic Letter 82-33, "Supplement 1 to NUREG-0737 - Requirements for Emergency Response Capability." That response contained schedule dates for submittals to the Nuclear Regulatory Commission (NRC) and completion of the subject activities. The purpose of this correspondence is to provide the submittal date of the Regulatory Guide 1.97 (Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant and Environs Conditions During and Following an Accident) planned implementation report and to transmit the Marble Hill Detailed Control Room Design Review (DCRDR) program plan. The Marble Hill-specific Regulatory Guide 1.97 planned implementation report will be submitted by November 1, 1983. The DCRDR program plan is attached for your review. Pursuant to Generic Letter 82-14, one (1) signed original and thirty-nine (39) copies of this letter are provided for your review. Forty (40) copies of the DCRDR program plan are also enclosed. If you have any questions regarding this matter, please contact me at your convenience.

Sincerely,

S. W. Shields

P. O. Box 190, New Washington, Indiana 47162, 812-289-3000

PUBLIC SERVICE
Letter: D. G. Eisenhut
July 15, 1983
SVP-0108-83
SWS/DJH/bak
Attachment
cc: J. G. Keppler, J. E. Konklin, J. F. Schapker

DETAILED CONTROL ROOM DESIGN REVIEW PROGRAM PLAN
FOR THE MARBLE HILL GENERATING STATION

Submitted by: Public Service Indiana
In Response To: GENERIC LETTER 82-33, NUREG-0700, AND NUREG-0801 CONTROL ROOM DESIGN REVIEWS

ARD0626/627D

PREFACE

The following document was prepared by Public Service Indiana Nuclear Systems Division, in conjunction with human engineering consultants from the ARD Corporation of Columbia, Maryland. This Detailed Control Room Design Review (DCRDR) program plan was developed by a group of specialists in the fields of plant operations (licensed operations), instrumentation and control engineering, nuclear engineering, training, and human engineering. It is intended to outline the procedures which will be adopted to satisfy the requirements of a DCRDR.

TABLE OF CONTENTS

1.0 REVIEW PLAN
  1.1 Introduction
  1.2 Approach
  1.3 Foundation Processes
  1.4 Investigative Processes
  1.5 Implementation and Scheduling
  1.6 Final Report
  1.7 Glossary of Terms
2.0 MANAGEMENT AND STAFFING
  2.1 DCRDR Team
    2.1.1 Program Manager
    2.1.2 Task Force Leader
    2.1.3 Task Force Administrator
    2.1.4 Human Factors Engineering
    2.1.5 Other Review Team Members
    2.1.6 Technical & Professional Specialists
  2.2 DCRDR Work Plan

3.0 DOCUMENTATION AND DOCUMENT CONTROL
  3.1 Input Documentation
  3.2 Output Documentation
  3.3 Document Control
  3.4 Data Base Management System
4.0 REVIEW PROCEDURES
  4.1 Review of Operating Experience
    4.1.1 Examination of Available Documents
    4.1.2 Control Room Operating Personnel Survey
  4.2 System, Function Review and Analysis of Control Room Operator Tasks
  4.3 Control Room Inventory
  4.4 Control Room Survey
  4.5 Verification of Task Performance Capabilities
  4.6 Validation of Control Room Functions

5.0 ASSESSMENT AND IMPLEMENTATION
  5.1 HED Assessment
  5.2 Recommendation Assessment
  5.3 Implementation and Scheduling of Recommendations
6.0 FINAL SUMMARY REPORT

7.0 BIBLIOGRAPHY
  7.1 U.S. Nuclear Regulatory Commission Regulations
  7.2 Supplementary References
APPENDIX - DCRDR Data Collection Forms

LIST OF FIGURES

1.1 DCRDR Process Flow Chart
1.2 Marble Hill DCRDR Activity Flow Chart
2.1 Structure and Management of the PSI DCRDR Review Team
2.2 DCRDR Task Responsibility Chart for Project Management
2.3 DCRDR Activity Plan and Approval Cycle
3.1 Sample Human Factors Evaluation Data Base System
4.1 Activity Flow Chart for Examining Available Documentation
4.2 Operating Experience Review Report
4.3 Activity Flow Chart for the Control Room Operating Personnel Survey
4.4 Survey Biographical Data Sheet
4.5 Task Analysis Fault Tree
4.6 Activity Flow Chart for Validation Review
4.7 Sample Lines of Workstation - Work Flow Data
4.8 Validation Review Work Sheet
5.1 Activity Flow Chart for HED Assessment and Corrective Action Implementation
5.2 Suggested Schedule for Corrective Actions

Public Service Indiana

1.0 REVIEW PLAN

1.1 Introduction

Several special inquiry groups were established by the Nuclear Regulatory Commission (NRC) to investigate the cause and consequences of the accident at Three Mile Island #2 (TMI-2). It became evident during these investigations that human error played an important role throughout the accident. Therefore, special attention was focused on issues incorporated within the discipline of human factors engineering (e.g., man-machine interface design, procedures, manning, and training) which were influential in causing or contributing to the cause(s) of the accident. The primary conclusion reached by the human factors engineering investigation was that human errors were due, in large part, to inadequate equipment design, information presentation, and operator training. The results of this study were documented in NUREG/CR-1270, "Human Factors Evaluation of Control Room Design and Operator Performance at Three Mile Island-2" (Volumes 1, 2, and 3).

Following this human factors review and the assessment of other inquiry groups, the NRC deemed it necessary that a human factors engineering review be performed on all nuclear power plant control rooms. This requirement was documented in NUREG-0660, "NRC Action Plan Developed as a Result of the TMI-2 Accident"; NUREG-0694, "TMI-Related Requirements for New Operating Licensees"; NUREG-0737, "Clarification of TMI Action Plan Requirements"; and Generic Letter No. 82-33, "Supplement 1 to NUREG-0737 - Requirements for Emergency Response Capability". Operating reactor licensees and applicants for operating licenses are required to perform a DCRDR to assess and evaluate the control room workspace, instrumentation, controls, and other equipment from a human factors engineering point of view, in accordance with Task I.D.1 of NUREG-0660.

This process takes into account both system demands and operator capabilities and then identifies, assesses, and implements control room design modifications that will correct inadequate or unacceptable items.

The DCRDR process as described in NUREG-0700 is divided into four major activities, as illustrated in Figure 1.1:

- Planning
- Review
- Assessment and Implementation
- Reporting

This report reflects PSI's efforts to satisfy the planning phase and details the PSI program plan developed to address the requirements of the remaining DCRDR activities. The overall review plan, the management and review staff, documentation and document control, review procedures, and procedures for assessing human engineering discrepancies for PSI are described.

The primary objective of the proposed DCRDR is to satisfy a licensing requirement ensuring that proper human engineering principles and practices have been incorporated into the design of the Marble Hill generating station control room.

[Figure 1.1 DCRDR Process Flow Chart: planning phase (analyze CR design review objectives, resource requirements, constraints, and related Action Plan concerns; produce design review plan and schedule report), review phase (perform review; identify human engineering discrepancies), assessment and implementation phase (analyze safety implications and ways of correcting discrepancies; work plan and schedule for improving CR human engineering; initiate CR improvement program), and reporting phase (submit report on design review and CR improvement plan)]

In addition, the integration of other NRC action items affecting the operator will aid in reducing the probability of operator error. These action items are referenced in NUREG-0801 and include:

- Items I.C.1, I.C.8, and I.C.9 of NUREG-0660, Improved Emergency Procedures; NUREG-0899, "Criteria for Preparation of Emergency Operating Procedures"
- Items I.C.6 and I.D.3 of NUREG-0660, Verification of the Correct Performance of Operating Activities; Regulatory Guide 1.47, "Bypassed and Inoperable Status Indication for Nuclear Power Plant Safety Systems"
- Item I.D.2 of NUREG-0660, Installation of a Safety Parameter Display System Console; NUREG-0696, "Functional Criteria for Emergency Response Facilities"; NUREG-0835, "Human Factors Acceptance Criteria for a Safety Parameter Display System"
- Item III.A.1 of NUREG-0660, Upgrading of Licensee Emergency Support Facilities; Regulatory Guide 1.97, "Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant Conditions During and Following an Accident"; NUREG-0814, "Methodology for Evaluation of Emergency Response Facilities"

1.2 Approach

The Marble Hill generating station is presently under construction and is not scheduled for fuel loading until June 1986. The control boards for Unit 1 are not scheduled to be completely installed until December 1983. However, it is PSI's intention to establish an operator task requirements data base by reviewing and analyzing system requirements as soon as possible. Through early development of this data base, many human engineering discrepancies (HEDs) can be identified and resolved without significant impact to PSI's construction and fuel load schedules. The data base can also be used to better address the other ongoing NRC human engineering programs previously mentioned.

To accomplish the above objectives and to comply with the guidelines established for a DCRDR in NUREG-0700, the PSI task force plans to address the DCRDR in two phases. Figure 1.2 presents the proposed activity flow chart for the Marble Hill generating station DCRDR.

[Figure 1.2 Marble Hill DCRDR Activity Flow Chart: initial phase (operating experience review; system function and task analysis; control room inventory; checklist survey; verify task performance capabilities; validate control room functions; compile discrepancy findings; determine significance of HEDs; select and design corrective actions; prioritize and schedule control room changes; implement changes; summary report), followed by a supplemental phase repeating the survey, verification, and reporting steps on the completed control room]

The initial control design review will take place over a ten-month period (Table 1.1). During this phase, the foundation processes identified in NUREG-0700, establishing the benchmarks for human engineering discrepancy identification, will be accomplished. These processes are intended to satisfy that portion of the DCRDR. In addition, portions of the investigative processes identified in NUREG-0700, using the information from the foundation processes, will be accomplished to identify HEDs associated with the control boards and the remote shutdown panel. To assist this process, a full-scale "breadboard" mock-up of the control boards and remote shutdown panel has been constructed by PSI Engineering. The boards are accurate replicas of the control room common vertical panels, the Unit 1 control boards, and the remote shutdown panel as they are presently designed. These boards incorporate the modifications made to the control boards at Marble Hill's base plant, Byron Station of the Commonwealth Edison Company, as a result of its Preliminary Design Assessment (PDA). In this initial review phase, a number of HEDs will be identified, documented, and assessed, and changes scheduled and implemented, before the actual control boards and control room become fully functional. A summary report detailing these activities and results will be prepared and submitted.

[Table 1.1 Initial Phase Milestone Chart: review activities (operating experience review, 3.3; system function and task analysis, 3.4; control room inventory, 3.5; control room checklist survey; verify task performance capabilities; validate control room functions; compile discrepancy findings), assessment and implementation activities (select/design corrective actions; schedule and implement design changes), and reporting (summary report), plotted over months 1-14 and keyed to NUREG-0700 sections]

However, because the mock-up necessarily lacks dynamic fidelity, and because of the incomplete construction stage of the control room, it is recognized that many human engineering considerations cannot be assessed during the initial review phase.

These considerations will be addressed in the final review phase. The final review phase will commence nine months before scheduled fuel load. This review will consist of a verification of the control room modifications made as a result of the initial review, as well as the completion of the review items and processes that could not be addressed in the initial review (e.g., an assessment of the control room's lighting, acoustics, and ventilation). In addition, the integration of the other NRC action items affecting the operator, such as Items I.C.6, I.D.3, and I.D.2, will be assessed and verified. HEDs identified in this phase will be assessed and a corrective action implementation schedule prepared. These activities will be documented in a supplemental report (see Table 1.2).

[Table 1.2 Final Phase Milestone Chart: complete control room checklist survey (3.6, 0700); verify control room changes (0801); verify NUREG-0660 action item integration (GL 82-33); compile discrepancy findings (3.9, 0700); determine significance and categorize HEDs (4.2, 0801; 4.2, 0700); select and design corrective actions (4.2.2, 0700; 4.3, 0801); prioritize and schedule (4.4, 0801); begin implementation (4.3, 0700); final supplemental report (4.6, 0801); interaction with NRC (5.0, 0700); finalize and verify implementation (5.0, 0801) - plotted over the nine months preceding fuel load]

1.3 Foundation Processes

As identified, industry-wide reviews of Licensee Event Reports (LERs) for similarly designed control rooms that have generic applicability will be used to identify conditions which may increase the probability of operator error affecting the safe operation of the generating station. In addition, operating personnel will be interviewed to obtain feedback based on previous operating experience. Concurrently, a systems review, function analysis, and task analysis will be conducted on the emergency systems. The procedures of these analyses will closely follow the NUREG-0700 guidelines to identify the information flow between man and machine. A control room inventory will also be prepared on a panel-by-panel basis to identify all instrumentation, controls, and equipment within the control room. This information will be compared with the requirements identified through the analysis of operator tasks.

1.4 Investigative Processes

Using the foundation processes as a basis, the investigative processes will provide the appropriate information necessary to determine the adequacy of components, tasks, and functions from a human engineering perspective. This will include a detailed control room survey of the control boards, consisting of a human factors engineering examination of components, tasks, functions, and related guidelines. This will be followed by a verification of task performance capabilities, including the verification of instrumentation and equipment availability and the verification that operator task performance is not adversely affected by the operator/control board interface. Subsequent to the verification processes, a validation of the control room functions will be conducted to determine if the functions allocated to the control room operating crew can be accomplished within the structure of the defined operating and emergency procedures and the design of the control room as it exists. Deficiencies will be identified and documented during this part of the review.

The PSI mock-ups, although highly accurate, do not provide sufficient information in all cases to permit a thorough investigative review. However, the investigative review will permit identification of HEDs in the current design in order to take corrective actions prior to the Unit 1 control boards becoming fully functional. A set of HEDs will be identified as a result of the investigative processes. Corrective modifications will then be considered, scheduled, and implemented.

1.5 Implementation and Scheduling

Upon completion of the DCRDR investigations, a review of the HEDs will be conducted. The review will serve to identify the significance of each of the HEDs, as well as provide the review team with the opportunity to determine the appropriate actions necessary to correct the HEDs. A schedule will be developed, and the control room changes will be implemented accordingly.

1.6 Final Report

At the conclusion of the control room design review, a report will be submitted which will: summarize the overall review process, describe the identified human engineering discrepancies, describe control room design improvements implemented during the course of the review, and identify and justify proposed design improvements and their schedules for implementation. The Final Report will consist of the Summary Report, to be submitted at the end of the initial review, and the Supplemental Report, to be submitted prior to fuel load.

1.7 Glossary of Terms

Because there are differences in the use of terms (even among practitioners within the same field), the following definitions are provided to reduce ambiguity.

DETAILED CONTROL ROOM DESIGN REVIEW (DCRDR): The control room design review as required by NUREG-0660, Item I.D.1, and as described in NUREG-0700 and Generic Letter 82-33.

DCRDR REPORT: Final report of the results of the DCRDR (as required by NUREG-0660, Item I.D.1).

ENHANCEMENTS: Surface modifications that do not involve major physical changes; for example, demarcation, labeling changes, and painting.

FUNCTION: An activity (or a static role) performed by one or more system constituents (people, mechanisms, structures) to contribute to a larger activity or goal state.

FUNCTIONAL ALLOCATION: The distribution of functions among the human and automated constituents of a system.

FUNCTIONAL ALLOCATION REVIEW (ANALYSIS): The examination of system goals to determine what functions they require.

Also, examination of the required functions with respect to available manpower, technology, and other resources to determine how the functions may be allocated and executed. In NUREG-0700, primarily the identification of established functions and examination of how they are allocated and executed.

HUMAN ENGINEERING DISCREPANCY (HED): A departure from some benchmark of system design suitability for the roles and capabilities of the human operator.

HUMAN FACTORS ENGINEERING: The science of optimizing the performance of people, especially in industry. Also, the science of design of equipment for efficient use by people.

LONG-TERM: Correction will be implemented according to a schedule to be developed by the station and submitted to the NRC with the final report and supplemental report.

NEAR-TERM: Correction will be implemented according to a schedule to be developed by the station and submitted to the NRC. In the final report, the schedule will indicate that the corrections will be completed by the end of the first refueling outage after NRC acceptance. In the supplemental report, the schedule will indicate that the corrections will be completed by the end of the second refueling outage after NRC acceptance.

OBJECTIVE (MISSION, GOAL): The end product of a coordinated group of activities.

OPERATOR (LICENSED): Any individual in a facility who manipulates a control or directs another to manipulate a control.

PLANT EVENT: Those plant activities which occur during normal, abnormal, and emergency operations.

PLANT SYSTEM: Group of people and/or equipment constituents linked together (e.g., CVCS, Feedwater).

PROMPT: Corrections will be made according to a schedule to be developed by the station and submitted to the NRC. In the final report, the schedule will indicate that the corrections will be completed by fuel load, assuming prompt NRC acceptance. In the supplemental report, the schedule will indicate that the corrections will be completed by the end of the first refueling outage after NRC acceptance.

SIGNIFICANT HEDs: Those HEDs which, alone or in combination with other HEDs, may increase the potential for operator error and/or may have serious impact on system performance.

SUBTASK: An activity (action step) performed by a person (or machine) directed toward achieving a single task.

SYSTEM: A whole which functions as a whole by virtue of the interdependence of its parts. An organization of interdependent constituents that work together in a patterned manner to accomplish some purpose.

SYSTEM(S) ANALYSIS: Examination of a complex organization and its constituents to define (usually, but not necessarily, in mathematical terms) their relationships, and the means by which their actions and interactions are regulated to achieve goal states.

TASK: A specific action, performed by a single system constituent (person or equipment), that contributes to the accomplishment of a function. In NUREG-0700, only tasks allocated to people, in particular to control room operators, are addressed in detail. Moreover, in accordance with Generic Letter 82-33, only tasks associated with emergency systems will be evaluated.

VALIDATION: The process of determining if the physical and organizational design for operations is adequate to support effective integrated performance of the functions of the control room operating crew.

VERIFICATION: The process of determining if instrumentation, controls, and other equipment meet the specific requirements of the tasks performed by operators.
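Taken together, the glossary's scheduling categories (PROMPT, NEAR-TERM, LONG-TERM) and the notion of significant HEDs amount to a simple classification scheme for tracking discrepancies through assessment. As a purely illustrative sketch in modern notation (the record fields, function names, and the example HED below are hypothetical and are not part of the PSI plan), such a tracking record might look like:

```python
from dataclasses import dataclass
from enum import Enum

class Schedule(Enum):
    """Correction schedules as defined in the Section 1.7 glossary."""
    PROMPT = "by fuel load (final report) or first refueling outage (supplemental report)"
    NEAR_TERM = "by first (final report) or second (supplemental report) refueling outage"
    LONG_TERM = "per a station-developed schedule submitted with the reports"

@dataclass
class HED:
    """Hypothetical record for one human engineering discrepancy."""
    ident: str
    description: str
    significant: bool  # alone or in combination, may increase potential for operator error
    schedule: Schedule

def triage(hed: HED) -> str:
    """Illustrative triage rule: significant HEDs are flagged for assessment first."""
    priority = "assess first" if hed.significant else "routine assessment"
    return f"{hed.ident}: {priority}; correction {hed.schedule.name}"

# Example usage with an invented HED (not drawn from the Marble Hill review)
example = HED("HED-001", "Ambiguous valve position labeling", True, Schedule.PROMPT)
print(triage(example))
```

This only restates the glossary's distinctions in executable form; the actual assessment procedure is described in Section 5.0 of the plan.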

2.0 MANAGEMENT AND STAFFING

The basic purpose of the Detailed Control Room Design Review (DCRDR) is to identify and correct those factors in the control room environment and functions which are not fully in concert with the safe and efficient operation of the facility. The DCRDR activities will be implemented by experienced operating, engineering, and human factors engineering personnel. These individuals will perform the DCRDR with input from other studies, analyses, and concerns involving human factors engineering considerations discussed in NUREG-0660.

2.1 DCRDR Team

The PSI DCRDR team will consist of a group of professionals from various disciplines with the wide range of skills necessary for the performance of the design review. The team will be supplemented, as required, by other disciplines such as reliability analysis and visual performance assessment. During the course of the review, any additional specialists (e.g., lighting, acoustics) required for specific tasks will be made available as needed. A statement of responsibility and minimum qualifications is provided for each team member.

Prior to beginning the review, team members will be briefed on the methods and content of relevant NRC documents and on general human factors engineering principles and methodology. Team members will also be provided the opportunity to familiarize themselves with the general design and operation of the plant. Any general or specific procedural issues will be resolved at this point. The review team members are encouraged to document dissenting opinions, if appropriate. They will also be provided access to required plant facilities or personnel and to necessary documents or information to perform their assigned tasks.

The structure and management of the review team is illustrated in Figure 2.1, and the names of the personnel who will fill each position are indicated. These individuals meet the minimum qualifications, detailed below, for the positions which they will hold.

2.1.1 Program Manager (Mr. Neil Farr, PSI)

It is the primary task of the Program Manager (PM) to ensure that the procedures and plan are implemented. Further, the PM has the internal responsibilities relating to cost and overall schedule. The PM is responsible for providing the necessary support to the project from other departments. The PM, with the Task Force Leader, will review the progress of the project, including problems identified and/or resolved, deviations from schedule or plan, and any other relevant information. Finally, the PM is responsible for coordinating the review activities with corporate management and with the NRC.

[Figure 2.1 Structure and Management of the PSI DCRDR Review Team: Program Manager N. Farr (PSI); Task Force Leader B. Steen-Larsen (PSI); Task Force Administrator P. Gropp (PSI); Human Factors Engineer R. Kershner (ARD); Operations M. Phillippe (PSI); Architect/Engineer R. Florian (S&L); Licensing S. Meyer (PSI); Systems Engineering P. Hiland (NSSS), J. Ulrich (BOP), J. Zwyner (I&C), all PSI; Station Engineering and Training D. Halligan (PSI) and B. Orender (PSI)]

Minimum qualifications for the role of Program Manager include:

- B.S. in an engineering discipline
- Ten years of experience in an engineering discipline, at least two of which are in nuclear engineering

2.1.2 Task Force Leader (Mr. Brian Steen-Larsen, PSI)

The primary responsibility of the DCRDR Task Force Leader (TFL) is to ensure the availability of the necessary expertise to explore and resolve the human factors engineering issues. The TFL will work closely with the PM to integrate the human factors engineering-related action items into the control rooms. He will also coordinate the DCRDR activities with the technical review leaders and will be responsible for scheduling and directing the evaluation and reporting process. Should the TFL receive a written dissenting opinion from the Human Factors Specialist (HFS), it will be the TFL's responsibility to transmit to the HFS, in writing, the Review Team's response and to maintain a file of this correspondence, which will become part of Marble Hill's DCRDR permanent record file.

Minimum qualifications for the role of Task Force Leader include:

- B.S. in an engineering discipline
- Five years of experience in an engineering discipline, at least one of which is in nuclear engineering

2.1.3 Task Force Administrator (Mr. Paul Gropp, PSI)

The Task Force Administrator (TFA) will assume responsibility for the control room review process, supervising the project. This responsibility includes scheduling of plans for the project and integrating the human factors action items. The TFA will be responsible for supporting day-to-day activities, arranging administrative services, assisting in directing the evaluation and reporting process, and ensuring proper documentation and document control.

It is also a responsibility of the TFA to ensure the technical excellence and competency of the participants in the project. Accordingly, the TFA will manage the assignment of personnel relative to the review. The TFA is also responsible for the overall maintenance of the level of professional competence required.

Minimum qualifications for the role of Task Force Administrator include:

- One year of engineering experience in nuclear engineering
- Six years of experience in an engineering discipline, three of which may be replaced by a B.S. degree in an engineering discipline

2.1.4 Human Factors Engineering

Human Factors Engineering will ensure that human factors principles are not compromised during the DCRDR and will provide assurance of the quality of results of each DCRDR conducted.

2.1.4.1 Lead Human Factors Specialist (Robert Kershner, ARD Corporation)

The Lead Human Factors Specialist (LHFS) is responsible for ensuring that generally accepted human factors principles are not compromised nor ignored during the implementation of the DCRDR. The Lead Human Factors Specialist/Consultant will be tasked to work closely with the PM, TFL, TFA, and other technical review leaders throughout the control room review and, with them, provide the human factors engineering technical leadership of the entire DCRDR project. The LHFS will coordinate the activities of the Human Factors Specialist(s) and verify that task performance quality is maintained at a level necessary for a valid and comprehensive review. In addition, it will be the responsibility of the LHFS to record dissenting opinions on methodology, technique, review findings, assessment, and HED corrective actions that he has from the majority opinion of the DCRDR Review Team, and to report those opinions, in writing, to the DCRDR Project Manager.

Minimum qualifications for the Lead Human Factors Specialist include:

- M.A. or M.S. in human factors engineering or a related discipline
- Five years of experience in human factors engineering, one of which is in nuclear control room review

Human Factors Specialists will work with the review team and will be involved directly in the systems function analysis, task analysis, control room survey, and verification and validation processes. The HFS will be indirectly involved with the assessment and implementation phase and the writing of the final report.

Minimum qualifications for Human Factors Specialists include:

- B.A. or B.S. degree in human factors engineering or a related discipline
- One year of experience in human factors engineering

h 2.1.5 Other Review Team Members The remaining members of the PSI technical review team will be directly involved in various aspects of the review. Engineering will assist in the identification of the event sequences, functions, and tasks for review of the system functions and operator task requirements. They will be recponsible for the control room instrumentation inventory, verification, validation, and_the assessment and implementation of changes in response to the identified Human Engineering Discrepancies. Experienced operators from PSI Operations will assist in identifying operator

tasks, the control room inventory, verification of task performance capabilities, validation of the Control Room functions, and assessment and implementation of changes in response to the identified HEDs.

Qualifications for Operating and Engineering personnel will be as follows:

2.1.5.1 Nuclear Station Operators (NSO) (M. Phillippe, PSI)

The Nuclear Station Operators included on the DCRDR team will be required to be Reactor Operator (RO) licensed, or to be RO license certified, and to have had a minimum of two years of experience in a Pressurized Water Reactor (PWR) control room.

2.1.5.2 Station Operations Subject Matter Experts (SME) (D. Halligan, PSI Training)

The SME position will require being licensed as a Senior Reactor Operator (SRO), or being SRO certified, with at least five years of experience in Nuclear Operations.

2.1.5.3 Engineering and Operations Departments Station Project Engineers (SPE) (P. Hiland, PSI - NSSS systems; J. Ulrich, PSI - BOP systems; B. Orender, PSI Operations)

Station Project Engineers will have at least a bachelor's degree in an engineering discipline plus four years of experience in engineering, of which two years are in nuclear engineering.

2.1.5.4 Instrumentation and Control (I&C) Engineer (J. Zwyner, PSI)

The minimum qualifications for the I&C engineer will include a B.S. degree in either electrical or mechanical engineering plus five years of engineering experience, two of which are in nuclear design.

Licensing, Training, Architect/Engineers and Nuclear Steam Supply System vendors will provide support, as necessary, to the DCRDR Project Manager. Minimum qualifications for these support personnel will be as follows:

2.1.5.5 Licensing (S. Meyer, PSI)

The licensing representative must have a bachelor's degree plus two years of experience in nuclear plant licensing.

2.1.5.6 Training (D. Halligan, PSI)

The training representative must hold or have held a Senior Reactor Operator (SRO) license, or be SRO certified, with at least five years of experience in Nuclear Operations, of which one is in nuclear systems training.

2.1.5.7 Architect/Engineers (A/E) (R. Florian, S&L)

The A/E representative must have a bachelor's degree in an engineering discipline with eight years of experience in nuclear systems design. An advanced degree in an engineering discipline would replace three years of experience in nuclear systems design.

2.1.6 Technical and Professional Specialists (T&P Spec.)

On occasion, it may be necessary for the task force to procure the assistance of specialized individuals from outside its ranks. Minimum qualifications for these Technical and Professional Specialists will be a bachelor's degree (or equivalent) in the specified discipline and one year of professional experience in the field; previous experience in power plants or other process control applications is preferred.

2.2 DCRDR Work Plan

The work plan is an integral supplement to the DCRDR. It provides a model of the review process which will enable the PM, TFL, TFA, LHFS and I&C engineer to understand the totality of the requirements comprising the project. The Task Responsibility Chart (Figure 2.2) relates each organizational position to the management tasks to be accomplished and the review process tasks, identifying the appropriate

TASK (columns: PM, TFL, TFA, LHFS, HFS, I&C Spec.)

1. Program Definition: X ● O O
2. Master Schedule Preparation: X O O
3. Sub-schedule Preparation: X ● O
4. Schedule Maintenance: X ● ●
5. Periodic Update Reports: O O X
6. Define DCRDR Technical Requirements: X ● O O O
7. Define DCRDR Human Factors Requirements: X ● O O O
8. Authorize Changes in #1, #2 and #3: X
9. Detail Schedule for Plant-specific Review (DCRDR): X ●
10. Conduct Plant-specific Review (DCRDR): O X ● O O
11. Review and Approve Changes (HEDs): O O X ● ●
12. Manage Changes: X O O O
13. Program Assessment: X O O
14. Corrective Action Sign-off: X ● O O
15. Final Report Preparation: O X ● ●
16. Final Report Review: O O O X
17. Final Report Approval: O O
18. Final Report Delivery: O

X = Primary Responsibility   ● = Support Responsibility   O = Approval Authority

Figure 2.2 DCRDR Task Responsibility Chart for Project Management

personnel. The value of the chart is that it illustrates how each position and individual is related to the others. In addition to the work plan, a schematic diagram of the DCRDR review process and approval cycle is included in Figure 2.3. This schematic traces project responsibilities throughout the control room reviews for each station. A management control DCRDR status report form is included (in the Appendix) to maintain budgetary control.

[Schematic: DCRDR corporate program approval cycle, tracing the primary information flow and the review-and-approval cycle from program start through program administration (Program Manager), in-progress review and management support (DCRDR Project Manager), exception reporting and scheduling information review (Lead Human Factors Specialist), data collection and review (Human Factors Specialist(s) and support staff personnel), to the report draft.]

Figure 2.3 DCRDR Activity Plan and Approval Cycle

3.0 DOCUMENTATION AND DOCUMENT CONTROL

The importance of data management before, during and after the DCRDR cannot be overemphasized. Adequate documentation and document control create a traceable and systematic translation of information from one phase of the DCRDR to the next. It is mandatory that the DCRDR team have immediate access to a complete, up-to-date library of documents to: a) provide a support base to manage and execute the various steps and phases of the control room reviews; and b) provide a design data base upon which future control room modifications may be based. Therefore, a data base library will be established to ensure the success of the DCRDR process. The documentation and document control will be coordinated with Marble Hill DAP 5.02-REV 2, Nuclear Division - Records Management Program, to ensure that the output documentation complies with this procedure. This section describes the documentation system (input/output documents) and documentation management/control procedures which Public Service Indiana will use to support its Detailed Control Room Design Review.

3.1 Input Documentation

At a minimum, the DCRDR team will review the following documentation in order to better understand the control room design/personnel requirements:

• System Lists
• System Descriptions

• Piping and Instrumentation Drawings
• Control Room Floor Plan (Lighting, HVAC, Acoustics, etc.)
• Panel Layout Drawings
• Panel Photographs
• List of Acronyms, Abbreviations
• Description of Control Room Coding Conventions
• Samples of Computer Printouts
• Procedures (Emergency, Abnormal and Operating)
• Guidelines for Procedural Development
• Other Human Factors/Control Room Studies

As additional documents are acquired or written, they will be added to the data base library. Forms are referenced in their applicable program plan sections; reports too cumbersome for inclusion are described and referenced in each applicable section and will be physically maintained by the TFA.

3.2 Output Documentation

In order to facilitate systematizing and recording the control room design review, a series of standard forms has been developed. These forms appear in their entirety in the Appendix, except where indicated by an asterisk; Appendix page numbers appear in parentheses.

• Operating Experience Review Report (A-2)
• Control Room Human Engineering Discrepancy Record (A-3 a & b)
• Personnel Survey Summary (A-4)
• Control Room Review Task Development (A-5)
• Control Room Inventory Form (A-6 a & b)
• Validation Review Worksheet (A-7)
• Air Velocity Survey Record (A-8)

• Humidity/Temperature Record (A-9)
• Lighting Survey - Luminance and Reflectance Record (A-10)
• Lighting Survey - Illuminance Record (A-11)
• Sound Survey Record (A-12)
• Photographic Log (A-13)
• System Review Summary Reference (A-14)
• Index of Reviewed Reports (A-15)
• Operations Personnel Interview Protocol*
• Response Summary Sheet*
• Pre-Assessment Form*
• Assessment Rating Form*
• NUREG-0700 Section 6 Checklist*

3.3 Document Control

DCRDR document control is required for traceability, retrievability and assurance of quality.

Public Service Indiana recognizes that a data collection/analysis effort, such as that inherent in a DCRDR, can generate untold volumes of paperwork which, if managed improperly, could result in a great loss of time and money. Marble Hill Station intends to implement a data base management system (DBMS) to collect, update, analyze, and provide the information necessary to fulfill the requirements of its DCRDR.

An example of a method for using the DCRDR DBMS is illustrated in Figure 3.1. Implementation of the DBMS will minimize the number of manual transformation steps currently required in the data collection/analysis effort. Furthermore, it will afford the DCRDR team the capability of instantaneous data analysis. Through the use of the DBMS parameters, any number or combination of data points will be accessed and analyzed on an as-needed basis. It will also provide a low-cost, reliable and efficient means of storing the DCRDR's base data and documentation.

[Diagram: sample human factors evaluation data base system, showing the data base storage configuration as layered files — system events, system procedures, systems & sub-systems, operator functions, operator tasks, operator sub-tasks — accessed by users through the data base software to produce standard and custom output reports and screens.]

Figure 3.1 Sample Human Factors Evaluation Data Base System

3.4 Data Base Management System

The DBMS will be implemented on a minicomputer* and commercially-available software. It will consist of a master program with memory storage devices to hold the data extracted from various source documents. The program will perform, electronically, those functions which previously were performed manually.

Because manual handling of data is largely eliminated after data is entered into the system, the DBMS can greatly reduce duplication of effort, document loss and errors resulting from unnecessary handling of data. When the DBMS is implemented, the TFA will create a series of data files and records using information derived from various source documents. Each source document contains specific forms, charts, schedules, etc., required for the DCRDR, and each will constitute a single data file. Data files, in turn, will be comprised of individual records which represent the specific parameters contained in the file forms. The file then serves as a model of the document from which it was created, as well as an area to store data records. Initially, the source documents will include those reports and forms listed previously in this chapter. The TFL, TFA and LHFS will be instructed in system structure, use and maintenance. Upon completion of training, each will be able to perform all functions of which the system is capable. These functions include: addition/deletion and editing of data

* Located at the home offices of ARD Corporation, with on-site microcomputer interfacing terminals.


records; creation of new files; selective examination of particular records, based on user-selected criteria; performance of analysis, based on user-selected criteria; creation of specialized and standard reports; creation of hard copies of data; and system alteration, adjustment or expansion.
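The data-file and record operations just listed can be pictured with a small in-memory sketch. This is purely illustrative and hypothetical — the plan's actual DBMS was commercial minicomputer software, and every name below is invented:

```python
# Hypothetical sketch of the DCRDR data-file/record operations described
# above: each source document becomes a "file" holding records, and records
# can be added, deleted, edited, and selectively examined by criteria.
class DataFile:
    def __init__(self, name):
        self.name = name      # models one source document (e.g., a form)
        self.records = []     # individual records (dicts of parameters)

    def add(self, record):
        self.records.append(dict(record))

    def delete(self, index):
        del self.records[index]

    def edit(self, index, **changes):
        self.records[index].update(changes)

    def select(self, **criteria):
        # selective examination of records based on user-selected criteria
        return [r for r in self.records
                if all(r.get(k) == v for k, v in criteria.items())]

# Example: a file modeled loosely on the HED record form (data invented)
hed_file = DataFile("HED Record")
hed_file.add({"panel": "MCB-1", "category": "labeling", "status": "open"})
hed_file.add({"panel": "MCB-2", "category": "lighting", "status": "open"})
hed_file.edit(1, status="closed")
```

The point of the sketch is the file-of-records structure the plan describes: one file per source document, one record per form entry, with criteria-based retrieval replacing manual sorting.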

To avoid file damage and/or unauthorized data manipulation, access to the DBMS must be restricted by limiting user training and by issuing passwords to a limited number of users. Since there will be limited access to the system, it will be set up so that direct access will be permitted according to how much training an individual has in its use and how great the individual's need for access is. In addition, a hard-copy, written log will be kept of system use. The log will make system use auditing easier, allow for more sophisticated use of the system and will make it possible to use the files interactively.

Three levels of system users will be established. The highest level of user will be those who are fully trained in its use and are responsible for the system, i.e., the TFL, TFA and LHFS. The next level of user will be those who are fully trained but are not responsible for the system. The lowest level of user will be those with limited training. The TFL, TFA and LHFS will have the "master" passwords for all files, allowing them to perform all functions on the system. The mid-level user will have access to "read/write" passwords, allowing for all functions except those involved with system alteration. The lowest level of user will have access to "read only" passwords, allowing only for calling up and reading information on the system; they will be unable to alter data. Also, "read only" users will be somewhat limited in
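The three-tier password scheme described above might be modeled as follows. This is an illustrative sketch only; the permitted-function sets are assumptions drawn from the prose, not the system's actual implementation:

```python
# Illustrative model of the three access levels described above:
# "master" = TFL/TFA/LHFS, "read/write" = mid-level users,
# "read only" = low-level users who cannot alter data.
PERMISSIONS = {
    "master":     {"read", "write", "alter_system"},
    "read/write": {"read", "write"},
    "read only":  {"read"},
}

def is_permitted(level, action):
    """Return True if a user at the given access level may perform the action."""
    return action in PERMISSIONS.get(level, set())
```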

handling data between files. The TFL will determine which team members will be mid- and low-level users.

The actual procedures for system use will be concise and easy to understand. Once the program is loaded, the user will open a file and choose from a variety of functions to be performed. The program to be used is written in "menu" style, i.e., the user is presented with a number of alternative functions and will make a choice among them. A minimum set of the main functions will include:

• Display, edit, delete records
• Add records
• List records to the printer
• Create or load short forms of documents
• Print records
• Maintain file (TFA only)
• Close and exit from files

4.0 REVIEW PROCEDURES

The objective of a DCRDR is to satisfy the requirement for performing a human factors engineering review of the control rooms, to determine the extent to which the control rooms provide the operators with sufficient information to complete their required functions and task responsibilities safely. The review will also determine the suitability of the designs of the instrumentation and equipment in the various control rooms.

4.1 Review of Operating Experience

This review will be done to ensure that problems encountered in either plant operation or in preparation for operation are addressed. This section of the Program Plan discusses the two methods that will be used to review operating experience; these are:

1) an examination of industry-wide historical documents and 2) a survey of control room operating personnel.

4.1.1 Examination of Available Documents

Human error in performing complicated tasks is a well documented fact, and the potential for it is always present. In

the nuclear power industry, human error can combine with poor design features, resulting in very serious consequences, as TMI-2 so clearly demonstrated. The industry is fortunate that instances of past human performance error and/or equipment/design problems are documented in plant records and can be used as a data base. This guideline presents the approach that PSI Marble Hill Station will use to tap that archival data base. This data base will be used to assist the review team in identifying areas of potential human performance problems. Specifically described herein are the approaches that will be used to: 1) review the "literature" of documented errors and/or problems for similar plants, 2) analyze the errors found and compare them for their applicability to Marble Hill, 3) collect and record information pertinent to the errors that apply, and 4) input the recorded data into the DCRDR investigative processes (Figure 4.1).

Literature (Source) Search

Because Marble Hill is a new plant under construction, it has no in-house base of operating error reports that could be reviewed. Therefore, the PSI operating experience review will be predicated on industry-wide generic reports and documents. The review will concentrate on documentation germane to PWR-type plants, but information from BWR operating experiences will be used if and when applicable. The following method will be followed to identify and collect appropriate material:

1. A Human Factors Specialist (HFS), with the assistance of the DCRDR Task Force Leader, will generate a list of possible documentation sources. This list should

include, but will not necessarily be limited to: industry-wide Licensee Event Reports (LERs), NSAC/INPO Significant Event Reports (SERs) and Significant Operating Event Reports (SOERs), Outage Analysis Reports, Final Safety Analysis Reports (when obtainable), industry-related journals (e.g., Nuclear Safety), Owners Group memoranda, INPO reports, and documents from similar plants (if obtainable).

[Flow chart: Review literature for documented errors → Analyze errors applicable to Marble Hill → Record relevant errors → Input data into DCRDR investigations.]

Figure 4.1 Activity flow chart for examining available documentation

2. The

HFS, with clerical assistance and/or station assistance, will collect as much information as possible from the sources identified in step 1

(above), for the preceding five years. The search may continue further into the past if the collected data does not provide an adequately broad base.

3. The HFS and subject matter expert will review and sort the material collected for a more detailed analysis. Retained material will meet at least one of the following criteria:

a. involved control room operating crew error
b. resulted in a derating, outage, or reactor trip
c. involved controls, displays, or equipment in the control room
d. involved operating procedures
e. resulted in personal injury
f. resulted in a technical specification violation
g. resulted in an adverse impact to plant or public safety
h. happened at a PWR-type plant
i. happened at a BWR-type plant on a system also found at a PWR-type plant

Determining Applicability to Marble Hill

The material described above may or may not be pertinent to the PSI Marble Hill station. However, the material collected will be reviewed by the DCRDR Task Force. It will be the

responsibility of the task force personnel from technical staff, station engineering, station electrical, operations, and training to determine which of the documented errors obtained could possibly occur at Marble Hill, based on its current design. If ANY of the Task Force members feel that an incident similar to the one(s) under review could occur at Marble Hill, the document describing the incident will be retained for analysis. Each retained report will be assigned a record index number.

Document Analysis/Recording

The retained items from the preceding review will be delivered to the HFS. The HFS will carefully review each report/document and record the information below in the appropriate space provided on the Operating Experience Review Report Form (see Figure 4.2):

1. The report index number
2. The error/problem that occurred
3. The operating status of the plant at the time the incident happened
4. The maintenance and/or systems status condition(s), as relevant
5. Transient anomalies that occurred
6. The sequence that led to the incident
7. The control room instrumentation, controls, displays, and/or equipment involved
8. The outcome of the incident
9. Any corrective measures taken

If the above information is not in the report or document, the HFS will make every effort to obtain it. This may entail contacting the plant where the incident actually occurred.

Figure 4.2 Operating Experience Review Report

System:
Panel Identification Number:
Component Identification Number:
Component Name:
Date:
Reviewers:
Index No.:
Error/Problem:
Operating Status:
Maintenance/Systems Conditions:
Transient Anomalies:
Sequence of Events:
Instrumentation Involved:
Outcome:
Corrective Measures:
Is identified component acceptable? YES:
If no, discrepancy number:

Data/Result Reporting

The HFS will give the completed Operating Experience Review Report to the DCRDR Task Force Administrator. This individual will review the report and complete the upper portion of the form. This portion of the form serves to identify the location at Marble Hill of the

controls, displays, and equipment involved in the incident so that the HFSs can focus on it in the DCRDR investigative processes.

The DCRDR Task Force Administrator will then deliver the report to the Lead Human Factors Specialist for dissemination to the human engineering specialists. As these individuals perform their review, they will note on the form, in the space provided, whether the controls/displays/equipment referenced have been adequately addressed at Marble Hill from a sound human factors perspective. If so, the YES box will be checked; if not, a discrepancy will be written and the discrepancy index number reported on the Operating Experience Review Report. Completed reports will be returned to the DCRDR TFA for filing.

4.1.2 Control Room Operating Personnel Survey

The objective of the PSI Marble Hill Operating Personnel Survey is to obtain the special and pertinent knowledge that operating personnel possess about the control room system features they have experienced and those they have observed on the breadboard mock-ups. The objective of the Operating Personnel Survey Guideline is to provide a method to the Human Factors Specialists for performing the survey as defined in NUREG-0700; it is outlined in Figure 4.3. The basic elements of the method include: 1) survey construction, 2) survey implementation, and 3) data analysis.

Public Service Indiana 1 DEVELOP SURVEY V IMPLEMENT SURVEY Y ANALYZE RESULTS V IDENTIFY & RECORD HED's Figure 4.3 Activity Flow for the Control Room Operating Personnel Survey i 4-8

Survey Construction: An open-ended, structured interview survey approach will be adopted; nine content-topic areas will be addressed in the survey. The topics to be included are those suggested by, and listed in, NUREG-0700. Specifically, and in the order of presentation, the areas covered will be:

• Workspace Layout and Environment
• Panel Design
• Annunciator Warning System
• Communications
• Process Computers
• Procedures
• Staffing and Job Design
• Training

Method of Survey Construction: For each topic area the following will be accomplished:

1. Questions will be independently generated by Human Factors Specialists to address various subtopics within each area. The question orientation will be predominantly along the lines of the Critical Incident Technique to ensure that responses are as objective as possible. Subjective items will, however, also be included. More than one question will be written for each subtopic. For each question generated, follow-up probing-type questions will also be written.

2. A team of experts consisting of personnel with operating expertise, psychometric expertise, and training expertise will review and evaluate the structured interview protocol questions written.

Questions that meet the following criteria will be retained for inclusion in the interview protocol:

A. Simplicity - questions will be direct, employing common everyday language, and be as brief as possible.

B. Clarity - questions will be unambiguous so that the responses received will be unbiased and accurate.

C. Objectivity - questions will be free of emotionally charged words, such as good/bad, strong/weak, etc., to ensure that open, forthright and honest perceptions and experiences are obtained.

D. Error Free - though more prevalent in multiple-choice and rating-form type questionnaires, all surveys are susceptible to social desirability, leniency, central tendency, and halo-type errors. Retained items will be those that have the minimum tendency toward these error types.

As an aid in evaluating the effectiveness of the process, the team members will use a rating scale to judge the questions on each of the preceding criteria. The scale will be of the three-point differentiation variety, with a score of 3 representing total acceptability of the question, 2 representing marginal acceptability, and 1 indicating the item is not acceptable. Question ratings will be averaged across both criteria and team members to produce a score. For inclusion in the interview protocol, a question must have a score of 2.0 or better.
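The retention rule just described — average the 3/2/1 ratings across both criteria and team members, and keep any question scoring 2.0 or better — can be sketched as follows. The function names and sample ratings here are invented for illustration:

```python
# Each question is rated 3 (totally acceptable), 2 (marginal), or
# 1 (not acceptable) by each team member on each criterion; the average
# across both criteria and raters must be 2.0 or better for retention.
def question_score(ratings):
    """ratings: list of per-rater lists of per-criterion ratings (1-3)."""
    flat = [r for rater in ratings for r in rater]
    return sum(flat) / len(flat)

def retain(ratings, threshold=2.0):
    """Apply the plan's 2.0-or-better retention rule."""
    return question_score(ratings) >= threshold

# Two raters, four criteria (Simplicity, Clarity, Objectivity, Error Free):
good = [[3, 3, 2, 3], [3, 2, 3, 3]]   # averages to 2.75 -> retained
poor = [[2, 1, 1, 2], [1, 2, 1, 1]]   # averages to 1.375 -> rejected
```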

3. With the retained questions as a base, the team will then assemble questions for each topic area of the survey so that the area is sampled completely in item content. When the number of acceptable questions in a topic area is considered insufficient by the team members, the team will request that additional items be written by the Human Factors Specialists for review and evaluation by the team.

Survey Implementation

All previously licensed operator personnel with control room operating experience will be interviewed. The interviewing will be accomplished early in the review process. The Operations Lead will be responsible for identifying the appropriate personnel for interviewing, and for supplying the Human Factors Specialists with their names and working schedules. In conjunction with the Operations Lead, the Human Factors Specialist will develop a schedule for interviewing and will write an introductory letter to the interviewees. The introductory letter will briefly explain the purpose and importance of the DCRDR and how the interview phase affects the review process. In addition, the letter will assure the participants of the confidentiality of their responses, convey the station's support of the review, describe how the results of the survey will be used, and specify the time and place for the letter recipient's interview. All interviews will be conducted on an individual basis, at the station, in a designated private location, by a trained and experienced Human Factors Specialist. The day before an individual's interview is scheduled, the Human Factors Specialist will contact the individual to remind him of the interview.

At the outset of each interview, the Human Factors Specialist/Interviewer will reiterate the contents of the introductory letter and obtain the following "biographical" information on each participant: name, age, sex, height, number of years an R.O. license has been held, and number of years an S.R.O. license has been held. This information will be recorded on the Survey Biographical Data Sheet (SBDS) (see Figure 4.4). The index number on the SBDS will correspond to the index number on the Interview Protocol Sheets (to be developed after protocol questions are generated), upon which the Human Factors Specialist/Interviewer will record the participant's responses to each question during the interview. If not explicit in the interview protocol question, it will be the Human Factors Specialist/Interviewer's responsibility to ensure that respondents consider the questions posed under various modes of reactor operation and/or for different event types.

Data Analysis: After all interviews have been completed, the Human Factors Specialists will examine and review all protocol sheets on an item-by-item basis, summarizing responses on a Response Summary Sheet (to be developed subsequent to the protocol's development).

Response frequency data will be generated. Both positive and negative control room features, as identified by the respondents, will be documented on a system-by-system basis for consideration in subsequent review processes. The biographical data will be summed and averaged to provide the Human Factors Specialist with an indication of the experience level upon which the survey response data is predicated. All documentation generated in this review process will be retained for future reference (i.e., development of procedures, training programs, personnel relations, etc.).
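The tallying described above — response frequencies per item plus summed-and-averaged biographical data — amounts to something like the following sketch. This is illustrative only; the Response Summary Sheet itself had not yet been developed, and all data shown are invented:

```python
from collections import Counter

# Tally response frequencies per survey item, and average biographical
# experience to characterize the respondent base (illustrative data only).
def response_frequencies(responses):
    """responses: {item_id: [response, ...]} -> {item_id: Counter}."""
    return {item: Counter(answers) for item, answers in responses.items()}

def mean_experience(years):
    """Average years of experience across respondents."""
    return sum(years) / len(years)

# Invented sample: two survey items and R.O. license years per respondent
answers = {"Q1": ["adequate", "adequate", "glare problem"],
           "Q2": ["too noisy", "adequate"]}
ro_years = [4, 6, 2, 8]
```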

SURVEY BIOGRAPHICAL DATA SHEET

DATE:                TIME:
INTERVIEWER:         INDEX NUMBER:
INTERVIEWEE:
AGE:        SEX:        HEIGHT:
YEARS NUCLEAR OPERATING EXPERIENCE AT BWR / PWR:
YEARS CONTROL BOARD OPERATING EXPERIENCE AT BWR / PWR:
YEARS A R.O. LICENSE HELD:
YEARS A S.R.O. LICENSE HELD:

Figure 4.4 Survey Biographical Data Sheet

4.2 System, Function Review and Analysis of Control Room Operator Tasks

The objective of these analyses is to determine, to the extent practical, if system performance requirements can be met by combinations of the instrumentation, equipment, software, and personnel, to ensure that operator performance requirements do not exceed operator performance capabilities. Figure 4.5 illustrates the procedural approach PSI will use to conduct its system function task analysis. This approach is intended to yield a comprehensive body of data regarding the requirements imposed on the operators. A top-down approach, starting with a review of systems, subsystems, and their functions, will be conducted to identify all operator functions and tasks. Major systems and subsystems reviewed will include: reactor control and instrumentation systems, engineered safety systems, feedwater systems, radwaste systems, and power generation and distribution systems.

The event sequences to be analyzed will reflect the full spectrum of plant operations, with emphasis on emergency conditions. The following will be included:

1. The event sequences to be analyzed will come from the events used to develop the plant-specific, upgraded Emergency Operating Procedures.

2. Sequences of failure events (including multiple failures for transients and accidents). Examples are as follows:

a) Small break loss of coolant accident
b) Inadequate core cooling

[Diagram: task analysis fault tree — event procedures and systems/subsystems feed a functional allocation analysis, which decomposes into operator functions (F1 ... Fn), operator tasks (T1 ... Tn), and operator subtasks (ST1 ... STn). Operator subtasks must satisfy requirements for controlling the event under analysis.]

Figure 4.5 Task Analysis Fault Tree

c) Anticipated transient without reactor trip, following the loss of off-site power
d) Multiple failures of tubes in a single steam generator, and tube ruptures in more than one steam generator

Once the system criteria are identified and the system functions associated with each plant event are determined, each function will be related to a machine input or operator task. For each function, the display information necessary for an operator decision to activate a control or monitor a system state will be determined.

Next, the adequacy of operator control actuation feedback will be determined, followed by environmental effects and constraints.

The objective of the task analysis is to establish the information flow between man and machine, or the functional activity describing the manner in which the information is transferred. The intent of the task analysis is to identify the behavioral requirements imposed on the operator, where a task is defined as a group of discriminations, decisions, and activities related by temporal proximity, immediate purpose, and common output. The elements of the task considered for this review include:

• The stimulus to the operator, which triggers performance of the task
• The required response to that stimulus (i.e., the performance criterion)

   - A procedure for performing this response (which includes the equipment to be used for performing the task)

   - A goal or purpose that organizes the whole task

The data resulting from this analysis will provide the information necessary for conducting the human factors engineering review of the control board mock-ups and the investigative review of the Marble Hill control room. The task data will be organized into requirements for the workstations/panels of the control room. The workstation/panel requirements will then be compared to the control room inventory to verify that the design will support operator tasks.

4.3 Control Room Inventory

A complete list of all control room instrumentation, controls, equipment, and procedures will be compiled to assist the DCRDR per NUREG-0700. Each item on the inventory will be identified by system, subsystem, and function/subfunction. In addition, the inventory will include an indication of component use and characteristics (e.g., parameter, unit of measure, range, and display). The location and proximity to related items will be noted for use during the verification process.

The inventory will be accomplished on a system-by-system basis. Operator work stations, such as the control boards, peripheral consoles, back panels, desks, etc., will be included in the inventory.

In performing the inventory, each item will be identified by its system. Equipment use and characteristic information will be recorded for each inventory item. In addition, the location

and proximity to related items will be noted for use during the verification process.

Information will be recorded on a Control Room Inventory Form. This form was designed to simplify the data recording process and to provide an equipment coding scheme ensuring that each item has a unique identifier. After pertinent information has been recorded for a system, it will be transferred to a computer-based data management system.

The individual(s) conducting the inventory must provide the following information at the top of each inventory form:

   System name
   System number (by Equipment Piece Number - EPN)
   Station number
   Name(s) of individual(s) performing inventory
   Date inventory performed

The inventory form will be comprised of three major sections:

   System Instrumentation
   System Manual Controls
   System Automatic Controls

Items/data to be inventoried are described below, by section. Where applicable, explanatory information is parenthesized after each item.

System Instrumentation

   Nameplate data
   Parameter measured
      What measured (flow, pressure)
      Units (kiloamperes, volts)

   Type of instrument
      RM - Rotary meter
      EM - Edgewise meter
      SP - Single-point recorder
      MP - Multi-point recorder
      P - Number of points recorded
   Instrument data
      Range (0 to 300, 20 to 100)
      Div. (increment value between marks)
   Instrument panel location
      Panel number
      H - Horizontal section
      V - Vertical section

System Manual Controls

   Nameplate data
   Type of control switch
      JS - Joy stick switch
      SCS - Standard control switch
      TB - Thumb buster switch
      PB - Pushbutton switch
      K - Keylock switch
      RS - Rotary selector switch
      TS - Thumbwheel selector switch
   Number of switch positions
      2 - Two positions
      3 - Three positions
      5 - Five positions
      (Enter the appropriate number of positions if none of the above apply)
   Type of action
      SR - Spring return to neutral
      As-is - Stays where positioned

   Type of control
      SO, SC - Seal in Open, Seal in Close
      SO, TC - Seal in Open, Throttle Close
      TO, SC - Throttle Open, Seal in Close
      TO, TC - Throttle Open, Throttle Close

System Automatic Controls

   Nameplate data
   Type of control
      M - Manual controller
      M/A - Manual auto transfer station
      AT - Master auto controller with setpoint adjustment
      A - Auto controller without setpoint adjustment
   Control data
      Range (0-500)
      Div. (5 lbs., 15 rpm)
   Component controlled
      V - Valve
      T - Turbine
      M - Motor
   Parameter controlled
      F - Flow
      P - Pressure
      S - Speed
   Instrument data
      Range (0-100, 20-50)
      Div. (5 lbs., 10 degrees)
   Instrument panel location
      Panel number
      H - Horizontal section
      V - Vertical section
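The inventory records described above lend themselves to a simple structured representation in the computer-based data management system. The sketch below is illustrative only: the field names and the identifier format are assumptions for the example, not part of the PSI coding scheme, which is defined by the Control Room Inventory Form itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InventoryItem:
    """One line of the Control Room Inventory Form (hypothetical fields)."""
    system: str          # system name
    epn: str             # system number (Equipment Piece Number)
    station: str         # station number
    section: str         # "INSTR", "MANUAL", or "AUTO"
    item_type: str       # coded type, e.g. "RM" (rotary meter), "PB" (pushbutton)
    panel: str           # instrument panel number
    panel_section: str   # "H" (horizontal) or "V" (vertical)
    seq: int             # sequence number within the panel

    @property
    def identifier(self) -> str:
        # Unique identifier composed from the coding-scheme fields.
        return f"{self.epn}-{self.panel}{self.panel_section}-{self.section}-{self.seq:03d}"

item = InventoryItem(system="Feedwater", epn="FW-01", station="MH-1",
                     section="INSTR", item_type="RM",
                     panel="12", panel_section="V", seq=7)
print(item.identifier)  # FW-01-12V-INSTR-007
```

A composite key of this kind is one straightforward way to guarantee each inventoried item a unique identifier before transfer to the data management system.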

The equipment data collected in the inventory will be compared to the requirements identified in the task analysis. Discrepancies will be recorded as HEDs.

4.4 Control Room Survey

The human factors engineering survey will follow the guidelines presented in Section 6 of NUREG-0700. This survey will consider the extent to which human performance characteristics are accommodated within the control room. A comparison of control room features to the human engineering guidelines will be conducted using the data obtained from the task analysis.

Human Factors Specialists, in concert with experienced utility personnel knowledgeable of plant systems and control room instruments and equipment, and operations personnel, will observe and measure control room features. In addition, individuals skilled in lighting, HVAC, and communication systems will be used for special measurements.

The Human Factors Engineering guidelines will be addressed for the nine topic areas below:

1. Control Room Workspace
2. Communications
3. Annunciator Warning Systems
4. Controls
5. Visual Display
6. Labels and Location Aids
7. Process Computers
8. Panel Layout
9. Control-Display Integration

Public Service Indiana recognizes the differences in the orientation of guideline topics and will use the checklist

approach. Discrepancies will be noted for each non-compliant item, and a photographic log will be developed for reference.

4.5 Verification of Task Performance Capabilities

The objective of the task verification process is to ensure that operator tasks can be performed in the existing control room with minimum potential for human error. This process will be completed in two steps. The first step will verify the presence (or absence) of the instruments and equipment that provide the information and control capabilities necessary to implement each task. The second step will determine if the man-machine interfaces provided in the control room are effectively designed to support task accomplishment.

To ensure that every task has the necessary equipment, and that each equipment item performs a necessary task/function, a comparison of the inventory list with the I&C list from the task analysis will be completed in accordance with NRC guidelines.

In addition to verifying the availability of control room equipment, a verification of human engineering suitability will be conducted to identify interface problems that may affect task performance but may not be evident when the control room equipment is examined. Personnel knowledgeable in plant systems, instrumentation and controls engineering, human factors engineering, and operations will participate in the verification process. Also, system designers/architect engineers will be available for consultation.

The information needed for this process will come primarily from the task analysis, the control room inventory, and the Section 6 guidelines. If the results indicate that the control

room may contain instruments unnecessary for operator tasks, then engineering and procedures documents, operational and maintenance directives, and regulatory requirements will be consulted as necessary. The Human Engineering Discrepancies (HEDs) will be recorded on the same HED form used during the control room survey. A photographic log of discrepant items will be maintained.

4.6 Validation of Control Room Functions

The objective of the validation review is to determine if the functions allocated to the control room operating crew can be accomplished effectively within both the structure of the established operating and emergency procedures and the design of the control room as it exists.

The purpose of the Validation of Control Room Function Guideline is to provide and describe the processes that will be used in both review phases to perform the validation review. Specifically, this guideline delineates: 1) the specific plant events that will be evaluated during the validation, 2) the approach to be used for performing the validation, 3) the method by which data will be recorded during the validation, 4) the method of data analysis that will be used, and 5) the method by which the results will be reported (Figure 4.6).

Plant Events to be Evaluated

The events below were selected for validation. In the estimation of operating subject matter experts, they will provide for the exercise of all emergency unit systems and all emergency control room workstations.

Figure 4.6 Activity Flow Chart for Validation Review

(Figure 4.6 depicts the validation review as a sequence of activities: select a plant event for validation; secure operating personnel to assist with the validation and schedule the time to do it; implement the methodology according to the appropriate guidelines; analyze the data collected; and record and report observational and analytical results.)

- Small break loss of coolant accident
- Inadequate core cooling
- Anticipated transient without reactor trip (scram), following the loss of offsite power
- Multiple failures of tubes in a single steam generator and tube ruptures in more than one steam generator
- Reactor trip (scram) procedure or emergency shutdown procedures

Validation Approach

At present, PSI Marble Hill Station has a breadboard mock-up of the Unit 1 control panels and the common vertical panels, built on a scale of 1:1. This mock-up will be used in the walk-through validation review. The walk-throughs will be done on the boards as they currently exist; i.e., modifications recommended as a result of HEDs discerned in this review process, such as through the NUREG-0700 Section 6 checklists, will not be made on the boards before the walk-through validation review. However, as stated earlier, modifications implemented at the Marble Hill base plant, Byron Station, as a result of its PDA have been implemented or reflected on the Marble Hill breadboard mock-up. [This will allow the Human Factors Specialists to assess both the accuracy and impact of previously identified discrepancies.]

At the outset of the DCRDR review, the DCRDR Task Force Leader will schedule time on the mock-up for the validation review, and will arrange for the availability of operating crews to walk through the appropriate procedures.

The walk-throughs will be performed identically, according to the procedural steps listed below, except where otherwise indicated.

1. The DCRDR Task Force Administrator will select an event for validation from the list presented earlier in this guideline, and obtain the appropriate procedure(s).

2. An HFS, with the assistance of the DCRDR Task Force Administrator, will develop a floor diagram of the unit work space and numerically identify workstations for that unit.

3. A trained operating crew will review the procedure(s) for the event selected in step 1 above. Different crews will walk through different selected events.

4. The DCRDR Task Force Administrator and the Task Force Operations representative will brief the participating control room operating crew on the purpose and specific objectives of the walk-throughs, and on how they will be performed. At this time, assumptions about the operating situation will be specified to the operator(s).

5. The control room crew will then walk through what they would do while following the appropriate procedure(s). During the walk-through, the operator(s) will describe the following:

   - Actions taken
   - Information sources used
   - Conversions or uncertainties involved
   - Controls used
   - Expected system response

   - How those responses would be and/or may be verified
   - Actions that would be taken if the expected responses did not occur
   - Additional assistance necessary and/or desirable from personnel outside the control room (as appropriate)

   The operator(s) will be instructed to simulate the actions they would take if the event were real.

6. The operator(s) will be accompanied by an HFS during the walk-through of each event. The HFS will take observational notes on a procedural step-by-step basis, attending to the relation between operator performance and control board/control room design. In particular, the HFS will evaluate and critique the walk-through on the following criteria:

   A) The indications and annunciators referenced in the procedure(s) used should be available to the operator(s) as identified by the task analysis.

   B) The units of measurement displayed should be appropriate and consistent with the procedure(s) used.

   C) The labels associated with the various controls, displays, and annunciators referenced/used should be readily identifiable by the operating crew and consistent with each other and with the procedure(s) used.

   D) The controls and displays necessary should be available. (Those controls or displays that are desirable but not necessary, and that are not available to the operators during the walk-through, will be noted and recorded.)

   E) The operator actions expressed or implied by the procedure(s) should be within the capability of the operator(s) and/or the minimum control room staff as specified by Technical Specifications.

   F) The decisions and actions required of the operator(s) should be consistent with the training and experience of the control room personnel.

   G) Any special job performance aids used by the operator(s) should be specified in the procedure(s). (Any special job performance aids that are desirable, but not used or referenced in the procedure(s), will be noted and recorded.)

   H) Identify all controls, displays, annunciators, and/or job performance aids used but not referenced in the procedure(s).

7. An additional HFS will observe the walk-through to record work station-work flow information using the unit floor diagram, developed in step 2 above, as a guide. Sample lines of work station-work flow data are presented in Figure 4.7. The information recorded will include:

   A) The direction of movement

Figure 4.7 Sample Lines of Work Station-Work Flow Data

(Figure 4.7 presents two sample lines of work station-work flow data. Each line records a sequence of workstation numbers visited by the operator, with the associated recorded times.)
i Public Service Indiana B) The sequence of movement C) The frequency of the movement D) The estimated time criticality of the movement E) A real-time estimate of the time that the operator (s) spends at each workstation moved to. In addition to the walk-through/ narrative of each event as described in the procedural steps above, each event will be video taped to provide the HFSs with an indication of the dynamics involved between the operator (s) and the control board / control room design across the individual steps in the appropriate procedure (s). A different crew (not the crew that assisted in the non-video taped walk-through) will " perform" in the videotaped replication of the event. For the video taping, efforts will be made to induce as auch realism into - the event (s) as possible. For example, tape recordings of system annunciator warning signals may be made and played by a subject matter expert during the walk-through. of the event. The video tape walk-throughs will be done in accordance with the procedural steps listed below. 1. The DCRDR Task Force Administrator will select an event for validation, from the list presented earlier in this guideline, and obtain the appropriate procedure (s). 2. An HFS, with assistance as needed from the DCRDR Task Force Administrator, will develop a floor diagram of the unit work space and numerically identify workstations for that unit. Copies of the diagram developed for previous walk-throughs and/or video tape walk-throughs may be used. 4-30

Jublic Service Indiana 3. A' trained operating crew will-review the procedure (s) for the event selected in step 1 above. Different crews will walk through different events for the video taping. 4. Set up and test the video tape equipment.to verify that the video and audio components function properly. 5. The DCRDR Task Force Administrator and the Task Force Operations representative will assemble and brief the participating control room personnel on the purpose and specific objectives of the event simulation for video tape walk-throughs, and on how they will be performed. Any assumptions,about the operating situation will be specified to the operator (s) during 4 the briefing. i: 6. To the extent possible, the video tape walk-throughs will be event simulations. Therefore, an HFS will not accompany the operator to take observational notes as will be done in the non-video tape walk-throughs. F Procedures for reference will be available to the operating crew during the simulation, but procedural i steps will not be called out. However, annunciator warnings and display parameters may be called out by a subject matter expert during the simulation.' The operators will be instructed to approximate real time l in the event simulation, and-to call out: A) Actions they are taking B) Direction of action movement l-C) . Display / indicator to which they refer to verify system response to actions taken j: 4-31 t I t l

Public Service Indiana D) What that response indication is and/or what it must be before the operator can take the next i action step. 7. Start the video tape ensuring that the following guidelines are met: ) i A) The camera (s) should be positioned at a distance from the workstations to ensure that an j unobstructed view of each station is obtained. l } ) B) The lighting levels should be sufficient to record the details of the event being taped. C) "Non-Performing" personnel in the mock-up 4 warehouse should be instructed: 1) to be as quiet as possible during the taping of the event, and 2) not to distract the operating crew on camera in any way. D) An HFS should operate the video recording equip-

ment, and should have complete freedom of movement to follow the operator on camera.

E) Ideally, a minimum of two cameras and recorders should be used to document the event simulation walk-through. One camera would be stationary with an angle of view encompassing the entire control panel work space. The second camera could then focus on the operator and follow him/her around the control panel during the i simulation event. This would allow for the monitoring of:

1) head
movement, 2) verbal

(. response, and 3) action response. A camera and l l l l l 4-32 w n = - - - u -r

recorder should be available for each reactor operator (R.O.) on the operating crew. If more than one R.O. is anticipated in the composition of the minimum control room staff, then the ideal minimum number of cameras and recorders should be increased accordingly. Having the cameras follow the operators as they perform their tasks is important; should the desirable number of cameras not be available, those that are should focus on the operator(s) and NOT the entire control panel workspace.

8. Begin the event simulation walk-through.

9. During the event simulation, a voice-over narration by a subject matter expert may be performed on the videotape. The narration would convey what was transpiring, what the operator(s) should be attempting, and why.

10. At a cue from the operating crew performing the event simulation, a subject matter expert or the DCRDR Task Force Leader should terminate the event and the videotaping.

11. At that point, the videotape operator should remove the tape from the recorder and log: the event taped, the date of taping, the time of taping, any unusual circumstances surrounding the taping, the names of the operating personnel taped, the name of the event narrator (if applicable), and the counter reading from the VTR.

12. During the event simulation, the HFS should observe the event to record work station-work flow information using the floor diagram of the unit work space prepared earlier as a guide. The information recorded should include the:

   A) Direction of movement

   B) Sequence of movement

   C) Frequency of the movement

   D) Estimated time criticality of the movement

An estimate of the time that the operator(s) spent at the workstations can be obtained from the videotapes.

Data Recording

The HFS accompanying the operator(s) during the event walk-through/talk-through will evaluate operator performance against the control board/control room design criteria, specified earlier in this guideline, for each step of the procedure(s) being used for the event under consideration. A checklist (Validation Review Worksheet, Figure 4.8) will be used to record the HFS's evaluation of each procedure step. If the criteria are met in a particular step of the procedure, the "Yes" column will be checked. If the criteria are not met, the "No" column will be checked, and the discrepancy will be recorded in the "Comment" column. For the event simulation videotape walk-through, the data will of course be recorded on the videotape of the event.

Data Analysis

A number of methods will be used to analyze and process the information obtained in the validation review. The methods

Figure 4.8 Validation Review Worksheet

(Figure 4.8 shows the Validation Review Worksheet. The form header records the event, the operator, the procedure(s), the Human Factors Specialist, and the page number. The body is a table with columns for Procedure Step, Yes, No, Comment, and HED Index Number.)

will vary according to the type of data collected and the manner in which it was collected.

For the data collected by the HFS during the walk-through/talk-through, an HFS will cross-check the comments recorded on the Validation Review Worksheet against the HEDs documented in previous review processes of the DCRDR. If a comment is addressed by one or more existing HEDs, it will be reported as an HED.

The work station-work flow data collected in both the walk-through narrative approach and the event simulation walk-through approach will be compared by an HFS for congruence. Instances of incongruency will be investigated to discern the cause(s). Procedural and/or training modifications may be recommended as a result of these investigations, and a resolution of the incongruencies will be obtained and recorded. Diagrammatic and/or mathematical link analysis techniques will then be employed on the observational data. The results of these analyses will be reported as design modifications to improve/enhance work flow.

The event simulation videotapes will be reviewed and analyzed jointly by the HFS and a subject matter expert. As the tape is reviewed for each procedural step sequence, the subject matter expert will comment on the following:

   A) Actions taken
   B) Information sources used
   C) Conversions or uncertainties involved
   D) Controls used
   E) Expected system response
   F) How those responses would be and/or may be verified
   G) Actions to be taken if the expected responses do not occur
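The mathematical link analysis of work station-work flow data can be sketched as a transition-frequency count over workstations. This is an illustrative sketch only; the workstation numbers and the sample movement sequence are hypothetical, not drawn from the Marble Hill floor diagrams, and the Plan does not prescribe a particular computation.

```python
from collections import Counter

def link_matrix(visits):
    """Count transitions (links) between successively visited workstations."""
    links = Counter()
    for a, b in zip(visits, visits[1:]):
        links[(a, b)] += 1
    return links

# Hypothetical sequence of workstation numbers from one walk-through.
visits = [4, 6, 8, 2, 4, 5, 4, 6]
links = link_matrix(visits)

# Heavily used links are candidates for panel-layout improvement.
for (a, b), n in sorted(links.items(), key=lambda kv: -kv[1]):
    print(f"station {a} -> station {b}: {n} move(s)")
```

In a classical link analysis, link frequencies of this kind (weighted by time criticality) indicate which workstations should be located close together.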

   H) Additional assistance necessary or desirable from personnel outside the control room (if appropriate)

Using the information supplied by the tapes and the subject matter expert, the HFS will evaluate the operator performances against the control board/control room design criteria. For each procedural step sequence that meets the criteria, the HFS will check the "Yes" column of the Validation Review Worksheet. For each procedural step sequence that does NOT meet the criteria, the HFS will check the "No" column and record the discrepancy in the "Comment" column.

The HFS will then compare the worksheet completed for the simulation walk-through of the event with the worksheet completed for the walk-through/talk-through of the event. Any discrepancy not recorded on the walk-through/talk-through worksheet will represent a new discrepancy found via the videotape event simulation approach, and will be reported as such.

Reporting of Analytical and Observational Results

Discrepancies discovered in the evaluation of the walk-through narrative and the videotaped event simulation walk-throughs will be recorded on an HED description form (see Appendix).

5.0 ASSESSMENT AND IMPLEMENTATION

The DCRDR review process described in this Program Plan will result in the identification of a number of Human Engineering Discrepancies (HEDs). PSI recognizes that each HED identified represents a potential source of operator error, with subsequent consequences for plant safety and operations. Moreover, the potential for error will vary across HEDs. Therefore, the HEDs must be evaluated to determine the extent to which they may affect plant safety, plant operability, personnel safety, and the health and safety of the community which PSI serves. The recommendations for improvement or correction that the Human Factors Specialists (HFSs) made for the discrepancies discovered in the DCRDR investigative processes must also be evaluated.

This section of the DCRDR Program Plan describes a systematic method for evaluating both the significance of HEDs and the feasibility/viability of the recommended improvements or corrections for them. The results of these evaluations will provide a deliberated, consensual, and expert knowledge base for the Operations and Engineering Departments to employ in formulating their decisions to implement recommended improvements. That implementation will be done according to a schedule. The final portion of this section provides an approach for recommendation implementation and scheduling (Figure 5.1).

Figure 5.1 Activity Flow Chart for HED Assessment and Corrective Action Implementation

(Figure 5.1 shows the flow from informal to formal assessment: separate out the HEDs to be corrected without formal assessment; review and evaluate the remaining HEDs; evaluate interactive or cumulative effects of HEDs; classify HEDs to a category and a level within the category; evaluate the recommendations for each HED and select the one to be implemented; and determine and assign priorities to HED corrective actions for implementation.)

5.1 HED Assessment

The assessment of HEDs for impact on plant safety and operability will be accomplished by the HED Assessment Team (HEDAT). At a minimum, the HEDAT will consist of HFSs, the DCRDR Task Force Leader and Administrator, and the Operations Lead. This team will first review ALL the HEDs generated in the NUREG-0700 review process, and set aside those that will be corrected without question and without a formal assessment of their significance. The remaining HEDs will be subject to formal assessment. The formal assessment will follow the procedure described below.

1. Each HEDAT member will review and evaluate each HED independently on the following factors:

   a. Impact on physical performance (fatigue, discomfort, injury, control suitability, etc.)

   b. Impact on sensory/perceptual performance (stimulus overload, distraction, visibility, readability, audibility, noise, display adequacy, inconsistency with stereotypes and conventions, etc.)

   c. Impact on cognitive performance (mental overload, confusion, stress, sequential/compound/cumulative/interactive errors, etc.)

   d. Interaction with task variables (communication needs, task duration, task frequency, delay or absence of necessary feedback, concurrent task requirements, mission response characteristics

such as accuracy requirements and speed requirements, etc.)

   e. Impact or potential impact on operating crew error

   f. Impact or potential impact on plant safety

   The evaluation will be performed using a five-point scale, and each team member will assign a rating value to each of the factors for each HED assessed.

2. The HEDAT will meet to discuss the ratings they individually assigned to the six factors. The objective of the discussions will be to reach a team consensus on the factor ratings. It will be the DCRDR Task Force Leader's responsibility to facilitate and monitor the team discussions. If, in the TFL's opinion, a consensus is not possible, the TFL will so state, and the rating WILL BE the HIGHEST rating proposed/assigned by any team member. As a consensus is reached, the HFS will tabulate and record the team's decisions.
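The rating rule in step 2 can be sketched as follows. This is a minimal illustration, assuming a 1-to-5 scale with 5 as the most severe; the function name and its arguments are hypothetical, not part of the Program Plan.

```python
def factor_rating(member_ratings, consensus_reached):
    """Resolve one factor's team rating per the step 2 rule.

    member_ratings: the 1-5 ratings assigned independently by each HEDAT
    member; consensus_reached: whether the TFL judged that the discussion
    produced a consensus value.
    """
    for r in member_ratings:
        assert 1 <= r <= 5, "ratings use a five-point scale"
    if consensus_reached:
        # With consensus, all members converge on one value.
        return member_ratings[0]
    # Without consensus, the rating IS the highest rating proposed.
    return max(member_ratings)

print(factor_rating([3, 3, 3], consensus_reached=True))   # 3
print(factor_rating([2, 4, 3], consensus_reached=False))  # 4
```

Taking the maximum when no consensus is reached is a conservative choice: it ensures a disputed discrepancy is never under-rated relative to any member's judgment.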

3. A Human Factors Specialist, with the assistance of a subject matter expert, will review all HEDs under assessment and assign code numbers to each of the following parameters: system, subsystem, panel, component, and human performance requirements.

   These data will be entered into a computer, and correlational matrices will be generated. The matrices will be analyzed for possible interactive or cumulative effects across HEDs. Highly intercorrelated discrepancies will be re-evaluated for significance, and for a possible significance rating change.

4. Using the information and results obtained in the preceding procedural steps, the HEDs will be categorized as an aid in determining: a) if the HED will be corrected, and b) the priority of correction for those that will be corrected.

   Three categories will be used, with four significance levels each. The three categories are presented and defined below:

   Category I: Safety-Related System Discrepancies - Discrepancies associated with safety-related systems will be included in this category. A list of safety-related systems will be provided.

   Category II: Non-Safety-Related System Discrepancies - Discrepancies associated with plant systems not included in Category I will fall in this category.

   Category III: Other Discrepancies - Discrepancies falling in neither Category I nor Category II will be included in this category.

   An HFS, with the assistance of the DCRDR Task Force Administrator, will review the HEDs and assign them to one of the categories listed above. They will also assign them to one of the four significance levels within each category, using the rating values determined earlier.

The four significance levels are:

   Level A - HEDs with documented errors or control board problems

   Level B - HEDs with an overall high significance rating

   Level C - HEDs with an overall moderate significance rating

   Level D - HEDs with an overall minimal significance rating

The above taxonomy deviates slightly from that suggested in NUREG-0700 and NUREG-0801, in that three categories with four levels are proposed rather than four categories with three levels. It is felt that the taxonomy proposed facilitates the assessment process contemplated. In addition, it should be pointed out that, though the taxonomy differs, the intent to treat documented errors as high in significance and priority remains.

Level assignment will be a determining factor in the recommendation to correct HEDs. The HEDs in Levels A and B should be corrected. HEDs in Level C may or may not be corrected, and those in Level D will probably not be corrected; it will be at the station's discretion whether HEDs in these levels are corrected. It will be the responsibility of the Station Operations and Station Project Engineering Departments to decide whether or not to correct Level C and D discrepancies, with input from the LHFS and the DCRDR Program Manager.
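The category and level assignment can be sketched as a small classification routine. This is an illustration only: the 4.0 and 2.5 rating cut-offs are invented for the example, since the Plan leaves the mapping from five-point ratings to levels to the assessment team.

```python
def classify(safety_related, plant_system, documented_error, overall_rating):
    """Assign an HED to a category (I-III) and significance level (A-D)."""
    if safety_related:
        category = "I"    # safety-related system discrepancies
    elif plant_system:
        category = "II"   # other plant-system discrepancies
    else:
        category = "III"  # all remaining discrepancies

    if documented_error:
        level = "A"       # documented errors rank highest in significance
    elif overall_rating >= 4.0:   # assumed cut-off for "high"
        level = "B"
    elif overall_rating >= 2.5:   # assumed cut-off for "moderate"
        level = "C"
    else:
        level = "D"
    return category, level

print(classify(True, True, False, 4.2))  # ('I', 'B')
```

Note that category and level are independent axes: a Category III item with a documented error still lands in Level A, preserving the intent that documented errors stay high in priority.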


5.2 Recommendation Assessment

The selection of recommendations for correction will be done by a DCRDR Recommendation Review Committee. This committee will consist of the Lead Human Factors Engineer, the DCRDR Task Force Leader, the Operations Lead, and a representative from Station Engineering, Training, and Project Engineering.

The committee's first responsibility will be to decide whether or not to recommend correcting HEDs assigned to Levels C and D in the HED assessment process. If a decision is made by Operations or Engineering not to correct the HED, a justification will be prepared and signed by the rejecting person. The committee will then assess the recommendations written by the HFSs for the HEDs that will be corrected. HED recommendations will be considered in an order predicated on their significance and priority as exemplified by their category and level assignments.

The assessment of recommendations is a subjective process by nature. To inject a measure of objectivity into the process, a detailed procedure, outlined below, has been developed to assess HED recommendations.

1. The committee will meet to review the HEDs, one at a time, and discuss each HED's recommendations briefly to clarify any points of concern.

2. Using a rating system, each committee member will evaluate each recommendation on a number of factors. Among the factors will be viability, soundness and feasibility.

3. The LHFS will record and tabulate the results of the committee's deliberations.

4. The recommendation with the best evaluation will be the committee's preferred recommendation and will be submitted to the station Operations and Nuclear Engineering Departments for consideration.

5. The committee may generate its own acceptable recommendations for those HEDs for which none of the proposed recommendations were acceptable. It will be the LHFS's responsibility to ensure that the recommendations developed and accepted by the committee are in accordance with applicable precepts of sound human factors engineering practice.

6. If a decision is made by Operations or Engineering not to accept any of the recommendations, a justification will be prepared and signed by the rejecting person.

5.3 Implementation and Scheduling of Recommendations

The HEDs that should be corrected, and their accepted recommended corrective actions, as determined by the procedures outlined herein, will be delivered to the Marble Hill Operations and Project Engineering Departments for their review. For those they decide to address, a Schedule of Corrective Actions is suggested (Figure 5.2). This schedule is predicated upon NRC recommendations and is subject to the availability of equipment, outage time availability, construction schedules and engineering design lead time.

Category & Level     Schedule for Correction Implementation*

I A                  Prompt
I B                  Prompt
I C                  Near-Term
I D                  Long-Term
II A                 Prompt
II B                 Near-Term
II C                 Long-Term
II D                 Long-Term
III A                Prompt
III B                Near-Term
III C                Long-Term
III D                Long-Term

* See glossary for operational definitions of these terms.

Figure 5.2 Suggested Schedule for Corrective Actions
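The Figure 5.2 schedule is a direct lookup from category and level to a scheduling term. A minimal sketch, illustrative only (the plan does not call for any software):

```python
# The Figure 5.2 mapping, expressed as a lookup table.
SCHEDULE = {
    ("I", "A"): "Prompt",      ("I", "B"): "Prompt",
    ("I", "C"): "Near-Term",   ("I", "D"): "Long-Term",
    ("II", "A"): "Prompt",     ("II", "B"): "Near-Term",
    ("II", "C"): "Long-Term",  ("II", "D"): "Long-Term",
    ("III", "A"): "Prompt",    ("III", "B"): "Near-Term",
    ("III", "C"): "Long-Term", ("III", "D"): "Long-Term",
}

def suggested_schedule(category: str, level: str) -> str:
    """Return the suggested correction schedule for an HED's
    assessment category (I-III) and significance level (A-D)."""
    return SCHEDULE[(category, level)]
```

Note that the mapping is not symmetric across categories: a Level B discrepancy is "Prompt" only in Category I (safety-related systems), and "Near-Term" otherwise.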

6.0 FINAL SUMMARY REPORT

Upon completion of the DCRDR, a detailed summary of the results will be prepared and submitted to the NRC for review. The final report will describe the results of the DCRDR and will be submitted within six months after completion of the review. This report, following the format recommended by the NRC, will summarize the review process, provide descriptions of the identified Human Engineering Discrepancies (HEDs), detail proposed corrective actions and present implementation schedules for each action. Details of the DCRDR, along with complete documentation, will be available for NRC evaluation and review.

The final report will specify the personnel who participated in the Detailed Control Room Design Review and delineate their qualifications. It will also indicate any modifications or revisions made to the Program Plan submitted to the NRC. These may become necessary periodically throughout the DCRDR and will be described by the Review Team in the Report.

A summary of the Operating Experience Review processes and results will be contained in the Final Report. The types of Historical Reports reviewed and the period of time they covered will be provided. The experience levels of the surveyed operators, as well as the procedure used to conduct the survey, will be summarized.

The final report for the DCRDR will provide a summary of the processes involved in the system function review and task analysis and will contain:

o Charts or lists of major systems and subsystems, and their major components
o Task descriptors, organized by system
o System instrumentation and control requirements as identified in the task analysis

Data management procedures used to record review data and provide a data base for the system review will be described. Samples of control room inventory forms and forms used in the control room survey will be provided. Procedures used for verification of task performance capabilities and validation of control room functions will be summarized.

Findings of the DCRDR will be organized according to the chapter headings suggested in NUREG-0700. Each chapter heading will describe identified discrepancies and potential safety consequences and identify the proposed corrective action. Details of the assessment procedure used in this process will be summarized and supporting documentation will be provided. Changes which do not provide a full and complete correction of an identified HED, or decisions to allow a discrepancy (which was assessed to be corrected) to remain, will be justified and information pertinent to such decisions will be provided. Identified design improvements, whether safety-related or not, will be described.

The summary report will address review findings at the individual control room system level based on the control room survey or task analyses. Further discussion will be directed to review findings and solutions identified during the operating experience review, task performance capability verification and operating crew function validation.

A copy of the Operations Personnel Questionnaire used to collect the personnel data, as well as copies of other pertinent forms, will be contained in the appendices. Implemented or proposed design solutions and implementation schedules will be described. Such scheduling will be governed by priorities, and any departure from this prioritization will be explained. This tentative implementation schedule will include a plan to ensure adequate review of planned improvements. Any deviation from the proposed DCRDR methodology described herein will be discussed and appropriate explanation provided.

7.0 BIBLIOGRAPHY

7.1 U.S. Nuclear Regulatory Commission Regulations

TMI-2 Lessons Learned Task Force. Status Report and Short-Term Recommendations (NUREG-0578). Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC: July 1979.

TMI-2 Lessons Learned Task Force. Final Report (NUREG-0585). Office of Nuclear Reactor Regulation, U.S. Nuclear Regulatory Commission, Washington, DC: October 1979.

U.S. Nuclear Regulatory Commission, Office of Inspection and Enforcement, Division of Emergency Preparedness. Functional Criteria for Emergency Response Facilities, Final Report (NUREG-0696). Washington, DC: February 1981.

U.S. Nuclear Regulatory Commission, Office of Nuclear Reactor Regulation, Division of Licensing. Clarification of TMI Action Plan Requirements (NUREG-0737). Washington, DC: November 1980.

U.S. Nuclear Regulatory Commission, Office of Nuclear Reactor Safety, Division of Human Factors Safety. Criteria for Preparation of Emergency Operator Procedures (NUREG-0899). Washington, DC: June 1981.

U.S. Nuclear Regulatory Commission, Office of Standards Development. Regulatory Guide 1.70, Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants (Revision 3, LWR Edition). Washington, DC: November 1978.

U.S. Nuclear Regulatory Commission, Office of Standards Development. Regulatory Guide 1.97, Instrumentation for Light-Water-Cooled Nuclear Power Plants to Assess Plant and Environmental Conditions During and Following an Accident (Revision 2). Washington, DC: December 1980.

U.S. Nuclear Regulatory Commission, Office of Standards Development. Human Factors Evaluation of Control Room Design and Operator Performance at TMI-2 (NUREG/CR-1270). Washington, DC: January 1980.

U.S. Nuclear Regulatory Commission, Office of Standards Development. Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278). Washington, DC: 1981.

U.S. Nuclear Regulatory Commission. NRC Action Plan Developed as a Result of the TMI-2 Accident (NUREG-0660). Washington, DC: May 1980.

U.S. Nuclear Regulatory Commission. Supplement 1 to NUREG-0737, Requirements for Emergency Response Capability (Generic Letter No. 82-33). Washington, DC: December 1982.

U.S. Nuclear Regulatory Commission. Reactor Safety Study: An Assessment of Accident Risks in U.S. Commercial Nuclear Power Plants (WASH-1400; NUREG-75/014). Washington, DC: October 1975.

7.2 Supplementary References

ARD Corporation. Human Factors Engineering Guide for CRT Display Design. Columbia, MD: February 1981.

ARD Corporation. Licensed Operator Control Room Questionnaire. Columbia, MD: December 1980.

Bear, D. E. "Plant Operator's Computer Interface." Instrumentation Technology, October 1975, 29-34.

Bozeman, W. C. "Human Factors Considerations in the Design of Systems for Computer Managed Instruction." Association for Educational Data Systems Journal, Summer 1978, 89-96.

Bouchard, T. J., Jr. "Field Research Methods: Interviews, Questionnaires, Participant Observation, Systematic Observation, Unobtrusive Measures." In M. D. Dunnette (Ed.), Handbook of Industrial and Organizational Psychology. Chicago: Rand McNally College Publishing Co., 1976.

Camm, W., and R. E. Granda (Eds.). Symposium Proceedings: Human Factors and Computer Science. Santa Monica, CA: The Human Factors Society, 1 June 1978.

Chapanis, A. Research Techniques in Human Engineering. Baltimore: Johns Hopkins Press, 1959.

DeGreene, K. B. Systems Psychology. New York: McGraw-Hill, 1970.

Dunnette, M. D., and W. K. Kirchner. Psychology Applied to Industry. Englewood Cliffs, NJ: Prentice-Hall Inc., 1965.

Egeth, H. E. Conditions for Improving Visual Information Processing (Final Report ONR-TR88). Baltimore: The Johns Hopkins University, 31 August 1976.

Engel, J. D. An Approach to Standardizing Human Performance Assessment (NTIS AD-717258). Paper presented at the Planning Conference on Standardization of Tasks and Measures for Human Factors Research, Texas Technological University, Lubbock: March 1970.

Federal Energy Regulatory Commission. The Con Edison Power Failure of July 13 and 14, 1977 (Final Staff Report DOE/FERC-0012). Washington, DC: U.S. Government Printing Office, June 1978.

Flanagan, J. C. "Critical Requirements: A New Approach to Employee Evaluation." Personnel Psychology, 2, (1949): 419-425.

Flanagan, J. C. "The Critical Incident Technique." Psychological Bulletin, 51, (1954): 327-358.

Fleishman, E. A., and A. R. Bass. Studies in Personnel and Industrial Psychology (3rd ed.). Homewood, IL: Dorsey Press, 1974.

Foley, J. P., Jr. Task Analysis for Job Performance Aids and Related Training (NTIS AD-771001). Brooks Air Force Base, TX: Air Force Human Resources Laboratory, November 1973.

Goodstein, L. P., et al. The Operator's Diagnosis Task Under Abnormal Operating Conditions in Industrial Process Plants. Denmark: Danish Atomic Energy Commission, June 1974.

Green, B. F. "A Primer of Testing." American Psychologist, 36, (1981): 1001-1010.

Green, D. M., and J. A. Swets. Signal Detection Theory and Psychophysics. Huntington, NY: Krieger, 1974.

Guilford, J. P. Psychometric Methods. New York: McGraw-Hill, 1954.

Jennings, A. E., and W. D. Chiles. An Investigation of Time-Sharing Ability as a Factor in Complex Performance (FAA-AM-76-1; NTIS ADA 031881). Washington, DC: Department of Transportation, May 1976.

Katz, G. H. Control Panel Design - An Additional Impact on Plant Operation. Paper presented at the Conference on Reactor Operating Experience, Albuquerque, NM: 3-5 August 1975.

Kemeny, J. G. (Chairman). Report of the President's Commission on the Accident at Three Mile Island. Washington, DC: October 1979.

Meister, D., and G. F. Rabideau. Human Factors Evaluation in System Development. New York: John Wiley & Sons, 1965.

Morgan, C. T., et al. Human Engineering Guide to Equipment Design. New York: McGraw-Hill, 1976.

Nunnally, J. C. Psychometric Theory. New York: McGraw-Hill, 1967.

Pack, R. W. Conference Proceedings: Workshop on Power Plant Operator Selection Methods (Special Report EPRI SR28). Palo Alto, CA: Electric Power Research Institute, January 1975.

Pack, R. W. (Ed.). Human Factors Research for the Electric Utility Industry (Draft Report EPRI NP-DI-79). Palo Alto, CA: Electric Power Research Institute, February 1979.

Parks, D. L., and W. E. Springer. Human Factors Engineering Analytic Process Definition and Criterion Development for Computer Aided Function-Allocation Evaluation System (D180-18750-1). Seattle, WA: Boeing Aerospace Company, January 1976.

Pedersen, O. M. An Analysis of Operator's Information and Display Requirements During Power Plant Boiler Start (RISO-M-1738). Denmark: Danish Atomic Energy Commission, December 1974.

Pope, R. H. "Power Station Control Room and Desk Design, Alarm System and Experience in the Use of Cathode-Ray-Tube Displays" (IAEA-SM-226/5). In Symposium on Nuclear Power Plant Control and Instrumentation 1978 (Vol. I). Vienna, Austria: International Atomic Energy Agency, 1978.

Proceedings of the 22nd Annual Meeting of the Human Factors Society. Detroit, MI: Human Factors Society, 16-19 October 1978.

Rasmussen, J. Man-Machine Communication in the Light of Accident Records. Denmark: Danish Atomic Energy Commission, June 1974.

Rasmussen, J. The Human Data Processor as a System Component: Bits and Pieces of a Model (RISO-M-1722). Denmark: Danish Atomic Energy Commission, June 1974.

Richardson, D. C., and W. J. Frezel. "Evaluation of Methods for Sizing and Layout of Control Panels Used in Nuclear Power Generating Stations" (A77140-7). Presented at the IEEE PES Winter Meeting, New York: January 30 - February 4, 1977.

Sabri, Z. A., A. A. Husseiny, and R. A. Danofsky. An Operator Model for Reliability and Availability Evaluation of Nuclear Power Plants (ISU-ERI-AMES-76328). Ames, IA: Iowa State University, May 1976.

Seminara, J. L., and R. W. Pack. Communication Needs of the Nuclear Power Plant Operator (F78700-7). Presented at the IEEE PES Summer Meeting, Los Angeles, CA: July 16-21, 1978.

Seminara, J. L., et al. "Human Factors in the Nuclear Control Room." Nuclear Safety, December 1977: 790.

Shimberg, B. "Testing for Licensure and Certification." American Psychologist, 36, (1981): 1138-1146.

Sinaiko, H. W. (Ed.). Selected Papers on Human Factors in the Design and Use of Control Systems. New York: Dover Publications, 1961.

Appendix

DCRDR Data Collection Forms

A-1

OPERATING EXPERIENCE REVIEW REPORT

System:
Panel Identification Number:
Component Identification Number:
Component Name:
Date:
Reviewers:
Index No.:
Error/Problem:
Operating Status:
Maintenance/Systems Conditions:
Transient Anomalies:
Sequence of Events:
Instrumentation Involved:
Outcome:
Corrective Measures:
Is identified component acceptable?  YES:   NO:
If no, discrepancy number:

A-2

CONTROL ROOM HUMAN ENGINEERING DISCREPANCY RECORD

HFS:              Date:              No:
Plant:            Unit:              System:

Panel ID #:       Equipment ID #:    Equipment Name:

Description of Discrepancy:

Photo Log No.:    Photography Instructions:    Photo Caption:

Guideline No. & Codes:
1. Workspace         6. Labels & Aids
2. Communications    7. Computer/CRT
3. Annunciators      8. Panel Layouts
4. Controls          9. C/D Integration
5. Displays          Other:

Comments:

Assessment Category/Level:  I  II  III

A-3a

RECOMMENDATION(S) RECORD

REVIEWERS:

RECOMMENDATION(S):

ACCEPT RECOMMENDATION NO.:        REJECTION SIGNATURE:
REJECT RECOMMENDATION NOS.:       REJECTION JUSTIFICATION:

IMPLEMENTATION AND SCHEDULING

TENTATIVE SCHEDULED COMPLETION DATE:
PROJECT ENGINEER APPROVAL:
STATION ASST. SUPT. OPS. APPROVAL:
COORDINATOR:
COMPLETED:

A-3b

[Form A-4: text printed vertically in the original; not legible in this copy.]

JOB TITLE:           TASK NO.:
PREPARED BY:         STA. NO.:

TASK DESCRIPTION

ACTION:

ACTION STEPS: (Sequence of what must be done to accomplish ACTIONS)

TASK CONDITIONS: (Givens, Denials, Environment)

Frequency: Shift  Day  Wk.  Mo.  Bi.  Quar.  6 Mos.  Year  Cycle  Other
Once every: ___ Mo.

Initiating Cues: (When does the task start)

Performance Criteria: (What does job incumbent have to accomplish)

Physical Difficulty:
Mental Difficulty:
Safety Related Sys. Importance:
Operational Importance:

A-5

[Form A-6: printed sideways in the original; not legible in this copy.]

Page ___ of ___

EVENT:                      OPERATOR:
PROCEDURE(S):               HUMAN FACTORS SPECIALIST:

PROCEDURE STEP   YES   NO   COMMENT   HED INDEX NUMBER

A-7

AIR VELOCITY SURVEY RECORD

Plant:        Date:        Time:        Sheet ___ of ___
Measurements made by:
Equipment/Instrument used:        Serial #:        Calibration date:

Location        6 ft.        4 ft.

A-8

HUMIDITY/TEMPERATURE RECORD

Plant:        Date:        Time:        Sheet ___ of ___
Measurements made by:
Equipment/Instrument used:        Serial #:        Calibration date:

Time    Height (Floor / 6 ft.)    Temperature    Humidity    Remarks

A-9

LIGHTING SURVEY - LUMINANCE AND REFLECTANCE RECORD

Plant:        Date:        Time:        Sheet ___ of ___
Measurements made by:
Equipment/Instrument used:        Serial #:        Calibration date:

[Measurement column headings not legible in this copy.]

A-10

LIGHTING SURVEY ILLUMINANCE RECORD

Plant:        Date:        Time:        Sheet ___ of ___
Measurements made by:
Equipment/Instrument used:        Serial #:        Calibration date:

Panel I.D. No.        Full        Ambient        Emergency

A-11

SOUND SURVEY RECORD

Plant:        Date:        Time:        Sheet ___ of ___
Measurements made by:
Equipment/Instrument used:        Serial #:        Calibration Date:

Operator Work Station    dB(A)    250    500    1K    2K    4K    Remarks

A-12

STATION:        PHOTOGRAPHER:
UNIT:           DATE:

SLIDE     F      SHUTTER    PHOTO INDEX    SEQUENCE    LOCATION    CAPTION    DESCRIPTIVE
NUMBER    STOP   SPEED      NUMBER                     CODE        CODE       CAPTION

A-13

1) Station Number
2) System Name
3) System Designation Identifier
4) Piping and Instrumentation Diagrams used
5) References
   5.1 System Procedures Used:
   5.2 System References Used:
   5.3 System Characteristics described:
   5.4 System Function(s):
6) System Functions Verified:

Human Factors Specialist

A-14

INDEX OF REVIEWED REPORTS

Index #    Problem Title    Priority*    Report Type    Report Number    Disposition**

*  Priority: H = high, L = low
** Disposition: C = Problem deemed CORRECTED (no additional investigation warranted)
                UC = Problem deemed UNCORRECTED (additional investigation warranted)

A-15

12. During the event simulation, the HFS should observe the event to record work station-work flow information, using the floor diagram of the unit work space prepared earlier as a guide. The information recorded should include the:

A) Direction of movement
B) Sequence of movement
C) Frequency of the movement
D) Estimated time criticality of the movement

An estimate of the time that the operator(s) spent at the work stations can be obtained from the video tapes.

Data Recording

The HFS accompanying the operating crew operator(s) during the event walk-through/talk-through will be evaluating the operator performance versus the control board/control room design criteria, specified earlier in this guideline, for each step of the procedure(s) being used for the event under consideration. A checklist (Validation Review Worksheet, Figure 4.8) will be used to record the HFS's evaluation of each procedure step. If the criteria are met in a particular step of the procedure, the "Yes" column will be checked. If the criteria are not met, the "No" column will be checked, and the discrepancy will be recorded in the "Comments" column. For the event simulation video tape walk-through, the data will of course be recorded on the video tape of the event.

Data Analysis

A number of methods will be used to analyze and process the information obtained in the validation review. The methods

Figure 4.8 VALIDATION REVIEW WORKSHEET

Page ___ of ___

EVENT:                      OPERATOR:
PROCEDURE(S):               HUMAN FACTORS SPECIALIST:

PROCEDURE STEP   YES   NO   COMMENT   HED INDEX NUMBER

will vary according to the type of data collected and the manner in which it was collected. For the data collected by the HFS during the walk-through/talk-through, an HFS will cross-check the comments recorded on the Validation Review Worksheet versus HEDs documented in previous review processes of the DCRDR. If a comment is addressed by one or more existing HEDs, it will be reported as an HED.

The work station-work flow data collected in both the walk-through narrative approach and the event simulation walk-through approach will be compared by an HFS for congruence. Instances of incongruency will be investigated to discern the cause(s). Procedural and/or training modifications may be recommended as a result of these investigations, and a resolution of the incongruencies will be obtained and recorded. Diagrammatic and/or mathematical link analysis techniques will then be employed on the observational data. The results of these analyses will be reported as design modifications to improve/enhance work flow.

The event simulation video tapes will be reviewed and analyzed jointly by the HFS and a subject matter expert. As the tape is reviewed for each procedural step sequence, the subject matter expert will comment on the following:

A) Actions taken
B) Information sources used
C) Conversions or uncertainties involved
D) Controls used
E) Expected system response
F) How those responses would be and/or may be verified
G) Actions to be taken if the expected responses do not occur

H) Additional assistance necessary or desirable from personnel outside the control room (if appropriate)

Using the information supplied by the tapes and the subject matter expert, the HFS will evaluate the operator performances versus the control board/control room design criteria. For each procedural step sequence that meets the criteria, the HFS will check the "Yes" column of the Validation Review Worksheet. For each procedural step sequence that does NOT meet the criteria, the HFS will check the "No" column and record the discrepancy in the "Comment" column. The HFS will then compare the worksheet completed for the simulation walk-through of the event with the worksheet completed for the walk-through/talk-through of the event. Any discrepancy not recorded on the walk-through/talk-through worksheet will represent a new discrepancy via the video tape event simulation approach, and will be reported as such.

Reporting of Analytical and Observational Results

Discrepancies discovered in the evaluation of the walk-through narrative and the video taped event simulation walk-throughs will be recorded on an HED description form (see Appendix).
}}