ML20197G858


Evaluation of Dcrdr Program Plan for San Onofre Nuclear Generating Station Unit 1. Suggested Agenda for Unit 1 in-progress Audit & Units 2 & 3 Preimplementation Audit Encl
Person / Time
Site: San Onofre (Southern California Edison)
Issue date: 03/27/1986
From:
SCIENCE APPLICATIONS INTERNATIONAL CORP. (FORMERLY
To:
NRC
Shared Package
ML13333B403
References
CON-NRC-03-82-096, CON-NRC-3-82-96 TAC-51201, TAC-51202, NUDOCS 8605160502



ENCLOSURE 1

EVALUATION OF THE DETAILED CONTROL ROOM DESIGN REVIEW PROGRAM PLAN FOR SOUTHERN CALIFORNIA EDISON COMPANY'S SAN ONOFRE NUCLEAR GENERATING STATION UNIT 1

March 27, 1986

Prepared by:

Science Applications International Corporation Under Contract to:

The United States Nuclear Regulatory Commission

Contract NRC-03-82-096


EVALUATION OF THE DETAILED CONTROL ROOM DESIGN REVIEW PROGRAM PLAN FOR SOUTHERN CALIFORNIA EDISON COMPANY'S SAN ONOFRE NUCLEAR GENERATING STATION UNIT 1

Science Applications International Corporation (SAIC) has evaluated the Program Plan (Reference 1) submitted by Southern California Edison Company (SCE) for conduct of a Detailed Control Room Design Review (DCRDR) at the San Onofre Nuclear Power Station, Unit 1 (SONGS-1).

The purpose of the evaluation is:

1. To determine whether the planned program will result in a successful DCRDR.

2. To determine whether an in-progress audit or meeting is necessary.

3. To provide an audit agenda where appropriate.

4. To provide constructive feedback to SCE.

The evaluation was conducted relative to the requirements of Supplement 1 to NUREG-0737 (Reference 2). Additional guidance was provided by NUREG-0700 (Reference 3) and Section 18-1, Revision 0, of NUREG-0800 (Reference 4).

This report provides the results of the evaluation.

DISCUSSION

Establishment of a qualified multidisciplinary team.

The organization for conduct of a successful DCRDR can vary widely, but is expected to conform to some general criteria.

Overall, administrative leadership should be provided by a utility employee.

The DCRDR team should be given sufficient authority to carry out its mission.

A core group of specialists in the fields of human factors engineering and nuclear engineering is expected to participate, with assistance as required from other disciplines.

Staffing for each technical task should bring appropriate expertise to bear. Human factors expertise should be included in the staffing for most, if not all, technical tasks.

Finally, the DCRDR team should receive an orientation which contributes to the success of the DCRDR.

Section 18-1, Appendix A, of NUREG-0800 (Reference 4) describes criteria for the multidisciplinary review team in more detail.

The licensee's Program Plan describes the composition of the review team as having a core group of specialists in the fields of human factors engineering, plant operations, and nuclear and electrical/instrumentation and controls (I&C) engineering.

This core group includes personnel who are also knowledgeable in licensing, training, program management, and other NUREG-0737, Supplement 1 programs such as SPDS and the upgrade of EOIs. Resumes for most members of this core group were provided. (Resumes for Station Technical Advisor M.B. McKinley and Station Engineer A.T. Herring were not included.) A review of this information shows the core review team will have both the multidisciplinary composition and qualifications suggested in NUREG-0700. However, the Program Plan does not mention whether the core group will be supplemented by personnel with other expertise that may be needed for lighting and noise surveys.

The ultimate responsibility for the DCRDR will reside with SCE management personnel.

However, the review team will be responsible for the planning, scheduling, coordinating, and integration of CRDR activities. The DCRDR Project Director, Donald Burgy, of General Physics Corporation will serve as the primary contact between the project manager and the senior members of the corporate staff.

The Project Manager, Dr. L. Schroeder, also of General Physics Corporation, will provide technical direction and serve as the primary client contact. He will be responsible for the preparation of project reports, control of schedule and budget, and will also provide human factors engineering input to the DCRDR team.

Qualified human factors consultants will be assigned project work, particularly with respect to human factors technical issues, and will report directly to the project manager. The licensee has provided a functional CRDR team organization, but has not described specific task assignments and levels of effort of the review team members. The licensee should describe the specific task assignments of each person, such as who will participate in the task analysis, and the levels of effort assigned to the team members.

In conclusion, the personnel comprising the DCRDR core team generally meet the qualifications recommended in NUREG-0700, and the licensee should be able to accomplish the DCRDR in accordance with NUREG-0737, Supplement 1 requirements.

However, the two requested resumes and task assignments by name (person), discipline, and level of effort should be provided for each DCRDR element.

Function and Task Analysis.

The objective of the system function and task analysis that will be performed at SONGS-1 is to establish the input and output requirements of control room operators' tasks under emergency conditions. To accomplish this, the licensee proposes to conduct a top-down approach to the function and task analysis, beginning with an identification of systems and system functions using SONGS-1 Emergency Operation Instructions (EOI) Bases Documents, System Descriptions, and other existing plant documentation. The function(s) of the system(s) and the conditions under which they are used will be described.

The licensee then proposes to define a set of scenarios using SONGS-1 safety and safety-related systems and function descriptions.

Each scenario will be briefly described in order to establish the limits and conditions of the events to be analyzed.

The licensee indicates that scenarios chosen will be checked to ensure that they adequately sample "various" emergency conditions and the plant systems and functions used in those conditions.

SONGS-1 plant-specific EOIs will also be identified at this point.

As a first step in the actual task analysis, operator tasks required in each scenario will be identified from the Westinghouse Owners Group (WOG) Emergency Response Guidelines (ERGs).

Next, the operator tasks will be broken down into plant-specific steps using the EOIs to reflect a step-by-step procedural set of actions that must be carried out in order to accomplish the task.

The task steps along with a brief description of the tasks will be recorded on Task Analysis Worksheets.

These two steps will be completed by operations, engineering, and human factors personnel.

Thirdly, operator decisions and actions linked to task performance will be identified and recorded on the Task Analysis Worksheets.

Branching points in the EOIs that determine the outcome of the operating sequence will also be recorded.

Finally, information and control requirements for successful task performance will be developed and marked on the Worksheets. Recorded variables will include system components, specific parameter values, "relevant characteristics," and procedural information. This information will be based on "EOIs, EOI Bases Documents, and Technical Specifications" (p. 3-16). The licensee notes that the preceding four steps will be completed independent of the control room.

However, the licensee neglects to indicate who will be completing steps 3 and 4.

The Function and Task Analysis as described in the Program Plan is fairly comprehensive in scope. However, several concerns have surfaced from a review of the licensee's description.

First, the licensee states that the set of plant systems and subsystems selected for the analysis will be "comparable" to safety and safety-related systems from the WOG Emergency Response Guidelines (ERGs). The reviewers are concerned that this set may not include all safety and safety-related systems from the WOG ERGs (except for plant-specific deviations).

Similarly, the licensee indicates a "representative" set of scenarios will be defined for the scope of the analysis.

Without assurance that the systems selected will be all-inclusive (except for plant-specific deviations) and that the scenarios selected will encompass all operator emergency functions and tasks, the reviewers cannot conclude that the task analysis will be complete.

Second, it appears to the reviewers that the task analysis may be descriptive rather than prescriptive. The Program Plan states that "the operator tasks will be analyzed using the selected plant-specific EOIs as a starting basis" (p. 3-15).

The discussion goes on to indicate that the specific values for information and control requirements will be based on the EOIs.

If the EOIs are written at a high level such that they will not bias the task analysis, then this approach may be satisfactory. However, if the EOIs have been written to accommodate the existing control room design, the independence of the task analysis will be questionable. In other words, the licensee's task analysis would describe what exists in the control room rather than prescribe what should be in the control room in order to comprehensively and objectively identify missing and/or unsuitable controls and displays.

Based on the limited information provided in the Program Plan, we cannot determine which is the case.

SCE should provide a more detailed discussion of the EOIs and their use in the task analysis.


Third, the Program Plan defines the information and control requirements that will be developed in the task analysis as "system component/parameter" and "relevant characteristics" (p. 3-19). The reviewers are not clear what "relevant characteristics" will be developed. According to Figure 12 in the Program Plan, the task analysis worksheet will contain information and control characteristics including "type of component, range, units, positions."

If this is an all inclusive list of characteristics, it is incomplete.

Information and control requirements should include the required precision and accuracy of instruments.

Without a more precise definition of "relevant characteristics," the thoroughness of the task analysis cannot be determined.

Fourth, there is a concern regarding a review of the Task Analysis Worksheets using SONGS-1 experience.

Figure 8 (p. 3-12) depicts the activities comprising the task analysis and indicates that the Task Analysis Worksheet will be reviewed by SONGS-1 operations following the task analysis.

No further discussion of this activity is provided in the text of the Program Plan.

If this step means that operations engineers will review the task analysis results for accuracy, then the reviewers find this to be acceptable.

However, the reviewers do not know how and why SONGS-1 operations will modify the task analysis and whether it will jeopardize the independence of the analysis.

To clarify the nature and purpose of this step, further information should be provided.

Fifth, SCE has omitted information describing who will perform the final two steps in the task analysis.

The Program Plan indicates who will complete the first two steps (identifying task steps; defining tasks) but not who will identify operator decisions/actions and develop the information and control requirements. Additionally, an account of the control room inventory appears before the task analysis in the Program Plan. If this reflects the order in which these activities will be performed and the same personnel will be involved in both, then the results of the task analysis will be biased. The reviewers need to know the order of these activities and the personnel performing each step in these activities.

In summary, more detail on the task analysis methodology should be provided to assure the NRC that its completeness and objectivity will result in a successful review process. Specifically, SCE should present documentation describing the use of EOIs in the task analysis as well as how and why the Task Analysis Worksheets will be reviewed using SONGS-1 experience.

In addition, SCE should provide the NRC with some assurance that all safety and safety-related systems from the WOG ERGs will be included in the task analysis and some indication of the scenarios that will be used.

Finally, SCE should provide more explicit information concerning the meaning of "relevant characteristics" and the personnel conducting all steps of the task analysis.

The licensee should address these concerns in order to assure the NRC that this requirement in Supplement 1 to NUREG-0737 will be satisfied.

A Comparison of Display and Control Requirements With a Control Room Inventory.

The SONGS-1 DCRDR effort will include the compilation of a control room inventory followed by a verification of task performance capabilities.

As both activities are necessary to satisfy the inventory requirement of Supplement 1 to NUREG-0737, both are described below.

The licensee proposes to use the results of the task analysis to judge the adequacy of the control room inventory. Following the specification of tasks, decision requirements, and information and control requirements, SCE plans to document the existing instrumentation and controls (I&C) that the operator uses or can use for each procedural step. The licensee indicates that the results of this phase will be a control room inventory listed on the Task Analysis Worksheet in the column "I&C Identification" (p. 3-23).

The parameter, range, scaling units, and related information will be documented in a separate listing.

The presence or absence of the information and control requirements that were identified in the task analysis as required by operators will be determined first. This will be accomplished by "comparing the postulated requirements in the 'Information and Control Requirements' column of the Task Analysis (Worksheet) to the actual control room I&C listed in the 'I&C Identification' column" (p. 3-22). Whenever required information and controls are not available to the operator, an HED will be documented.
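The availability check described above is essentially a column-by-column comparison: each postulated requirement from the task analysis either matches an entry in the control room inventory or is documented as an HED. A minimal sketch of that comparison follows; the instrument names are invented for illustration and do not appear in the Program Plan.

```python
# Hypothetical illustration of the I&C availability check: required
# information and controls from the task analysis are compared against
# the control room inventory, and unmet requirements become HEDs.
# All instrument names below are made up for the example.

required_ic = {"pressurizer pressure indicator",
               "auxiliary feedwater flow indicator",
               "MSIV position indicator"}
inventory_ic = {"pressurizer pressure indicator",
                "MSIV position indicator"}

# Requirements with no matching I&C in the control room become HEDs.
missing = sorted(required_ic - inventory_ic)
heds = [f"HED: required I&C not available to operator: {name}"
        for name in missing]
```

The suitability check that follows in the text is a separate, deeper pass: even an instrument that is present must still be compared against its required range, scaling, and usability.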


The suitability of information and control needs will be determined by comparing available I&C with human engineering criteria. SCE plans to check suitability by asking three main questions:

1) Does the equipment provide appropriate information/feedback for the task?

2) Does the equipment provide actual (direct) system status information?

3) Is the equipment usable?

The licensee's suitability check will determine, "for example, if a meter... has the appropriate range and scaling to support the operator in the corresponding procedural step" (p. 3-23). If the meter is not appropriate, it will be recorded as such on the Task Analysis Worksheet and documented as an HED.

The principal concern with the described approach goes back to the concern with the task analysis.

As indicated earlier, the licensee intends to use the results of the task analysis to judge the adequacy of the control room inventory (I&C availability).

One of the concerns noted regarding the task analysis is that procedure steps may be written around existing instrumentation and controls in the control room. Given the concerns identified in the previous section, it appears that the verification of I&C availability may be jeopardized.

If the task analysis is unduly influenced by the existing control room and EOIs, then the task analysis results, when compared with the control room, will yield few findings.

Another concern with the verification of I&C availability regards the information and control characteristics identified in the control room inventory.

The Program Plan states that "parameter, range, scaling units, and related information will be documented" (p. 3-23). The reviewers are not clear what "related information" will be documented. There is a concern that "related information" will not include scale divisions or increments.

Without a more precise definition of "related information," the thoroughness of the comparison of display and control requirements with the control room inventory cannot be determined.


The next step of the review process will be the validation of control room functions. The purpose of this step is to examine the interactions and dependencies of the operating crew and equipment.

Additionally, this step provides for the identification of HEDs that may not have been identified in other DCRDR steps.

The licensee's methodology for conducting a validation of control room functions appears to be adequate.

The use of walk-throughs with operators should yield valuable findings.

However, there are a few concerns related to this validation. First, it is not clear whether auxiliary operator tasks will be included in the validation of control room functions, since there is no precise outline of the scope of the walk-throughs. Information gathered in an evaluation of tasks allocated to auxiliary operators would prove valuable to SCE in ensuring that those capabilities are needed in the control room. Second, SCE plans to use a full-scale photographic mock-up to conduct the walk-throughs. The Program Plan does not indicate how limitations of the mock-up will be addressed or how current the mock-up is.

Third, once again there is no indication of who will be involved in this phase of the DCRDR. The reviewers cannot be sure that a human factors specialist and other essential personnel will conduct the walk-throughs.

In conclusion, until the licensee provides information to demonstrate that the task analysis is unbiased and that its use of the EOIs will be objective and unbiased, a satisfactory assessment cannot be made of the comparison of display and control requirements with the control room inventory. Also, SCE should provide more explicit information concerning the information and control characteristics to be documented in the control room inventory.

Finally, the effort should include the participation of a human factors specialist in addition to the operations expert and I&C engineer mentioned in the Program Plan.

Regarding the validation of control room functions, SCE should clarify the reviewers' questions concerning the scope of the walk-throughs, the evaluation of auxiliary operator tasks, and the use of a photographic mock-up, and provide a complete list of the participating personnel.


Control Room Survey.

The key to a successful control room survey is a systematic comparison of the control room against accepted human engineering guidelines.

One accepted set of human engineering guidelines is provided by Section 6 of NUREG-0700. Discrepancies between the control room and human engineering guidelines should be documented as HEDs.

The purpose of the control room survey described in the Program Plan, on page 3-6, is consistent with the survey requirement in Supplement 1 to NUREG-0737.


The licensee mentions on page 3-8 of the Program Plan that the survey of the control room will be conducted using human engineering guidelines contained in NUREG-0700.

It further mentions on p. 3-9 that the survey will be performed by a team composed of human factors engineers and operations personnel in the SONGS-1 control room, or in the mock-up for those guidelines that are applicable. However, the Program Plan does not indicate how SCE plans to address dynamic criteria, such as the reach envelope and brightness of indicators, that cannot be evaluated using a two-dimensional mock-up.

The checklists have been designed to provide a "yes" or "no" response and contain both quantitative and qualitative statements.

Samples of the checklists, which could have aided our evaluation of the methodology, were not provided; however, an HED form was provided. The HED form indicates that discrepancies will be documented with information for assessment purposes and future retrieval during the corrective phase. In addition, HED information will be stored in a CRDR Computer Database System.

In conclusion, it appears that the licensee has developed a complete program to accomplish the control room survey. As indicated, a team will be assembled with the necessary materials to conduct a systematic survey. However, samples of the checklists should be provided to aid the reviewers in evaluating the methodology. The licensee should also further clarify the evaluation of dynamic criteria.

Although not an explicit requirement of Supplement 1 to NUREG-0737, the inclusion of the remote shutdown capability in the licensee's DCRDR is strongly recommended.


Assessment of Human Engineering Discrepancies (HEDs).

The objective of this requirement of Supplement 1 to NUREG-0737 is to assess Human Engineering Discrepancies (HEDs) identified in the review phase in order to determine which HEDs are significant and should be corrected.

In SCE's Program Plan, the assessment of HEDs does not constitute a separate phase from the selection of design improvements.

However, these two activities are performed sequentially.

First, HEDs will be assessed and, second, design improvements will be selected. Both steps will be done by the HED Assessment Team.

The HED assessment results and proposed design improvements will then be evaluated by an Evaluation Team and, finally, authorized or rejected by a Site Change Committee.

Apparently, the actual assessment of HEDs identified in the review activities will be performed by the Assessment Team. The Assessment Team will review each identified HED to "verify that it is in fact an HED" (p. 4-1). The Program Plan provides a list of questions to be applied in determining whether an HED should be an HED (Table 1, p. 4-3).

HEDs that are not "deleted" as a result of this review will then be assigned a priority based on the potential for error, safety significance, and technical specification conformance. These three factors were considered for their relative importance, and corresponding weights were assigned (potential for error 0.555, safety significance 0.167, unsafe condition or technical specification violation 0.278).

When assessing an HED, each factor will be given a relative magnitude on a scale ranging from 0 (none) to 5 (very high or documented) to determine the scale magnitudes of factors 1, 2, and 3 (M1, M2, and M3). The scale magnitude of each factor will be multiplied by the relative weight of each factor (W1, W2, and W3), and the products summed to get a point value for each HED. The point value determines the assignment of HEDs to nine priority levels. The higher the point value of the HED, the more critical the need for correction.

Priority levels 1, 2, and 3 appear to be safety-related, whereas priority levels 4 to 7 appear to be nonsafety-related but with a high potential for error, etc. Priority levels 8 and 9 appear to be neither safety-related nor significant.
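As a reading aid, the weighted scoring described above can be sketched as follows. The weights are those quoted in the Program Plan; the function name and the validation logic are our own illustration, and the Plan does not state the thresholds that map point values to the nine priority levels, so no such mapping is shown.

```python
# Sketch of the HED point-value computation described in the Program Plan:
# each factor's scale magnitude (0 = none, up to 5 = very high/documented)
# is multiplied by its weight and the products are summed. The weights are
# those quoted in the Plan; everything else here is illustrative.

WEIGHTS = {
    "potential_for_error": 0.555,   # W1
    "safety_significance": 0.167,   # W2
    "tech_spec_violation": 0.278,   # W3
}

def hed_point_value(magnitudes):
    """Return the sum of Wi * Mi over the three assessment factors."""
    for factor, m in magnitudes.items():
        if factor not in WEIGHTS or not 0 <= m <= 5:
            raise ValueError(f"bad factor or magnitude: {factor}={m}")
    return sum(WEIGHTS[f] * m for f, m in magnitudes.items())

# The weights sum to 1.0, so an HED rated 5 on every factor scores the
# maximum point value of 5.0 (up to floating-point rounding).
max_score = hed_point_value({"potential_for_error": 5,
                             "safety_significance": 5,
                             "tech_spec_violation": 5})
```

Note how the structure of the formula underlies the reviewers' concerns below: a single low factor score pulls down the whole point value, so the choice of weights directly determines which HEDs reach the safety-related priority levels.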

There are several concerns regarding SCE's proposed plan for assessing HEDs. First, it is not clear why it is necessary to examine discrepancies identified in the review activities to verify that they are really HEDs.


There is a concern that discrepancies of importance (potential for error or safety-significant) could be discarded prior to the formal assessment and assignment to a priority for correction.

Furthermore, it is not clear precisely how each discrepancy will be "verified" using the criteria provided in Table 1 (pp. 4-3 & 4-4).

How many "Yes" or "No" answers are required to determine whether the observation is, in fact, an HED? Does each criterion carry equal weight in the decision-making process?

SCE should thoroughly explain this verification process.

A second concern regards the method for categorizing and establishing priorities for correction of HEDs.

From the discussion provided in the Program Plan, it is not clear how SCE arrived at the three weight factors.

While assignment of 55% to potential for error may be reasonable, though a little high, the rationale supporting the assignment of 16% to degree of safety significance and 27% to unsafe condition and/or technical specification violation, respectively, is not understood and seems unreasonable. Obviously, the latter two factors are related; however, without more detail regarding what each of these factors contains, the weights assigned seem disproportionate.

For example, a low score on "technical specification violation" would decrease the entire assessment of an HED. Also, a zero assigned to "potential for error" does not seem valid, and the assessment results would be inaccurate. A more detailed discussion of the weighting of these factors is necessary in order to understand and fully assess this prioritization method.

In an organization chart entitled "HED Processing" (p. 4-8), SCE states in the first note at the bottom that an "HED may be processed with a majority of team members present." This implies that an HED can be assessed (and design improvements selected) without all team members present.

The reviewers are concerned that the necessary expertise will not be involved in the assessment of the HEDs (and the selection of design improvements).

We are particularly concerned that HEDs could be assessed without the input of the human factors consultant (HFC).

SCE needs to affirm that all necessary personnel, including the HFC, will participate in the assessment of HEDs.

In the third note, the organization chart also states that "delegated replacements" can participate in the place of designated members. The Program Plan provides no indication as to the identity of these replacements. The reviewers' concern is that the replacements will not have adequate backgrounds and will not be as closely involved in the DCRDR as the designated team members. SCE should provide more information concerning the personnel who will serve as replacements on the Assessment and Evaluation Teams in order to assure the NRC that qualified personnel will be assessing and resolving HEDs.

The reviewers also have a question regarding the assessment of cumulative effects of the HEDs.

In the Program Plan, SCE states that individual Priority 9 HEDs will be reviewed by the DCRDR team for potential cumulative or interactive effects.

HEDs found to have a potential interactive effect will be reevaluated and reassigned to a higher priority as necessary.

It is not clear why only Priority 9 HEDs will be examined for cumulative effects. The licensee would have a more thorough assessment if all Priority 4 through 9 HEDs were examined for cumulative or interactive effects and reassigned to a higher priority as necessary.

A final question is the sequence for assessing HEDs and selecting design improvements.

On the HED form provided on page 3-10 of the Program Plan, recommendations for design improvements appear before the category rating.

The reviewers are concerned that if this reflects the order in which these steps will be executed, then this process will lead to an assessment biased by the recommendations and not based purely on plant safety and the probability of operator error.

In summary, SCE has not clearly defined all aspects of the process for assessing HEDs. In particular, SCE should provide a more detailed explanation of the procedure for verifying that HEDs are really HEDs, as well as the procedure for weighting the factors used in assigning priorities for correction.

SCE also needs to clarify the personnel serving on the Assessment and Evaluation Teams and assure the NRC that the HFC and other essential personnel will be assessing and prioritizing all HEDs.

Finally, the Program Plan would be enhanced if SCE assessed all lower-priority HEDs for cumulative effects and clarified the sequence of HED assessment and selection of design improvements.

Until the requested information is provided, it cannot be determined whether the process described by the licensee will be satisfactory to meet this requirement in NUREG-0737, Supplement 1.


Selection of Design Improvements.

According to NUREG-0737, Supplement 1, the purpose of selecting design improvements is, as a minimum, to correct safety-related HEDs.

Selection of design improvements should include a systematic process for the development and comparison of alternative means of resolving HEDs.

As stated earlier, the assessment of HEDs and the selection of design improvements will not be performed in independent phases. From a review of the Program Plan, it appears that HEDs will be assessed and resolved by the HED Assessment Team, then evaluated by the HED Evaluation Team and authorized or rejected by the Site Change Committee (SCC).

While the Program Plan provides some description of the assessment process, the process for selecting design improvements has not been described. There is no description of the techniques to be used in selecting the design improvements. Furthermore, the Program Plan does not indicate how alternative corrective actions will be judged, or the criteria to be used in selecting one corrective action over other alternatives.

Without a detailed description of the process for selecting design improvements, an adequate evaluation cannot be made of this requirement.

Since it appears that there is no systematic process for the development and comparison of alternative means of correcting HEDs, corrections may float from committee to committee in a piecemeal fashion.

It is strongly recommended that design improvements be presented as a package to the SCC for approval.

By presenting them as a package, the SCC can get a better overall picture of the design improvements.

Another concern arises regarding the composition of the SCC.

According to Section 4.3 (p. 4-13) of the Program Plan, the SCC will have "overall responsibility and authority to review and accept or reject the scope, priority, budget category, and schedule" of the recommended corrective actions.

Yet, there is no indication that human factors (HF) considerations will be taken into account in their approvals and judgments. The SCC appears to have no HF expertise and, consequently, the reviewers are concerned that the committee will not receive the necessary HF input when making final decisions concerning corrective actions for HEDs.

SCE should provide some evidence that the SCC will have the necessary HF expertise and input when deciding on the fate of recommended corrective actions.

One final concern deals with the implementation schedule. The Program Plan indicates that the development of a schedule to correct HEDs will be based on several factors, including "category assigned, complexity of modification, additional engineering study requirements, engineering and equipment lead time requirements, (and) plant scheduling constraints" (p. 4-14).

However, SCE stated earlier that each HED will be assigned to a priority level which will determine when the HED will be corrected.

According to the established priority levels, all HEDs of Priority 1, 2, and 3 are to be corrected promptly.

If the development of the implementation schedule is based too heavily on factors other than safety, HEDs with high safety significance or potential for error (Priority 1, 2, and 3) may not be corrected immediately.

The implementation of HED corrective actions should be performed in an integrated fashion, with the schedule determined by the highest-priority HEDs in the change package.

In summary, SCE has not presented a methodology for selecting design improvements.

SCE should provide a detailed description of this process to allow us to understand and assess this requirement fully.

In addition, SCE should demonstrate that the SCC's final approval of HED corrective actions will take into account HF considerations and that the implementation schedule will result in the immediate correction of Priority 1, 2, and 3 HEDs.

In order to demonstrate the knowledge and commitment necessary to select and implement design improvements successfully and meet this requirement of Supplement 1 to NUREG-0737, SCE needs to present documentation providing this information.

Verification That Selected Design Improvements Will Provide the Necessary Correction and Not Introduce New HEDs.

The process of verifying that selected design improvements correct an HED without introducing new ones can be accomplished in a variety of ways.

NUREG-0800 suggests that the use of mock-ups is valuable for exploring instrument arrangements, addition or deletion of instruments, and surface enhancements.

Reapplication of survey guidelines associated with design changes, control room operator walk-throughs, and a re-examination of information and control needs are possible approaches toward meeting this requirement.

The licensee states on page 4-13 of the Program Plan that a review is conducted by the HED evaluation team to validate that the corrective actions do not create new HEDs.

However, SCE does not present the methods by which it will verify and validate that changes resulting from the DCRDR are effective. The photographic mock-up could potentially be used for evaluating proposed changes.

Until the licensee presents such information, a complete review of this process cannot be accomplished.

Coordination of DCRDR With Other Programs.

This requirement of Supplement 1 to NUREG-0737 states that "improvements that are introduced should be coordinated with changes resulting from other improvement programs such as SPDS, operator training, new instrumentation (Regulatory Guide 1.97, Rev. 2), and upgraded operating procedures" (p. 11). SCE presents a brief commentary on p. 1-1 in the Program Plan for the following improvement programs:


- Emergency Operating Instructions (EOIs)
- Safety Parameter Display System (SPDS)
- Regulatory Guide 1.97
- Emergency Response Facilities (ERFs)

Of these four programs, only the need to coordinate the DCRDR with the SPDS and Reg. Guide 1.97 has been recognized by SCE.

The Program Plan does not present any specific plans to address EOIs or ERFs in the DCRDR. In addition, no mention is made of coordinating changes resulting from the DCRDR with operator training programs.

A discussion of SCE's plans for coordinating the SPDS and Reg. Guide 1.97 with the DCRDR is presented in Section 6.2 (pp. 6-1 to 6-1).

While this discussion indicates that these programs will be coordinated and integrated with the DCRDR, there is no description of precisely how this coordination and integration will be effected.

For instance, SCE does not describe how the inputs from the task analysis can be used in designing the SPDS or how these inputs can be used in verifying new instrumentation introduced to the control room as a result of Reg. Guide 1.97.

Furthermore, the Program Plan does not mention the management personnel responsible for coordinating these programs or the mechanism for performing this coordination. Also, SCE does not provide a timetable or schedule showing the interaction of these programs. Although the licensee mentions an integrated living schedule, more specific information should be provided before the Summary Report submittal.

Another concern is the licensee's intention to use the SPDS as a means for resolving HEDs and to delay its implementation.

The Program Plan states that "SPDS criteria development will be delayed until the role of the SPDS in resolving control room HEDs is established" (p. 6-2). When using the SPDS for resolving HEDs, the following concerns should be taken into account.

First, the SPDS is not expected to be a qualified 1E instrument.

If it is used to provide missing information and is referenced in the EOIs, then it must be a qualified 1E instrument.

Second, operators often do not consider the SPDS to be reliable or worth consulting. Consequently, in an emergency situation operators are more likely to use the paneled hardwired instruments that they have used for years. Third, the SPDS is sometimes used by the shift technical supervisor (STS) to provide a high-level view of the system.

If this is the case, then the use of the SPDS by operators to provide missing parameters or instrumentation would be a conflict. However, SCE has not stated who the user(s) will be.

Therefore, it is not possible to assess the suitability of the SPDS to provide operator information needs.

In summary, SCE has not demonstrated an awareness of the potential contributions the various improvement programs can make to one another.

SCE should present a clear description of the process for coordinating all improvement programs, including ERFs and training. The description should include a discussion of the personnel involved in the integration as well as a timetable for the integration.

In addition, SCE should address the concerns regarding the use of the SPDS as a means for resolving HEDs.

Until shown otherwise, the reviewers must conclude that SCE has not demonstrated the knowledge or the commitment necessary to coordinate these programs successfully and meet this NUREG-0737, Supplement 1 requirement.


Operating Experience Review

Although not a requirement of Supplement 1 to NUREG-0737, a review of operating experience is beneficial to the success of the DCRDR.

The SCE Program Plan indicates that an operating experience review will be conducted as part of the DCRDR at SONGS-1.

Consistent with guidelines in NUREG-0700, a review of plant operating experience, administration of questionnaires to operators, and operator interviews will be included.

The licensee plans to review historical documents using specified guidelines and criteria to identify human factors implications for specific events.

Documents to be reviewed include Licensee Event Reports (LERs), Station Incident Reports (SIRs), Station Problem Reports (SPRs), and Nonconformance Reports (NCRs).

SCE plans to use a Historical Documentation Review Summary Form (similar to Figure 5 on p. 3-3 of the Program Plan) to summarize and document control room human factors problems identified in historical reports.

Findings from this review activity will receive further consideration in subsequent DCRDR activities.

SCE's operator survey effort entails distribution of confidential questionnaires to at least 50% of licensed operations staff.

They will be distributed by Human Factors Consultants (HFCs) to ensure uninhibited responses.

SCE plans to develop an interview format based on responses.

HFCs will then conduct follow-up interviews in the control room or mock-up.

Finally, the DCRDR Team will review the data to ascertain whether the concerns encountered are Human Engineering Discrepancies (HEDs) and document HEDs on an HED form.

In summary, SCE has proposed an extensive operating experience review consistent with NUREG-0700 guidelines and objectives.

To ensure that survey and interview questions are simple, clear, and objective, it is recommended that the survey instruments and procedures be pretested.

A plan for analysis of open-ended responses and procedures to ensure confidentiality and anonymity of respondents should be developed.

If these issues are resolved, SCE's operating experience review, as proposed, should augment the total DCRDR effort.


Conclusion

SCE has submitted a Program Plan which addresses most of the DCRDR requirements to a sufficient degree to permit an understanding of the manner and process with which it will satisfy Supplement 1 to NUREG-0737. Information in the Program Plan has led the reviewers to conclude the following:

• An acceptable multidisciplinary review team has been established by SCE. However, two resumes and task assignments by name (person), discipline, and level of effort should be provided for each DCRDR element.

• The operating experience review, if conducted as proposed, should augment the DCRDR.

• Human factors personnel should play a greater role in the task analysis, verification of task performance capability, validation of control room functions, and HED assessment activities.

• Documentation describing the use of EOIs in the task analysis, as well as the purpose and nature of the review of the Task Analysis Worksheets using SONGS-1 experience, should be provided. In addition, documentation describing the scenarios from which the tasks will be derived should be provided.

• Assurance that the process used to conduct a verification of task performance capabilities will identify missing and/or unsuitable controls and displays should be further demonstrated.

• The process of verifying that HEDs are real HEDs and the method for establishing priorities for correction of HEDs should be explained in greater detail.

• The procedures and criteria by which the licensee will select design improvements should be described.

• The procedures or mechanism by which the licensee will verify that selected design improvements will provide the necessary correction and that no new HEDs are introduced should also be described.

• The process for coordinating all improvement programs, including EOIs, ERFs, and training, should be outlined and a timetable for the overlap of programs indicated.

• The exact role of the SPDS in correcting HEDs and the delay of its implementation should be explained further.

Overall, the Southern California Edison Company (SCE) has provided a Program Plan that will essentially guide a DCRDR. However, in view of the foregoing conclusions, which stem from inadequate information, and in order to gain greater confidence in the licensee's understanding of and commitment to meeting the requirements of Supplement 1 to NUREG-0737, an in-progress audit should be conducted.


REFERENCES

1. "Control Room Design Review Program Plan," Southern California Edison Company, San Onofre Nuclear Generating Station, Unit 1, December 1985.

2. NUREG-0737, Supplement 1, "Requirements for Emergency Response Capability," USNRC, Washington, D.C., December 1982; transmitted to reactor licensees via Generic Letter 82-33, December 17, 1982.

3. NUREG-0700, "Guidelines for Control Room Design Review," USNRC, Washington, D.C., September 1981.

4. NUREG-0800 (Standard Review Plan), Revision 0, Section 18-1 and Appendix A to Section 18-1, September 1984.


Suggested Agenda for an In-Progress Audit

The following audit procedures and agenda are suggested for an in-progress audit.

1. Hold a kick-off meeting with the licensee to present and discuss identified strengths and weaknesses in the Program Plan. The licensee should present any clarifications and additional information/documentation at this time. The development of action items and the identification of any modifications to the remaining steps in the in-progress audit will occur at the end of the meeting.

2. The in-progress audit team should review the following areas of the licensee's DCRDR:

• Qualifications and Structure of the DCRDR Team
  - Provisions for the two requested resumes.
  - Levels of involvement per activity for each member of the Review Group.

• Function and Task Analysis
  - Provisions for ensuring that systems and subsystems will include all safety and safety-related systems from the WOG ERGs except plant-specific deviations.
  - Description of scenarios.
  - Nature of the EOIs intended as the basis for the task analysis.
  - Definition of the term "relevant characteristics."
  - Nature and purpose of the review of the Task Analysis Worksheets using SONGS-1 experience.
  - Personnel conducting the various steps in the task analysis.

• Comparison of Display and Control Requirements With a Control Room Inventory
  - Provisions for ensuring that the process used to conduct a verification of task performance capabilities will identify missing and/or unsuitable controls and displays.
  - Definition of the term "related information."
  - Participation of a human factors specialist.

• Control Room Survey
  - Provision of samples of the checklists.
  - Clarification of the process for evaluating dynamic criteria.
  - Assessment of the remote shutdown capability.

• Assessment of HEDs
  - Nature of the verification of HEDs as real HEDs.
  - Methodology for weighting the factors used in assigning priorities for the correction of HEDs.
  - Participation of all necessary personnel in the assessment of HEDs.
  - Replacements for HED Assessment and Evaluation Team members.
  - Provision for assessment of all HEDs with a low priority for cumulative or interactive effects.
  - Sequence of HED assessment and selection of design improvements.

• Selection of Design Improvements
  - Methodology for developing design improvements.
  - Criteria for determining the acceptability of a design improvement.
  - Provision for presenting design improvements as a package to the Site Change Committee (SCC).
  - Provision for the consideration of human factors issues in the SCC's final approval/judgment on design improvements.
  - Provisions for ensuring that Priority 1, 2, and 3 HEDs are corrected immediately.


• Verification That Improvements Will Provide the Necessary Corrections and That Control Room Modifications Do Not Introduce New HEDs
  - Methodology.

• Coordination of the DCRDR With Other Improvement Programs
  - Plans for integrating DCRDR inputs with the following programs:
    1. EOP upgrade
    2. SPDS
    3. Reg. Guide 1.97 instrumentation
    4. ERFs
    5. Training
  - Use of the SPDS for resolving HEDs.

3. Interview review team members or review documentation to obtain any clarifications or additional information.

4. Observe any ongoing DCRDR activities if necessary.

5. If some of the DCRDR tasks are based on a photographic mock-up, randomly audit these for validity, if possible.

6. Conduct an audit team meeting to evaluate and integrate material audited and to prepare a presentation for the exit meeting.

7. Conduct an exit meeting with the licensee to dispose of action items and provide constructive feedback.


ENCLOSURE 2

SUGGESTED AGENDA
San Onofre, Unit 1
Detailed Control Room Design Review
In-Progress Audit

Day 2

2:00 PM - Hold a kick-off meeting with the licensee to present and discuss identified strengths and weaknesses in the Program Plan. The licensee should present any clarifications and additional information/documentation at this time. The development of action items and the identification of any modifications to the remaining steps in the in-progress audit will occur at the end of the meeting.

The licensee should discuss the following items and provide supporting documentation.

1. Qualifications and Structure of the DCRDR Team
   a. Provide the two requested resumes.
   b. Discuss levels of involvement per activity for each member of the Review Group.

2. Function and Task Analysis
   a. Provisions for ensuring that systems and subsystems will include all safety and safety-related systems from the WOG ERGs except plant-specific deviations.
   b. Description of scenarios.
   c. Nature of the EOIs intended as the basis for the task analysis.
   d. Definition of the term "relevant characteristics."


   e. Nature and purpose of the review of the Task Analysis Worksheets using SONGS-1 experience.
   f. Personnel conducting the various steps in the task analysis.


Day 3

AM

1. Comparison of Display and Control Requirements With a Control Room Inventory
   a. Provisions for ensuring that the process used to conduct a verification of task performance capabilities will identify missing and/or unsuitable controls and displays.
   b. Definition of the term "related information."
   c. Participation of a human factors specialist.

2. Control Room Survey
   a. Provision of samples of the checklists.
   b. Clarification of the process for evaluating dynamic criteria.
   c. Assessment of the remote shutdown capability.
   d. The audit team will conduct a mini-survey of the control room and audit the licensee's survey results. HED documentation should be provided.

Break for Lunch

PM

3. Assessment of HEDs
   a. Nature of the verification of HEDs as real HEDs.


   b. Methodology for weighting the factors used in assigning priorities for the correction of HEDs.
   c. Participation of all necessary personnel in the assessment of HEDs.
   d. Replacements for HED Assessment and Evaluation Team members.
   e. Provision for assessment of all HEDs with a low priority for cumulative or interactive effects.
   f. Sequence of HED assessment and selection of design improvements.
   g. The audit team will audit any results from the assessment process.

Day 4

AM

1. Selection of Design Improvements
   a. Methodology for developing design improvements.
   b. Criteria for determining the acceptability of a design improvement.
   c. Provision for presenting design improvements as a package to the Site Change Committee (SCC).
   d. Provision for the consideration of human factors issues in the SCC's final approval/judgment on design improvements.
   e. Provisions for ensuring that Priority 1, 2, and 3 HEDs are corrected immediately.


2. Verification That Improvements Will Provide the Necessary Corrections and That Control Room Modifications Do Not Introduce New HEDs
   a. The licensee should present the methodology they intend to use to complete this requirement.

Break for Lunch

PM

3. Coordination of the DCRDR With Other Improvement Programs
   a. The licensee should present plans for integrating DCRDR inputs with the following programs:
      1. EOP upgrade
      2. SPDS
      3. Reg. Guide 1.97 instrumentation
      4. ERFs
      5. Training
   b. Discuss the licensee's intentions for using the SPDS to resolve HEDs.

4. Conduct an audit team meeting to evaluate and integrate material audited and to prepare a presentation for the exit meeting.

5. Conduct an exit meeting with the licensee to dispose of action items and provide constructive feedback.


SUGGESTED AGENDA
San Onofre, Units 2 and 3
Detailed Control Room Design Review
Pre-Implementation Audit

Day 1

AM

Administrative processing for on-site access.

Planning and preparation - set up office space, informal briefing and introductions, short visit to the control room.

1. Function and Task Analysis
   The licensee should discuss the process used to generate a database of characteristics of instruments and controls that were identified during the task analysis. Discuss all characteristics collected and how they were defined to help identify missing and/or unsuitable controls and displays. Of particular concern are the acceptability of the large instrument inaccuracy described in the SFTA and the lack of information regarding the identification of interval and division characteristics of control room instruments.

Break for Lunch

PM

2. Control Room Inventory
   a. Discuss the level of detail collected in the inventory to describe what information and control capabilities exist in the control room.
   b. Discuss the methodology used to compare the information and control requirements and their characteristics to the existing equipment on the remote shutdown panel.


3. Control Room Survey
   a. Discuss the completeness of the computer portion of the survey and provide descriptions of the computer systems included in the survey.
   b. Describe the corrective actions and implementation schedule to resolve HEDs from the annunciator survey and/or Task K-409. Also, discuss why a new annunciator study is needed.

4. Assessment of HEDs
   a. Provide the method for assessing HEDs.
   b. Discuss the weights assigned to safety, operator error, and technical specification violation.
   c. Discuss the prioritization of HEDs and the schedule for corrections. Also discuss the prioritization of HEDs that will not be corrected until after the second refueling outage.

Day 2

AM

5. The audit team breaks up to discuss questions outlined in the Appendices to the TER. The audit team should conduct this audit activity at the simulator in order to view HEDs and design solutions. The licensee should ensure that the appropriate team members are available to discuss the HEDs.

NOON

6. Audit team caucus; the audit team leader will provide an exit briefing.
