ML19332E951

| Person / Time | |
|---|---|
| Issue date: | 10/19/1988 |
| From: | Beltracchi L., Coffman F. (NRC) |
| References: | NUDOCS 8912130185 |
| Download: | ML19332E951 (9) |
Text
Conference Title: Halden Symposium
Date & Place: October 19, 1988, Tokyo, Japan

THE STATUS OF HUMAN-MACHINE RESEARCH WITHIN THE U.S. NUCLEAR REGULATORY COMMISSION

L. Beltracchi and F. Coffman, Jr.
U.S. Nuclear Regulatory Commission, Washington, DC 20555

INTRODUCTION

The efforts of the Nuclear Regulatory Commission and the U.S. industry to understand the causes of human error, and to reduce it in the operation of nuclear power plants, fall within the scope of human factors.
Human factors is a system-oriented, technical discipline that integrates knowledge from other disciplines in the design, operation, evaluation, and regulation of all aspects of human-machine systems and their interfaces. Implicit in the discipline's system orientation is a broad definition of a system. This definition includes the hardware, software, and people who execute tasks to satisfy goals, such as safe operation of the plant. The definition of system used here also includes other functional elements, such as procedures, personnel selection, and training programs.
The Office of Nuclear Regulatory Research within the Nuclear Regulatory Commission has recently developed a Human Factors Research Program Plan. This plan identifies human factors research considered important to the safe operation of nuclear plants and is responsive to Commission guidance, formal user requests from each office in the NRC, and certain recommendations from National Academy of Sciences reports. The first National Academy of Sciences report, "Revitalizing Nuclear Safety Research," dated December 1986, identified the need to intensify research on human factors in the regulation of the commercial nuclear industry. The second National Academy of Sciences report, "Human Factors Research and Nuclear Safety," dated February 29, 1988, made two types of recommendations. These are:
- 1. "...to f acilitate the initiation, planning, management, conduct, and l
use of human factors research," and
- 2. "... specific research topics to be investigated by the NRC end the rest of the nuclear community."
The Human Factors Research Program Plan also responds to these recommendations. This technical paper discusses the research program plan as identified in SECY-88-141, Human Factors Initiatives and Plans, dated May 23, 1988.
The viewpoints and opinions expressed herein are those of the authors and do not necessarily reflect the criteria, requirements, and guidelines of the U.S.
Nuclear Regulatory Commission.
DISCUSSION

The Human Factors Research Program Plan identifies five research areas. These are:
- 1. Human Performance and Human Reliability Assessment,
- 2. Human-Machine Interface,
- 3. Procedures,
- 4. Qualifications and Training, and
- 5. Organization and Management.
A discussion of each of these areas follows.
1. Human Performance and Human Reliability Assessment

The goal of research on human performance is to identify information about human capabilities and the impact of various performance-shaping factors on those capabilities. This research, through appropriate comparisons with human performance requirements, will identify human factors concerns. The object of one ongoing project is the acquisition and management of human performance data.
A major problem for reliability and risk assessment practitioners is a lack of credible data to support these assessments.
This is especially true in the area of human performance. Therefore, our research focuses on potential data sources inside and outside the commercial nuclear community. One source of data within the nuclear industry we are pursuing is the collection of human performance data during experiments at the Halden Reactor Project.
EGG-REQ-7704, "Specification for the Entry of Simulator-Based Human Error Probability Data in NUCLARR," May 1987, is a report that identifies a method for the consistent collection and tabulation of simulator-based human performance data. The collected data are then acceptable input to the NUCLARR (Nuclear Computerized Library for Assessing Reactor Reliability) system.
This specification presents a data collection method that will satisfy the minimum data requirements and standards established by the NUCLARR system.
It is intended for use by personnel engaged in training evaluation activities and other related experimental programs that involve full-scope simulator-based studies. The results obtained from these efforts will assist human reliability analysts by providing them usable raw performance data for conducting future reactor risk studies.
For the analyst, NUCLARR is an important source of frequency and failure rate information on both humans and hardware.
NUCLARR is maintained as a continuously updated data base.
These updates will be available to all NUCLARR users in the form of new software diskettes and change pages that are added to existing manuals.
The bulk of human error probability (HEP) data currently in NUCLARR reflect a variety of human reliability analysis techniques.
Most of these techniques relied on expert estimation, e.g., consensus expert judgment, and represent data gathered from NRC-sponsored efforts. Ongoing efforts are entering data collected from a study of maintenance personnel performance in simulation exercises, since there is a great need for additional maintenance-related HEP data.
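To make the tabulation concrete, the sketch below shows one way simulator-based error observations might be organized before entry into a NUCLARR-style data base; the record fields and the numbers are illustrative assumptions and do not represent the actual NUCLARR entry format.

```python
from dataclasses import dataclass


@dataclass
class SimulatorHEPRecord:
    """Illustrative simulator-based human error probability (HEP) observation."""
    task: str                # task observed during the full-scope simulator exercise
    error_count: int         # number of crews that committed the error
    opportunity_count: int   # number of crews that had the opportunity

    def point_hep(self) -> float:
        """Point estimate of the HEP: errors divided by opportunities."""
        return self.error_count / self.opportunity_count


# Hypothetical observation: 3 of 40 crews failed to verify an automatic actuation.
record = SimulatorHEPRecord("Verify automatic safety system actuation", 3, 40)
print(f"HEP point estimate: {record.point_hep():.3f}")  # 0.075
```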
Another human performance and human reliability assessment research project is to develop methods for deriving human error probability, performance shaping (causal) factors, and uncertainty bound estimates. This research produced an initial set of methods for deriving the desired estimates using raw performance data from field reports, training simulator studies, computer modeling, and consensus expert judgment.
NUREG/CR-3519, "Human Error Probability Estimation Using Licensee Event Reports," February 1985, describes a method for estimating human error probabilities. Also, the NRC is now seeking feedback on some of these methods from case studies being conducted inside the NRC and in cooperation with domestic and international agencies outside the NRC.
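As a worked illustration of how uncertainty bounds are commonly attached to a derived HEP, the sketch below uses the lognormal error-factor convention familiar from human reliability analysis; the median value and error factor are assumed for the example and are not taken from the methods cited above.

```python
def lognormal_bounds(median_hep: float, error_factor: float) -> tuple[float, float]:
    """Approximate 5th and 95th percentile bounds under the lognormal
    error-factor convention: lower = median / EF, upper = median * EF."""
    return median_hep / error_factor, median_hep * error_factor


# Hypothetical estimate: median HEP of 3E-3 with an error factor of 5.
lower, upper = lognormal_bounds(3e-3, 5.0)
print(f"5th percentile: {lower:.1e}, 95th percentile: {upper:.1e}")  # 6.0e-04, 1.5e-02
```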
Human reliability assessment may be seen as a specialized area of human factors research. The NRC's human reliability assessment effort has two objectives. The first is to develop a data base of human error rates and to support methods for integrating human reliability assessments and segments of reliability evaluations into probabilistic risk assessments. The second objective is to develop tools that allow use of data generated by human reliability and probabilistic risk assessments to address unresolved and generic safety issues.
Human reliability assessment can also support a comparison of alternative approaches to resolving safety-significant human performance concerns.
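A minimal sketch of the kind of integration the first objective points toward: folding a human error probability into an accident-sequence cut set alongside a hardware failure probability. The events and probability values are hypothetical.

```python
# Hypothetical two-event cut set: the operator fails to start backup cooling
# manually AND the automatic start circuit fails; both must occur.
hep_manual_start = 2e-2       # human error probability (assumed value)
p_auto_start_fails = 5e-3     # hardware failure probability (assumed value)

cut_set_probability = hep_manual_start * p_auto_start_fails
print(f"Cut set probability: {cut_set_probability:.1e}")  # 1.0e-04
```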
Human performance and human reliability assessment is a significant effort within the NRC's human factors research program.
During fiscal year 1988, this effort amounted to 30 percent of the human factors research budget.
2. Human-Machine Interface

The human-machine interface is the link between humans and system hardware and software. Displays and controls are the most obvious elements of the human-machine interface. A safety concern may result when elements of the human-machine interface do not adequately support the required level of human performance. For example, human error rates for safety-related tasks may be unacceptably high when vital information is difficult to find, difficult to interpret, or misleading.
Research in the human-machine area will address a broad range of issues related to whether an interface adequately supports the required level of human performance.
For another example, locating and displaying sensor data from a data base may require an operator to enter several commands. An operator keystroking commands into a keyboard may make several errors unless the operator is a proficient typist.
Human error rates are also high when a specific control switch is difficult to locate in an array of similar devices.
The Office of Nuclear Reactor Regulation has identified its user needs to the Office of Nuclear Regulatory Research. Continued coordination between the offices will further define the user needs as the research progresses. One of the research areas identified was a need for guidelines for expert systems verification and validation methodology.
The formal user need stated that artificial intelligence and expert systems are now being introduced into nuclear power plants in applications unrelated to the functions of safety systems.
The methods that exist today are not yet adequate to verify and validate expert systems. The goals of the desired methodology are to ensure that implemented systems work as desired, are reliable, and result in operator confidence in their use.
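One element that such a methodology could include is case-based validation: exercising a rule base against scenarios whose correct outcomes have been agreed on by experts and checking the results. The rule, scenarios, and expected outputs below are hypothetical and serve only as a sketch, not as the methodology the research is expected to produce.

```python
# Hypothetical advisory rule from an operator-aid expert system.
def advisory(plant_state: dict) -> str:
    if plant_state["sg_level_pct"] < 10 and plant_state["feedwater_flow_kg_s"] == 0:
        return "START AUXILIARY FEEDWATER"
    return "NO ACTION"


# Validation cases paired with expert-agreed expected outputs (assumed values).
validation_cases = [
    ({"sg_level_pct": 5, "feedwater_flow_kg_s": 0}, "START AUXILIARY FEEDWATER"),
    ({"sg_level_pct": 55, "feedwater_flow_kg_s": 400}, "NO ACTION"),
]

failures = [(state, expected, advisory(state))
            for state, expected in validation_cases
            if advisory(state) != expected]
print("All validation cases passed" if not failures else f"Failures: {failures}")
```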
The National Academy of Sciences (NAS) report ("Human Factors Research and Nuclear Safety," 1988) contains many research recommendations. In discussing the current development of computer-based support systems within the nuclear industry, the report states:
"These developments mean that there is an immediate need for effective tools to evaluate and measure the impact of new aids and automation on the human-technical system. Developing these measuring techniques is the highest innediate priority in this area."
Developing measuring techniques to evaluate computer-based aids is a complex research task. We are aware of research activities by other organizations in this area.
For example, the Halden Reactor Project is working on this issue.
The NRC is a participating member of the Halden Reactor Project. Also, Dr.
Donald Norman's work on direct manipulation interfaces deals with many of the issues associated with human-machine interfaces. Furthermore, Dr. Jens Rasmussen's work on skill, rule, and knowledge-based behavior also deals with many of these issues.
From our review of the human-machine interface issues and existing knowledge, we found that no tools are available to evaluate and measure the adequacy of interfaces. However, in our judgment, the existing knowledge in this field, in conjunction with additional research, should support the development of interface tools and measures. As a first step toward this goal, we plan to conduct a workshop of experts on human-machine interfaces. The purpose of the workshop is to discuss existing human-machine interface models and to propose measures and experiments on interfaces.
The purpose of these experiments is to evaluate the guidelines as tools and measures for interface assessment by the NRC. Within the scope of this program, we also plan to address the issue of acceptance criteria for expert systems. Our current plans are to conduct this workshop in early 1989.
We anticipate the need for several experiments to evaluate the measures proposed by the workshop.
We expect to conduct some of these experiments at the Halden Project and others at facilities within the United States. We expect to use some of the key members of the workshop to evaluate the results of these experiments and to propose new guidelines based on the knowledge gained from the completed research.
Research in the human-machine interface is a significant effort within the NRC's human factors research program.
During fiscal year 1988, this effort amounted to 20 percent of the human factors research budget.
3. Procedures

Procedures are another means to ensure that human task performance satisfies system requirements.
Procedures provide an appropriate resolution to some human factors concerns. Procedures are used to specify tasks to satisfy the systems requirements, to delineate the sequence of tasks, to define interfaces with other tasks, to provide warnings and constraints, and to provide status feedback on the progress of evolutions under way.
However, inadequate or poorly presented procedures can increase human error rates. Research in this area will serve to identify the need to implement or improve procedures designed to resolve human factors concerns.
Research will also serve to confirm the effectiveness of proposed procedures or improvements to procedures.
4. Qualifications and Training

A person's innate abilities and acquired knowledge and skills that are brought to a task impact human performance. One approach to resolve human factors concerns is to select personnel based on the qualifications needed to perform tasks that are important to systems performance. Another approach is to enhance knowledge and skills through training to improve task performance.
The rationale for both these approaches is that systems performance requirements are more likely to be met by a better match with human performance capabilities. The approaches are not mutually exclusive and, in fact, are often used together.
Research in this area will serve to identify the need to implement or improve specific qualifications and training of personnel who are parts of systems that use or produce nuclear materials.
Research will also serve to confirm the effectiveness of proposals to implement or modify qualifications and training requirements.
5. Organization and Management

Organizational design, management practices, and policy decisions impact human performance to various degrees.
Changes in organization or management can affect either organizational performance requirements or human performance capabilities.
For example, changes in staff size or composition can affect individual workloads (human performance requirements).
Changes in shift length can affect such factors as vigilance (human performance capabilities).
Research in this area will address the relationship between organization, management, and human performance.
This research will also serve to confirm the effectiveness of proposed changes to organization and management for resolving human factors concerns.
Research on the influence of organization and management on human error rates will also provide tools to monitor plant safety performance.
This research may support the further development of programmatic performance indicators.
Performance indicators are quantitative measures used to monitor and assess individual plant performance.
There are currently seven performance indicators in use at the U.S. Nuclear Regulatory Commission.
These performance indicators are:
- 1. automatic scrams while critical,
- 2. safety system actuations,
- 3. significant events,
- 4. safety system failures,
- 5. forced outage rate,
- 6. equipment forced outage rate, and
- 7. collective radiation exposure per 1000 critical hours.
Licensee event reports and related licensee submittals serve as the sources of data for evaluating these performance indicators: licensees submit event reports in accordance with 10 CFR 50.73, immediate notifications to the NRC Operations Center in accordance with 10 CFR 50.72, monthly operating reports in accordance with plant technical specifications, and an annual radiation exposure report in accordance with 10 CFR 20.407.
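To illustrate how two of the indicators might be computed from such report data, the sketch below uses commonly cited forms of the forced outage rate and of equipment forced outages per 1,000 critical hours; the exact definitions and the quarterly numbers are assumptions for the example, not the NRC's official calculation.

```python
def forced_outage_rate(forced_outage_hours: float, critical_hours: float) -> float:
    """Forced outage rate, here taken as forced outage hours divided by the sum of
    critical hours and forced outage hours (an assumed, commonly used form)."""
    return forced_outage_hours / (critical_hours + forced_outage_hours)


def equipment_outages_per_1000_crit_hrs(outage_count: int, critical_hours: float) -> float:
    """Equipment-caused forced outages normalized to 1,000 critical hours."""
    return 1000.0 * outage_count / critical_hours


# Invented quarterly data for a single plant.
print(f"Forced outage rate: {forced_outage_rate(120.0, 2000.0):.1%}")
print(f"Equipment forced outages per 1000 crit. hrs: "
      f"{equipment_outages_per_1000_crit_hrs(2, 2000.0):.2f}")
```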
Figures 1 and 2 illustrate plant data for six of the above performance indicators.
The performance indicator program began in February 1987. Numerical values of these performance indicators are published in quarterly reports. These reports are available from the NRC Public Document Room. Senior management of the Nuclear Regulatory Commission reviews these reports. The object of the review is to identify changes in the safety performance of operating plants and to identify those plants that may warrant increased NRC attention.
The potential use of other performance indicators to measure the safety performance of plants is now under development.
In one research project, an assessment of risk-based indicators is under way. This effort focuses upon "outputs" and some aspects of "throughputs" in plant performance. This effort identified two additional indicators as measures of safety.
One performance indicator measures safety system unavailability; the second measures safety system reliability.
These indicators were tested against simulations using PRA methods.
Results of this work are stimulating the exploration of possible ways to collect data needed to evaluate the indicators. Details on this research are provided in a document, "System Unavailability Indicators," BNL draft report, dated September 1987.
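The two risk-based indicators can be pictured with the simple forms sketched below, a time-based unavailability and a demand-based reliability; these formulas and the numbers used are illustrative assumptions rather than the definitions tested in the cited draft report.

```python
def unavailability(hours_unavailable: float, hours_required: float) -> float:
    """Fraction of the period during which the safety system was unavailable."""
    return hours_unavailable / hours_required


def demand_reliability(successful_starts: int, total_demands: int) -> float:
    """Fraction of actual and test demands on which the safety system performed."""
    return successful_starts / total_demands


# Invented observation period for one safety system.
print(f"Unavailability: {unavailability(36.0, 7200.0):.2%}")    # 0.50%
print(f"Demand reliability: {demand_reliability(47, 48):.3f}")  # 0.979
```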
In another research project, programmatic performance indicators are under development. These performance indicators focus on the internal parameters of plant performance, such as "inputs" and "throughputs." This work identified a set of measures based on monitoring "causes of reportable events" (such as operator errors, personnel errors, maintenance problems, and others). The programmatic areas considered were management/administration, operation, training, maintenance/surveillance, material control, quality programs, health physics, security/safeguards, configuration management, and fire protection.
Potential measures of performance were identified for each of these areas.
Results from the analysis of these performance indicators were mixed.
Several of the indicators provide useful, interim mechanisms for trending plant performance, but several others do not appear to be useful.
Details on this research are in a document, "Development of Programmatic Performance Indicators," PNL draft report, dated March 1988.
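A minimal sketch of the cause-tallying idea behind these programmatic indicators: counting reportable events by cause category and rolling the counts up to programmatic areas. The cause categories, the mapping, and the event list are assumptions for illustration, not the measure set developed in the cited report.

```python
from collections import Counter

# Hypothetical cause codes extracted from one quarter's reportable events.
event_causes = ["operator error", "maintenance problem", "operator error",
                "procedure deficiency", "maintenance problem", "maintenance problem"]

# Assumed mapping from cause category to programmatic area.
area_of_cause = {
    "operator error": "operation",
    "procedure deficiency": "operation",
    "maintenance problem": "maintenance/surveillance",
}

events_per_area = Counter(area_of_cause[cause] for cause in event_causes)
for area, count in events_per_area.items():
    print(f"{area}: {count} reportable events")
```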
The development of performance indicators is a significant effort within the NRC's human factors research program. During fiscal year 1988, this effort amounted to 20 percent of the human factors research budget.
CONCLUSION

Human factors research has many facets and is a broadly based discipline. The discussion in this paper has emphasized that portion of the discipline addressing human performance, human reliability assessments, and the human-machine interface.
Even these elements are not totally separable, but interrelate closely with all the human factors.
The Human Factors Regulatory Research Program is addressing a broad range of research needs, including all research needs formally requested by the NRC regulatory users. Additionally, the program addresses broader human factors research topics, some of which were identified in the second NAS report ("Human Factors Research and Nuclear Safety," February 29, 1988).
The program will provide technical bases for supporting nuclear regulatory decisions related to human performance. Also, we expect that the research performed under this program will deepen our overall understanding of the causes of human error for the purpose of reducing its adverse incidence in commercial nuclear operations.
The Human Factors Research Program Plan is a living plan in the sense that we anticipate periodic revisions to the plan. These revisions will reflect modified or new user needs as well as the resolution of any current needs resulting from research that has been completed.
[Figure 1: Performance indicator data for Plant X, critical hours 85-4 to 87-3. Panels show (1) automatic scrams while critical, (2) safety system actuations, (3) significant events, (4) safety system failures, (5) forced outage rate, and (6) equipment forced outages per 1000 critical hours.]
[Figure 2: Performance indicator trends for Plant X. Deviations of the six indicators (2-quarter and 4-quarter averages ending 87-3) from the previous four-quarter plant means and from older plant means, measured in standard deviations.]