ML20141K179

| Field | Value |
|---|---|
| Site | 05200003 |
| Issue date | 05/07/1997 |
| From | Roth, E., Westinghouse Electric Company (Div. of CBS Corp.) |
| Shared package | ML20141K158 |
| References | OCS-GEH-031, OCS-GEH-31, WCAP-14701, WCAP-14701-R01, WCAP-14701-R1, NUDOCS 9705280356 |
| Download | ML20141K179 (110) |
Text
Westinghouse Non-Proprietary Class 3

WCAP-14701, Revision 1

Methodology and Results of Defining Evaluation Issues for the AP600 Human System Interface Design Test Program

Westinghouse Energy Systems
AP600 DOCUMENT COVER SHEET

Form 58202G (5/94) [o:\3411w-2.wpf]    TDC:    IDS:
AP600 CENTRAL FILE USE ONLY: 0058.FRM    RFS#:    RFS ITEM #:

AP600 DOCUMENT NO.: OCS-GEH-031    REVISION NO.: 1    Page 1 of
ASSIGNED TO: Robin Nydes
ALTERNATE DOCUMENT NUMBER: WCAP-14701-R1
WORK BREAKDOWN #: 3.3.2.4.15    PROJECT: AP600
DESIGN AGENT ORGANIZATION:
TITLE: Methodology and Results of Defining Evaluation Issues for the AP600 Human System Interface Design Test Program
ATTACHMENTS:    DCP #/REV. INCORPORATED IN THIS DOCUMENT REVISION:
CALCULATION/ANALYSIS REFERENCE:
ELECTRONIC FILENAME / FILE FORMAT / FILE DESCRIPTION: 3411w.wpf, 3411-1.wpf, 3411-2.wpf (WordPerfect 5.2)

(C) WESTINGHOUSE ELECTRIC CORPORATION 1997

[ ] WESTINGHOUSE PROPRIETARY CLASS 2: This document contains information proprietary to Westinghouse Electric Corporation; it is submitted in confidence and is to be used solely for the purpose for which it is furnished and returned upon request. This document and such information is not to be reproduced, transmitted, disclosed or used otherwise in whole or in part without prior written authorization of Westinghouse Electric Corporation, Energy Systems Business Unit, subject to the legends contained hereof.

[ ] WESTINGHOUSE PROPRIETARY CLASS 2C: This document is the property of and contains Proprietary Information owned by Westinghouse Electric Corporation and/or its subcontractors and suppliers. It is transmitted to you in confidence and trust, and you agree to treat this document in strict accordance with the terms and conditions of the agreement under which it was provided to you.

[X] WESTINGHOUSE CLASS 3 (NON-PROPRIETARY)

COMPLETE 1 IF WORK PERFORMED UNDER DESIGN CERTIFICATION OR COMPLETE 2 IF WORK PERFORMED UNDER FOAKE.

1  DOE DESIGN CERTIFICATION PROGRAM - GOVERNMENT LIMITED RIGHTS STATEMENT (See page 2)
   Copyright statement: A license is reserved to the U.S. Government under contract DE-AC03-90SF18495.
   DOE CONTRACT DELIVERABLES (DELIVERED DATA): Subject to specified exceptions, disclosure of this data is restricted until September 30, [...], or design certification under contract DE-AC03-90SF18495, whichever is later.
   EPRI CONFIDENTIAL: NOTICE: 1 2 3 4 5    CATEGORY: A B C D E F

2  ARC FOAKE PROGRAM - ARC LIMITED RIGHTS STATEMENT (See page 2)
   Copyright statement: A license is reserved to the U.S. Government under contract DE-FC02-NE34267 and subcontract ARC-93-3-SC-001.
   ARC CONTRACT DELIVERABLES (CONTRACT DATA): Subject to specified exceptions, disclosure of this data is restricted under ARC Subcontract ARC-93-3-SC-001.

ORIGINATOR: Emilie Roth    SIGNATURE/DATE: [signed]
AP600 RESPONSIBLE MANAGER: Robert Vijuk    SIGNATURE/APPROVAL DATE: [signed] 5/7/97
Approval of the responsible manager signifies that the document is complete, all required reviews are complete, the electronic file is attached, and the document is released for use.
AP600 DOCUMENT COVER SHEET (Page 2)    Form 58202G (5/94)
LIMITED RIGHTS STATEMENTS

DOE GOVERNMENT LIMITED RIGHTS STATEMENT

(A) These data are submitted with limited rights under government contract No. DE-AC03-90SF18495. These data may be reproduced and used by the government with the express limitation that they will not, without written permission of the contractor, be used for purposes of manufacture nor disclosed outside the government, except that the government may disclose these data outside the government for the following purposes, if any, provided that the government makes such disclosure subject to prohibition against further use and disclosure:

(I) This "Proprietary Data" may be disclosed for evaluation purposes under the restrictions above.

(II) The "Proprietary Data" may be disclosed to the Electric Power Research Institute (EPRI), electric utility representatives and their direct consultants, excluding direct commercial competitors, and the DOE National Laboratories under the prohibitions and restrictions above.

(B) This notice shall be marked on any reproduction of these data, in whole or in part.

ARC LIMITED RIGHTS STATEMENT:

This proprietary data, furnished under Subcontract Number ARC-93-3-SC-001 with ARC, may be duplicated and used by the government and ARC, subject to the limitations of Article H-17.F. of that subcontract, with the express limitations that the proprietary data may not be disclosed outside the government or ARC, or ARC's Class 1 & 3 members or EPRI, or be used for purposes of manufacture without prior permission of the Subcontractor, except that further disclosure or use may be made solely for the following purposes:

This proprietary data may be disclosed to other than commercial competitors of Subcontractor for evaluation purposes of this subcontract under the restriction that the proprietary data be retained in confidence and not be further disclosed, and subject to the terms of a non-disclosure agreement between the Subcontractor and that organization, excluding DOE and its contractors.

DEFINITIONS

CONTRACT/DELIVERED DATA - Consists of documents (e.g., specifications, drawings, reports) which are generated under the DOE or ARC contracts and which contain no background proprietary data.

EPRI CONFIDENTIALITY/OBLIGATION NOTICES

NOTICE 1: The data in this document is subject to no confidentiality obligations.

NOTICE 2: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is forwarded to recipient under an obligation of Confidence and Trust for limited purposes only. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited except as agreed to in advance by the Electric Power Research Institute (EPRI) and Westinghouse Electric Corporation. Recipient of this data has a duty to inquire of EPRI and/or Westinghouse as to the uses of the information contained herein that are permitted.

NOTICE 3: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is forwarded to recipient under an obligation of Confidence and Trust for use only in evaluation tasks specifically authorized by the Electric Power Research Institute (EPRI). Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited except as agreed to in advance by EPRI and Westinghouse Electric Corporation. Recipient of this data has a duty to inquire of EPRI and/or Westinghouse as to the uses of the information contained herein that are permitted. This document and any copies or excerpts thereof that may have been generated are to be returned to Westinghouse, directly or through EPRI, when requested to do so.

NOTICE 4: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is being revealed in confidence and trust only to Employees of EPRI and to certain contractors of EPRI for limited evaluation tasks authorized by EPRI. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited. This document and any copies or excerpts thereof that may have been generated are to be returned to Westinghouse, directly or through EPRI, when requested to do so.

NOTICE 5: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. Access to this data is given in Confidence and Trust only at Westinghouse facilities for limited evaluation tasks assigned by EPRI. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited. Neither this document nor any excerpts therefrom are to be removed from Westinghouse facilities.

EPRI CONFIDENTIALITY/OBLIGATION CATEGORIES

CATEGORY "A" - (See Delivered Data) Consists of CONTRACTOR Foreground Data that is contained in an issued report.

CATEGORY "B" - (See Delivered Data) Consists of CONTRACTOR Foreground Data that is not contained in an issued report, except for computer programs.

CATEGORY "C" - Consists of CONTRACTOR Background Data except for computer programs.

CATEGORY "D" - Consists of computer programs developed in the course of performing the Work.

CATEGORY "E" - Consists of computer programs developed prior to the Effective Date or after the Effective Date but outside the scope of the Work.

CATEGORY "F" - Consists of administrative plans and administrative reports.
WESTINGHOUSE NON-PROPRIETARY CLASS 3

WCAP-14701
Revision 1

Methodology and Results of Defining Evaluation Issues for the AP600 Human System Interface Design Test Program

Emilie Roth
Science & Technology Center

May 1997

Westinghouse Electric Corporation
P.O. Box 355
Pittsburgh, PA 15230-0355

(C) 1997 Westinghouse Electric Corporation
All Rights Reserved
TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
LIST OF ACRONYMS AND ABBREVIATIONS

1  INTRODUCTION
2  GOALS OF HSI TEST PROGRAM
3  EVALUATION SCOPE
4  FRAMEWORK FOR DEVELOPING THE HSI DESIGN TEST PLAN
   4.1  INTEGRATION OF THE HSI DESIGN TEST PROGRAM IN THE HSI DESIGN PROCESS
   4.2  MODEL OF TEST BED FIDELITY
   4.3  TESTING DIFFERENT LEVELS OF STAFF INTERACTION
5  PHASE 1: ISSUE DEFINITION
   5.1  HUMAN PERFORMANCE MODEL
        5.1.1  Detection and Monitoring/Situation Awareness
        5.1.2  Interpretation and Planning
        5.1.3  Control
        5.1.4  Feedback
   5.2  MAJOR CLASSES OF OPERATOR ACTIVITIES
        5.2.1  Detection and Monitoring/Situation Awareness
        5.2.2  Interpretation and Planning
        5.2.3  Control Plant State
   5.3  MAPPING OF HSI RESOURCES TO OPERATOR ACTIVITIES (MODEL OF SUPPORT)
        5.3.1  Detection and Monitoring/Situation Awareness
        5.3.2  Interpretation and Planning
        5.3.3  Controlling Plant State
   5.4  HUMAN PERFORMANCE EVALUATION ISSUES
6  PHASE 2: TEST DEVELOPMENT
   6.1  TESTABLE HYPOTHESES AND PERFORMANCE REQUIREMENTS
   6.2  EVALUATION APPROACH
   6.3  EVALUATION REQUIREMENTS
   6.4  EVALUATION DESCRIPTIONS
   6.5  DATA ANALYSIS AND FEEDBACK TO THE DESIGN PROCESS
7  EVALUATION ISSUES AND DESCRIPTIONS
   7.1  EVALUATIONS FOR DETECTION AND MONITORING
        7.1.1  Evaluation Issue 1: Passive Monitoring of WPIS and Workstation Displays
        7.1.2  Evaluation Issue 2: Directed Search for Information Within the Workstation Displays Based on WPIS Displays
        7.1.3  Evaluation Issue 3: Directed Search for Information Within the Workstation Displays Based on a Request
        7.1.4  Evaluation Issue 4: Maintaining Crew Awareness of Plant Condition
   7.2  EVALUATIONS FOR INTERPRETATION AND PLANNING
        7.2.1  Evaluation Issue 5: Detecting and Understanding Disturbances Using Alarms
        7.2.2  Evaluation Issue 6: Interpretation and Planning Using Workstation Displays
        7.2.3  Evaluation Issue 7: Interpretation and Planning During Single-Fault Events Using Alarms, Workstation, WPIS, and Procedures
        7.2.4  Evaluation Issue 8: Interpretation and Planning During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures
        7.2.5  Evaluation Issue 9: Interpretation and Planning by Crew During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures
        7.2.6  Evaluation Issue 10: Interpretation and Planning by Crew During Severe Accidents Using the Technical Support Center, Alarms, Workstation, WPIS, and Procedures
   7.3  EVALUATIONS FOR CONTROLLING PLANT STATE
        7.3.1  Evaluation Issue 11: Simple Operator-Paced Control Tasks
        7.3.2  Evaluation Issue 12: Conditional Operator-Paced Control Tasks
        7.3.3  Evaluation Issue 13: Control Using Multiple, Simultaneous Procedures
        7.3.4  Evaluation Issue 14: Event-Paced Control Tasks
        7.3.5  Evaluation Issue 15: Control Tasks Requiring Crew Coordination
   7.4  EVALUATIONS FOR CONFORMANCE TO HUMAN FACTORS ENGINEERING DESIGN GUIDELINES
        7.4.1  Evaluation Issue 16: Conformance to HFE Guidelines
   7.5  EVALUATIONS FOR VALIDATION OF INTEGRATED HSI
        7.5.1  Evaluation Issue 17: Validation of Integrated HSI
8  REFERENCES
LIST OF TABLES

Table 1  Major Evaluation Issues
LIST OF FIGURES

Figure 1  AP600 Concept Testing and Verification and Validation Activities
Figure 2  Methodology for Developing Verification and Validation Plan
Figure 3  Integration of the Verification and Validation Test Program in the HSI Design Process
Figure 4  Testbed Fidelity Dimensions and Evaluation Issues
Figure 5  Mapping of HSI Resources to Operator Decision-Making Model
Figure 6  Data Collection and Analysis Process
LIST OF ACRONYMS AND ABBREVIATIONS

CPS    Computerized Procedure System
CRT    Cathode Ray Tube
EOP    Emergency Operating Procedure
HFE    Human Factors Engineering
HSI    Human System Interface
MCR    Main Control Room
M-MIS  Man-Machine Interface System
PWR    Pressurized Water Reactor
QDPS   Qualified Data Processing System
TS     Technical Specifications
V&V    Verification & Validation
VDU    Visual Display Unit
WPIS   Wall Panel Information System
1  INTRODUCTION

This document describes the methodology, analysis, and results of the process used to define the AP600 human system interface (HSI) design test program.

The AP600 HSI design test program consists of two parts:

- Concept tests to be performed as part of the HSI design process
- Verification and Validation (V&V) tests to be performed at the completion of the AP600 design process

The AP600 HSI design test program is integrated with the HSI design. Figure 1 summarizes the major elements of the AP600 HSI design test program and their relation to the HSI design process.

As described in the AP600 Standard Safety Analysis Report (SSAR) subsection 18.8.1, concept testing is performed as part of the HSI design process. During the functional design phase, the core conceptual design for an HSI resource and corresponding functional requirements are developed. An integral part of this phase is rapid prototyping and testing of design concepts. Concept testing during the functional design phase serves two purposes:

- It provides input to aid designers in resolving design issues that have no well-established human factors guidance.
- It establishes the adequacy of the design concept and functional requirements that are produced in the functional design stage. A main objective of concept testing is to establish that the conceptual design is adequate to support operator performance in the range of situations that are anticipated.

This document provides an overview of the human performance evaluation issues addressed as part of the AP600 concept testing. The process by which these issues are selected and the general approach to testing these issues are also described. Reference 1 describes the concept tests planned as part of the AP600 HSI design process.
[Figure 1: AP600 Concept Testing and Verification and Validation Activities. The figure relates the HSI design process (HSI functional design, followed by HSI implementation and integration in hardware and software) to the HFE verification and validation activities: concept tests (man-in-the-loop tests of concrete examples of the functional design, using rapid prototypes, part-task simulations, or a high-fidelity simulator for a similar plant) that resolve design issues and establish the adequacy of the functional design concept and requirements; HSI task support verification; HFE design verification; integrated system validation on a near full-scope, integrated, high-fidelity training simulator; issue resolution verification; and final plant HFE verification, followed by factory and site acceptance tests.]
The AP600 human factors engineering (HFE) V&V program is performed at the completion of the HSI design when hardware prototypes of the HSI resources are available. The AP600 HFE V&V includes:

- HSI task support verification
- HFE design verification
- Integrated system validation
- Issue resolution verification
- Final plant HFE design verification

This document provides a description of the human performance issues addressed as part of the AP600 HSI test program, and the general evaluation approach used. A programmatic-level description of the activities conducted as part of the V&V program is presented in Reference 2.
2  GOALS OF HSI TEST PROGRAM

The goals of the HSI test program are to:

- Systematically evaluate human factors concerns that affect plant performance
- Conduct these evaluations so that test results may be incorporated into the design of the HSI

The HSI test program plan:

- Describes the process for conducting human factors tests and incorporating results into the HSI design process
- Identifies human performance issues related to the HSI that are important to safe and efficient operation of the plant
- Describes the general test approach for the concept and V&V test phases of the HSI design process
3  EVALUATION SCOPE

The following items are addressed in the human factors test program for HSI:

- Plant Facilities - Facilities included in the scope of the AP600 test program are the main control room (MCR), the technical support center, the remote shutdown facility, and local control stations.
- Plant Staff Activities - Activities required to operate under normal, abnormal, and emergency conditions are included.

The test program for the AP600 HSI focuses on the following HSI resources:

- Plant information system (including functional and physical displays of plant processes)
- Alarm system
- Computerized procedure system (CPS)
- Dedicated and soft (computer-based) controls
- Wall panel information system (WPIS)
- Qualified data processing system (QDPS)

In the test descriptions that follow, displays that appear on the control room workstation visual display units (VDUs) (for example, plant information system displays) are referred to as workstation displays.

The passive safety features of the AP600 affect operator decision-making by affecting the type of information available to the operator, the alternatives the operators have for responding to plant upsets, and the time requirements for operator response (Ref. 3). These features pose requirements that are of the same type as for traditional plants, but may need some modification in the design of the AP600 HSI to support operator decisions. Consideration of these plant design features is important in the development of scenarios for those evaluations that require simulation of plant dynamics. In addition to evaluations of the control room, this plan addresses the application of human factors design guidelines to the HSI of the remote shutdown room and other operations and control centers.

Control room personnel addressed by this evaluation include the occupant of the supervisor's console (shift foreman, senior reactor operator license), the reactor operator(s) located at the control workstations, and any additional staff specified as part of control room staffing assumptions for a particular plant mode or condition.
4  FRAMEWORK FOR DEVELOPING THE HSI DESIGN TEST PLAN

A two-phase process is used to define tests, as illustrated in Figure 2. Phase 1 is issue definition. The purpose of this phase is to integrate major operator activities with the HSI resources that support those activities in order to establish a set of human performance evaluation issues. Phase 2 addresses test development. The purpose of this phase is to develop testing plans for each of the evaluation issues identified in Phase 1. A detailed description of the Phase 1 and 2 processes is presented in Sections 5.0 and 6.0, respectively.

Phase 2 involves development of test implementation plans. Section 7.0 provides a general description of the test approach to address each evaluation issue. The test implementation details are documented in individual test implementation plans that are prepared for each concept and V&V test near the point when the test is scheduled to be performed.
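As a rough, illustrative restatement of this two-phase structure (a minimal sketch in Python; the names, resource lists, and derivation rule below are assumptions for illustration, not part of the documented program), Phase 1 can be pictured as deriving evaluation issues from the pairing of operator activity classes with the HSI resources that support them, with single-resource issues preceding joint, multi-resource issues as described in Section 5.0:

```python
from dataclasses import dataclass, field

# Hypothetical model of support: activity class -> supporting HSI resources.
# The actual mapping is developed in subsection 5.3.
SUPPORT_MODEL = {
    "detection and monitoring/situation awareness": ["WPIS", "alarm system", "workstation displays"],
    "interpretation and planning": ["alarm system", "workstation displays", "CPS"],
    "controlling plant state": ["soft controls", "dedicated controls", "CPS"],
}

@dataclass
class EvaluationIssue:
    """Phase 1 output: one human performance evaluation issue."""
    activity_class: str
    hsi_resources: list[str]
    complexity_factors: list[str] = field(default_factory=list)

def define_issues(support_model):
    """Derive candidate issues as links between activities and resources:
    single-resource issues first, then the joint, multi-resource issue."""
    issues = []
    for activity, resources in support_model.items():
        for resource in resources:
            issues.append(EvaluationIssue(activity, [resource]))
        issues.append(EvaluationIssue(activity, list(resources)))
    return issues

for issue in define_issues(SUPPORT_MODEL):
    print(issue.activity_class, "->", issue.hsi_resources)
```

Phase 2 would then elaborate each such issue into testable hypotheses, an evaluation approach, and evaluation requirements (Section 6.0).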
4.1  INTEGRATION OF THE HSI DESIGN TEST PROGRAM IN THE HSI DESIGN PROCESS

Figure 3 depicts the relationship of the HSI human factors test program to the HSI design process. The figure organizes information in six horizontal rows. The second row displays an abbreviated version of the HSI design process. The design process starts with a mission statement that defines the purpose and goals of the HSI resource. This leads to the establishment of human performance requirements and design bases, which include operator cognitive activities and behaviors that are supported by the HSI resource to achieve the mission statement. Functional requirements are developed to guide the development of the HSI design to support the human performance requirements. These functional requirements are implemented in the design of HSI components. HSI components are built as prototypes. First, they exist as individual and partially integrated HSI prototypes. Finally, they exist as an integrated HSI hardware prototype after the components have been assembled and interfaced. The design process includes intermediate steps that are not depicted in the figure.

The row above the HSI design process represents those points at which human factors and cognitive psychology theory are applied to the design process. The human performance requirements are derived from a review of operating experience, a model of human performance (subsection 5.1), an analysis of major classes of operator activities (subsection 5.2), an analysis of the impact of changes in technology on performance, and a model of support (subsection 5.3). Next, human factors and cognitive science theory are applied to the development of functional requirements. Inputs include HSI design principles and guidelines that are obtained from the human factors and cognitive science disciplines.
[Figure 2: Two-Phase Process Used to Define the Human System Interface Design Test Program. Phase 1, issue definition: define HSI resources; identify major classes of operator activities; map HSI resources to operator activities (model of support); define evaluation issues as links between HSI resources and operator activities, employing the human performance model. Phase 2, test development: develop each evaluation issue into testable hypotheses and performance requirements; define the evaluation approach for concept testing and performance (verification and validation) testing; define evaluation requirements for concept testing and performance testing; document the evaluation descriptions.]
[Figure 3: Integration of the Verification and Validation Test Program in the HSI Design Process. The figure's horizontal rows relate: (1) human factors/cognitive science inputs (model of human performance; operating experience review; impact of changes in technology; major classes of operator activities with their cognitive demands and sources of error; principles for HSI design; principles for analytical and experimental evaluation); (2) the HSI design process (HSI mission statement; human performance requirements and design basis; functional requirements for individual and integrated HSI; individual/integrated HSI hardware prototype); (3) evaluation tests (research to guide HSI development; concept tests to refine HSI concepts; verification of functional requirements and human factors guidelines through analytical tests; validation of human performance requirements through man-in-the-loop tests); (4) evaluation test beds (HSI breadboard designs for concept testing; full-scope simulators for V&V testing); (5) evaluation type; and (6) evaluation criteria (for concept testing, presence of significant human performance problems and assessment of performance benefits of alternative concepts; for V&V testing, meeting functional requirements and human factors guidelines and meeting human performance requirements for the individual/integrated HSI). Notes: test beds for concept testing range in fidelity from static drawings to rapid display prototypes to a high-fidelity simulator for a similar plant, with representations of plant dynamics ranging from scripted scenarios to dynamic plant simulations; performance testing is performed using production prototype components in a full-scale, full-size plant simulator, and factory acceptance testing is also performed at this time.]
Functional requirements development is guided by Man-in-the-Loop studies designed to test HSI design concepts. The design and analysis of these Man-in-the-Loop concept tests are guided by human factors and cognitive science methods described in References 3,6,7 and 8.
Human factors and cognitive science theory are applied to the design and analysis of V&V tests, which use integrated HSI hardware prototypes in a near full-scope, high fidelity simulator (See Ref. 9).
The third row depicts the types of evaluation tests in the evaluation program, including concept and V&V testing. Concept testing clarifies human performance issues and refines functional requirements for the HSI. V&V testing verifies that the functional requirements have been satisfied in the design and provides evidence that the design satisfies the htunan J
performance goals.
Concept testing is conducted during the functional requirements and design phase of the HSI design process. Concept testing involves specific Man-in-the-Loop tests of functional designs
)
of HSI resources. These design examples are referred to as "HSI Breadboard Designs" in Figure 3. Breadboard designs include design concepts represented through static drawings, rapid display prototypes, part task simulations, mockups, or actual HSIs being developed for plants that have similar properties to the proposed AP600 HSI functional design.
The purposes of concept tests are to:
Explore and clarify human performance issues associated with specific design concepts Contribute to the development of functional requirements for the HSI Contribute to the development of criteria for human performance requirements of
=
the HSI Qualitative information gathered through debriefing, discussions, or other means, is analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the subjects. These performance problems are evaluated in terms of their effect on the successful
~
completion of a task. Functional requirements can then be developed to address those design characteristics that have significant effects on system performance. The intention is to understand the mental burdens that specific design features impose on the users with respect to perception, attention, and memory and to develop functional requirements to systematically address these demands. Quantitative measures of performance are judiciously WCAP-14701 May 1997 3638w.wpf:1b-Os0697 Revision 1
4-6 t
j used as baselines to compare alternative designs and to evaluate performance benefits
]
i achieved through refinements of design concepts.
At the completion of the HSI design, V&V tests are conducted. The V&V tests include:
HSI Task Support Verification -
e Verifies that the HSI design provides the necessary alarms, displays, and controls to support plant personnel tasks HFE Design Verification Verifies that the HSI design conforms to HFE principles, guidelines, and standards 1
Integrated System Validation
{
Validates that the HSI design can be effectively operated by personnel l
Issue Resolution Verification j
Verifies that the HSI design resolves the HFE issues identified in the tracking
. system Final Plant HFE Design Verification Verifies that the final "as-built" product conforms to the verified and validated design A detailed description of the V&V tests that are part of the AP600 HSI design test program is presented in Reference 2.
4.2 MODEL OF TEST BED FIDELITY Many types of evaluation do not require near full-scope, high fidelity representations of the control room. A set of principles guides the specification HSI test bed fidelity when defining evaluation requirements. A model of test bed fidelity provides this guidance. Figure 4 provides a graphic summary of this model.
' The following is a discussion of terminology:
Prototype characteristic consists of two parts: realism and completeness.
Realism refers to the degree to which the prototype resembles (looks and behaves
.l e
like) the actual system.
Completeness refers to the degree to which the prototype represents the total system.
I i WCAP-14701 May 1997 3638w.wpt1M50697 Revision 1
t I
g<
.9 37x m
1
%8
. W g
i a
a FIDELITY Hg Dimensions REAUSM Completeness a
Physical Functional ri
~d.
Form Information Dynamics b.E Content e
Abstract (e g, drawing) e Static e Part task simulation
, g (, g g o
Representative (e g., mockup) characters) e Stetic-discret9 e Full sirnulation a
e Actual (e g., petetype) g
- Medium (e.g.'ta)e Dynanne-slow, fast sample da
~~o
- U"**'C***
e n,gh (e 9-.
Y g
complete data) u 5.
Human Peaception:
Decision Dynamics integration y
perforim Making an ance Detectability e Procedure and e Perceptionof motion e Conpattaity of human &
{
aspects e
teguity my use
- Accuracyof action machme coamonents
~o'
- s e Diagnosis (e g, o Physicalfatyse prerequisites, Information e Satisfacten of mission T
Physical Fit:
side effects.
y post conditions)
Dynamics statement (validation) g e
Reach e Response to plant e
Strength dynenwes o Mentalworkload
- Vigilance e Usability e Navigation K
h4
!I g
-y y
4-8 A part-task simulator is a prototype that has limited completeness (it represents a small l
portion of the entire system) but often has a high degree of realism. Realism can be further l
broken down into the following two components:
Physical fidelity refers to the degree to which the physical form of the test bed looks
=
and feels like the actual system.
Functional fidelity refers to the degree to which the test bed behaves like the actual system.
Physical form can be characterized by three categories:
Abstract - A representation that has little resemblance to the actual system (such as a drawing)
Representative - Some relevant physical characteristics are presented (such as, a
=
three-dimensional mockup of a console that is constructed with foam core)
Actual - Actual hardware (such as production prototype equipment)
Functional fidelity has two characteristics: information content and dynamics. Information content pertains to the data and text provided in the HSI test bed. For example, a display system test bed can contain names of actual plant components and realistic values or just strings of random alphanumerics. The fidelity of information content can be characterized in three levels:
Low - Random data or characters are used as place holders to fill the data fields of interest. Data are neither accurate nor complete. This level of fidelity is used for tests of legibility.
Medium - Relevant data fields do not contain accurate and complete data. Data fields are partially filled. Data is random or fictitious. This level of fidelity is used for studies of display space navigation. Subjects use menu headings and other aids to locate a specific position in the display space.
High - Relevant data fields contain accurate and complete data. This level of fidel
=
is important for evaluations that address complex decision-making.
~
3638w.wpf:1b 050697 May 1997 Revision 1 1l
f 4-9 i
Dynamics refers to the behavior of the HSI as represented in the test bed. At leas,t of representation are possible as follows:
Individual static presentation Sequential static representation (sometimes called a slide show) j Continuous dynamic, not real-time (such as slow or fast)
Continuous dynamic, real-time Tasks that require physical skills such as reach and dexterity require a. high degree of physical fidelity in the prototype. For example, operation of soft controls requires dexterity, speed, and accuracy. Evaluation of alternative soft control methods (such as mouse-driven, poke points, touch screens, and keyboard commands) requires high physical fidelity.
Functional fidelity (that is, how it actually operates) is less important in this instance.
Cognitively demanding tasks require a high degree of functional fidelity to provide a valid i
test case for operator decisions. Important considerations include provisions for a sufficient data set, so the operator's problem is represented, as well as a data set updated at a sufficient i
rate to simulate system dynamics and time constraints.
i l
4.3 TESTING DIFFERENT LEVELS OF STAFF INTERACTION Three levels of staff interaction are considered in the HSI test program: individual, crew, and plant. Evaluation issues at the individual level are concerned with the demands that the HSI j
imposes on basic human capabilities (such as, workload, perception, decision-making, or anthropometrics). Issues at the crew level include these considerations as well as the flow of information and the coordination of work between crew members. Issues at the plant level include coordination of control room tasks with tasks performed in other parts of the plant (such as local equipment panels, and the technical support center). Evaluations of issues at the crew and plant level are performed during the later stages of the design process because a higher level of plant design detail and prototype fidelity are required.
r i
I l
l I
l I
i WCAP-14701 May 1997 M W P.It450697 f
}
5-1 5
PHASE 1: ISSUE DEFINITION The objective of Phase 1 of the HSI test plan development methodology is to identify the major evaluation issues to be tested. This involves several activities as shown in Figure 2.
First, the main HSI resources to be included in the evaluation are identified. These are used as a starting point to define how the HSI is intended to support operator performance and to bound the evaluation issues considered. Next, a human performance model (adapted from Reference 10) is specified. Reference 26 provides a description of the operator's decision-making model as adapted for the AP600 HFE program. Based on the model, three major classes of operator activities are defined:
Detection and monitoring / situation awareness Interpretation and planning Controlling plant state For each major class of operator activity, the types of conditions that can increase task complexity, the cognitive demands posed by these situations, and the potential types of human errors that can result are identified. This analysis draws on operating experience reviews, including analyses of operator performance during actual and simulated emergencies described in References 11 through 14; cognitive task analyses of nuclear power plant operator performance discussed in References 15 through 19; and models of human decision-making in complex systems and human error (Refs.10,20, and 21).
The analysis of operator activities and cognitive demands defines:
The major classes of operator activities that the HSI needs to support The types of complex situations that need to be sampled in evaluating the effectiveness of the HSI in supporting each of these three classes of operator activity The HSI resources intended to support each of these operator activities are then identified.
This defines the model of support evaluated as part of the HSI test plan.
The set of issues to be tested is derived based on joint consideration of the HSI resources intended to support each operator activity class and the dimensions of compicxity that can arise (Refs. 23 and 24).
The final set of test issues is organized into three categories corresponding to the three major classes of operator activity. Within each class, an attempt is made to start with issues that examine the role of a single HSI resource and then progress to test issues that assess the joint effect of multiple HSI features. A second theme in defining the set of test issues is to start Phase 1: Issue Definition May 1997 m:\\3638w.wpf:1b-Os0697 Revnion 1
5-2 j
l with studies that test the ability of the HSI to support operator performance on straightforward tasks and then to progressively test the ability of the HSI to support operator performance in cognitively complex situations.
l l
The elements of the issue definition process are described m sube uns 5.1 to 5.3, and include:
Human performance model used (subsection 5.1)
=
Major classes of operator activities identified and the cognitive processes that are
=
involved in performing these activities (subsection 5.2)
Mapping of HSI resources to Operator Activities (i.e., how the HSI resources are intended to support the cognitive processes involved in performing the operator activities identified) (subsection 5.3)
Subsection 5.4 presents the list of human performance evaluation issues that results from applying this process to the AP600 HSI design.
5.1 HUMAN PERFORMANCE MODEL The operator decision-making model (Refs. 25 and 26), which is adapted from the model of operator decision-making developed by Rasmussen (Ref.10),is used to support the design and evaluation of the AP600 HSI.
The human performance model provides a high-level description of the operators' decision-making tasks. The model helps identify a number of performance issues that establish the bounds of the human-machine evaluation.
The model identifies four major cognitive activities to be supported: detection and moritoring/ situation awareness, interpretation and planning, control, and feedback. The major cognitive activities defined by this model are discussed in the following subsections.
5.1.1 Detection and Monitoring / Situation Awareness Operators monitor plant parameters to understand the plant state (Ref.19).
Phase 1: Issue Definition May 1997 m:\\3638w.wpf:1b-Os0697 Revision 1
5-3 In emergency or abnormal situations, operators are alerted to (detect) a d t
to monitoring of plant parameters to identify what is abnormal. Operators may try o answer questions such as:
l Where is the mass in the system?
f Where is the energy in the system?
=
=
i What is the reactivity?
=
Where is the radiation?
What critical safety functions have been violated?
=
A second concern in this stage of decision-making is data quality. The relia of plant state indications is assessed.
This description of detection and monitoring is oriented primarily toward emergency h
thing is operations. That is, detection and monitoring are initially driven by a cue t at so i
t t to abnormal. It is important to support detection of abnormal states, but it is also mpor a maintain an awareness of plant status and system availability under normal and ou conditions.
The model is defined broadly to address detection and monitoring during each p d
condition. It includes active monitoring guided by procedures or a supervisor, an ort monitoring that is passive, such as board scanning. It also includes monitoring to su awareness of the goals and activities of other agents, both people and machines.
Based on the results of these monitoring activities, operators develop an awa defined as state that is referred to as " situation awareness." Situation awarene "the perception of the elements of the environment within a volume of
" (Ref. 27).
5.1.2  Interpretation and Planning

The most critical components of decision-making are correct situation assessment and identification of the most appropriate response plan (procedure), given the disturbances in the plant. In some cases identification and procedure selection is straightforward. This corresponds to Rasmussen's rule-based level of performance. In other cases operators have to integrate multiple information sources for correct situation assessment and make tradeoffs among operational goals. That is, if multiple failures occur, more than one procedure may be indicated, or the standard procedure may need to be adapted to protect safety functions. The identification of a response plan becomes difficult in the face of multiple failures. It can become even more difficult under severe accident conditions, when multiple safety systems may be lost and system data may be unreliable. These situations require an HSI designed to support both rule-based and knowledge-based performance.

Coordination between operators, and between operators and automated systems, is considered in this area of decision-making. The need for coordination procedures is not explicit in the performance model. The process of initial allocation of tasks to human and automated resources, and later coordination of those tasks, is included in the interpretation and planning area of the model. The HSI model makes explicit the monitoring of goal achievement, which is a means to assess how well each operator or automated system is progressing in achieving goals.
5.1.3  Control

Control involves decisions in the initiation, tuning, and termination of plant processes. Control is simpler for operators when they control the pace of an event. Control becomes more difficult when multiple individuals or autonomous systems must be coordinated to execute a task.

Controls, indicators, and procedures may exist in software space, and are accessed by being called to the screen. This type of access of displays and controls may place a burden on the operator to efficiently find a control or display. While the control area of the model does not explicitly call out the process of locating procedures, controls, and displays, they are considered part of this area of the model.
5.1.4  Feedback

Feedback occurs at several levels. Initially, operators need to verify that the control action is implemented. Second, operators need to monitor the state of plant parameters and processes to determine whether the actions are having the intended effect. The final, and most critical, level of feedback is an evaluation of whether the operational goal is achieved. Operators may ask, "Is the goal satisfied?" or "Is the current procedure achieving the desired purpose?"
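Taken together, subsections 5.1.1 through 5.1.4 describe a closed loop. The sketch below (a schematic restatement in Python; the stub object and its method names are hypothetical stand-ins for a real simulator and crew, not part of the documented model) shows how the four cognitive activities cycle until the operational goal is satisfied:

```python
class _StubPlant:
    """Trivial stand-in so the loop below can run; a real evaluation would
    use a simulator test bed of appropriate fidelity (subsection 4.2)."""
    def monitor_parameters(self): return {"pressurizer_level": "low"}
    def select_procedure(self, state): return "hypothetical-procedure"
    def execute(self, procedure): pass
    def action_implemented(self, procedure): return True
    def intended_effect(self, state): return True

def decision_cycle(plant, goal_satisfied, max_iterations=10):
    """Schematic loop over the model's four cognitive activities."""
    for _ in range(max_iterations):
        state = plant.monitor_parameters()         # 5.1.1 detection/monitoring
        procedure = plant.select_procedure(state)  # 5.1.2 interpretation/planning
        plant.execute(procedure)                   # 5.1.3 control
        # 5.1.4 feedback, at three levels:
        if not plant.action_implemented(procedure):  # 1. control action taken?
            continue
        if not plant.intended_effect(state):         # 2. intended effect seen?
            continue
        if goal_satisfied(plant):                    # 3. operational goal met?
            return True
    return False

print(decision_cycle(_StubPlant(), goal_satisfied=lambda plant: True))  # True
```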
5.2  MAJOR CLASSES OF OPERATOR ACTIVITIES

Based on the human performance model, three major classes of operator activities that the HSI supports are defined:

- Detection and monitoring/situation awareness
- Interpretation and planning
- Controlling plant state
These classes correspond to the first three major performance areas of the operator decision-making model. Feedback is dealt with as an element of the controlling plant state activity, rather than as a stand-alone activity, because it is primarily associated with control activities.

Because these classes of activities are derived from a human performance model, they are able to characterize a broad range of operator tasks. As a result, they are jointly able to encompass each type of activity that arises during operation.

Analysis of the cognitive demands associated with these activities provides a basis for defining the human performance requirements for plant operation and for maintaining safety. This defines the range of activities and situations that the HSI supports.

The following attributes are discussed in this section:

- The main characteristics of each class of operator activities
- The types of conditions that can arise that increase task complexity
- The potential types of human errors

The analysis of the cognitive demands draws on analyses of operator performance during actual and simulated emergencies (Refs. 11 through 14); cognitive task analyses of nuclear power plant operator performance (Refs. 15 through 19); and models of human decision-making in complex systems and human error (Refs. 10, 20, 21, and 22).
The analysis of the dimensions of task complexity draws on a framework for such co systems (Ref.17). This analysis defines sources of task complexity that include:
Characteristics of the task (dynamism, many highly interacting parts, risk, and uncertainty)
Characteristics of the agents (multiple agents, both human and autonomous machines)
Characteristics of the HSI (system functions such as information integration requirements and display access requirements)

The analysis of operator activities and cognitive demands defines the major classes of operator activities that the HSI supports, and the types of complex situations that are sampled in evaluating the effectiveness of the HSI in supporting each of the classes of operator activity. The descriptions of operator activity and potential sources of performance problems, together with the descriptions of how the AP600 HSI features mitigate these performance problems, provide the basis for defining the major evaluation issues and the range of complex situations that need to be sampled during testing.
5.2.1 Detection and Monitoring/Situation Awareness

This class of operator activities encompasses those activities that are concerned with obtaining information about plant status. It includes the periodic active monitoring that determines current status and availability (such as assessing power level, temperature, pressure, and systems available); monitoring needed to detect malfunctions or trends that are too small to activate an alarm; the proceduralized monitoring that accompanies shift turnover; and monitoring directed by queries about specific parameter values.

A distinction is made between "active" and "passive" monitoring. Active monitoring refers to obtaining information about plant state through active manipulation of the display interface. Passive monitoring refers to maintaining an awareness of plant state without manipulation/navigation of the display system. It is analogous to the practice by operators of traditional control rooms in maintaining situation awareness of changes in plant state by monitoring the control board. In the AP600, the primary HSI resource that supports situation awareness is the wall panel information system (WPIS).
Dimensions of Task Complexity

The following factors contribute to the complexity of detection and monitoring:

Many plant indications are available at different levels of abstraction (such as equipment status, process status, function status, and goal status)
Normal parameter values vary with plant conditions
Appropriate parameters for determining plant status vary with plant conditions
Some expected plant parameter behavior is difficult to assess; relevant parameter information needs to be immediately available or able to be called up
Relevant data may be distributed across individuals
Some goals and status of automated systems are difficult to observe
The following are potential types of human error:

Failure to detect/observe relevant plant parameter values (an error of omission)
Misreading relevant plant parameter values (an error of commission)
Failure to identify or misinterpreting plant state or the implications of plant state
Failure to identify goals and activities of other agents (person or machine)
Failure to communicate to other personnel (for example, during shift turnover) plant state or system information (either an error of omission-not mentioning information, or an error of commission-mentioning incorrect information)
5.2.2 Interpretation and Planning

The interpretation and planning class of operator activities encompasses those activities concerned with situation assessment and response planning. The focus is on situations that require responses to plant disturbances. In exploring this class of activities, the emphasis is on identifying plant disturbances, assessing their implications for plant functions and goals, and selecting/formulating a response plan. The focus is on the cognitive activities underlying intention formation, rather than response execution.

While response execution is an important part of handling emergencies, it is also performed in controlling the plant during normal operation. Therefore, response execution is covered as part of the controlling plant state class of activities.

In evaluating the extent to which the HSI supports operator intention formation during plant disturbances, the range of plant disturbances that may arise is considered.
Small Upsets - These disturbances do not lead to a plant trip and can include disturbances that lead to alarm response procedures.

Controllable Upsets - These disturbances lead to a plant trip but are the result of a single malfunction that is recoverable using emergency procedures.

Multiple Fault Accidents - These disturbances require identification of multiple failures that can mask each other and/or require consideration of multiple constraints (side effects) in formulating a recovery strategy.

Severe Accidents - These disturbances are more difficult, in that they require additional personnel to diagnose and handle (that is, a need for coordination of multiple personnel and engineering expertise) and they are not addressed by formalized procedures (therefore, a need for knowledge-based behavior).

The HSI evaluation covers the types of disturbances described above. Included are cases that involve malfunctions in automated systems requiring the operator to identify a need for manual override.
Dimensions of Task Complexity

The following factors contribute to the complexity of this activity:

Multiple faults may produce large numbers of alarms, making the detection of a particular alarm difficult (due to attention overload)
Evidence of plant disturbance may be missing or obscured (that is, masked or altered by another fault)
Changes in plant state may make familiar cues inappropriate (such as sensors that may have different significance under different plant conditions)
Multiple faults may create goal conflict situations requiring tradeoffs among competing goals
Information on the goals and status of automated systems may be difficult to assess

The following are potential types of human error:

Failure to observe or recognize an abnormal plant state or system malfunction
Failure to develop a correct system understanding (perhaps due to a failure to correctly interpret the evidence)
Fixation errors (ignoring evidence that is inconsistent with hypotheses that are being entertained)
Over-reliance on familiar cues or response plans
Missing negative side effects associated with a response plan; missing goal conflicts
Making inappropriate goal tradeoffs

5.2.3 Controlling Plant State

The controlling plant state class of activities is concerned with making changes in plant state, including tuning plant parameters, changes in plant mode (such as startup, shutdown, and intermediate modes), surveillance tests, and taking systems out of operation (tagging out). In examining this class of activities, the emphasis of the evaluation is on the planning and the execution of responses.
A distinction can be made between operator-paced (procedure-paced) control activities and event-paced (plant dynamics-paced) control activities. Operator-paced activities are activities where the rate at which a maneuver is performed is determined primarily by the operators performing the task. Event-paced control activities are activities where the rate is primarily determined by the process dynamics of the event being controlled.
A second distinction can be made between maneuvers that can be performed by a single individual versus maneuvers that require coordination among multiple individuals and/or automatic systems. Manual plant startup is an example of an activity that is both event-paced and requires coordination of multiple operators. Automatic plant startup is an example of an activity that is event-paced and requires supervisory control of autonomous systems.
The simplest case of control execution occurs when there is ample time, control actions are discrete (all-or-none actions, such as turning on a pump), control actions can occur in any order, little or no coordination is required, and control actions have no side effects that impact other plant processes or plant operability. Complications set in as this simplest case is altered: time becomes short; controls are used in a fixed order; controls are at physically disparate locations; control actions are continuous and require small tuning adjustments; there are lags between the time a control action is taken and when an indicator reflects the change; or control actions require strict coordination between operators or between an operator and an automated process.

An important aspect of control execution is the need to obtain feedback from the system that the action has been successfully executed. This feedback can occur at several levels. First, there is an indication from the control itself that an action is taken. In the hard-wired environment, a light changes state or a toggle switch changes position. With soft controls, the change may be more transient and less noticeable. Next, there must be an effect on the parameter or display that is manipulated. Time lags may exist that make this detection more difficult. Finally, the plant process or system that the operator is intending to control shows a response to the control action to close the feedback loop.

With supervisory control of automated systems, there is a need to assess what goal the automated system is attempting to achieve. That is, whether the automated system is performing correctly or whether intervention is required, and if so, what manual actions are taken.
Dimensions of Task Complexity

The following factors contribute to the complexity of this activity:

Complex process dynamics (such as rapid process changes or long time lags) place constraints on operators and/or require open-loop responses
Actions of multiple operators may be interdependent, requiring communication/coordination among multiple individuals (such as assessing current plant state, anticipating future plant state, or preventing working at cross-purposes)
Actions may have negative side effects requiring assessment of preconditions before an action is taken, and assessment of post-conditions and execution of additional procedures after the original action is taken. For example, when tagging out a train of a system, the operator must be cognizant of preconditions that must be satisfied before taking the train out of service, such as limits on plant operation and constraints on which additional systems can be taken out of service. The operator must also be cognizant of automated systems that may malfunction or fail to keep up with process dynamics.
The following are potential types of human error:
Failure to check preconditions, anticipate side effects, and check post-conditions
Failure of execution (that is, either an error of omission-not taking a required action, or an error of commission-taking the wrong action or taking actions in the wrong sequence)
Failure to observe feedback of actions (that is, failure to monitor that the action is executed; failure to monitor that the action had the desired effect on the plant parameter, process, and goal hierarchy)
Failure to keep pace with process dynamics
Failure to coordinate and/or communicate with other crew members
Failure to monitor automated systems and take manual intervention when required
5.3 MAPPING OF HSI RESOURCES TO OPERATOR ACTIVITIES (MODEL OF SUPPORT)
Subsections 5.2.1 through 5.2.3 describe three classes of operator activities that are supported by the HSI and the major cognitive processing stages that underlie these activities. These subsections identify the scope and boundaries of the tasks to be included in the evaluation of the HSI. They also identify the dimensions of task complexity and human error. This foundation allows one to tie the various HSI resources to tasks. That is, each HSI resource is intended to support human performance in simple and complex tasks and to reduce error.
In this subsection, links are drawn between the HSI resources and the operator activities to show how the HSI resources support control room performance. This supports the development of evaluation issues for testing those relationships. More specifically, the evaluation issues link an activity, one or more HSI resources, and a performance measure. The human performance evaluation issues are discussed in subsection 5.4.
The mapping of AP600 HSI resources to operator activities is accomplished by reviewing the rationale for each HSI resource. An understanding of each resource's rationale provides a means for relating it to the human performance model and to the operator activities. Because the design of the HSI resources is not complete, there are limits on the detail that can be assigned to the model at this time.

Figure 5 identifies the primary mappings between HSI resources and the elements of the operator decision-making model that they are intended to support.

The following subsections describe the mappings between the operator activities and HSI resources that are important for supporting the development of evaluation issues.
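Purely as an illustration of the idea behind Figure 5 (and not part of the design documentation), the primary mappings could be expressed as a simple lookup structure. The Python sketch below uses stage and resource names read from Figure 5; the individual assignments are approximate readings of the figure rather than authoritative design data.

    # Illustrative sketch only: the primary mappings of Figure 5 expressed
    # as a lookup table from decision-making element to the HSI resources
    # intended to support it. Assignments are approximate, not design data.
    HSI_SUPPORT_MAP = {
        # Detection and monitoring / situation awareness
        "alert":          ["Alarm System", "WPIS"],
        "observe":        ["WPIS", "Plant Information System", "QDPS"],
        "identify_state": ["WPIS", "Alarm System", "Plant Information System", "QDPS"],
        # Interpretation and planning
        "implications_of_state": ["Computerized Procedures", "Plant Information System"],
        "goal_selection":        ["Computerized Procedures", "Plant Information System"],
        "select_success_path":   ["Computerized Procedures", "Plant Information System"],
        "formulate_actions":     ["Computerized Procedures", "Plant Information System"],
        # Control
        "execute_actions": ["Soft Controls", "Dedicated Controls"],
        # Feedback
        "verify_action":            ["Soft Controls", "Dedicated Controls", "Alarm System"],
        "monitor_state":            ["WPIS", "Alarm System", "Plant Information System", "QDPS"],
        "monitor_goal_achievement": ["WPIS", "Plant Information System", "QDPS"],
    }

    def resources_for(element):
        """Return the HSI resources mapped to a decision-making element."""
        return HSI_SUPPORT_MAP.get(element, [])

A table of this kind makes it easy to check, for any decision-making element, which HSI resources an evaluation issue should exercise.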
5.3.1 Detection and Monitoring/Situation Awareness

Wall Panel Information System (WPIS)

The WPIS provides high-level information about the status of safety and availability goals, allowing operators to quickly identify violations. The WPIS also indicates plant operating mode and a corresponding set of plant parameters that are important to monitor. This aids operator monitoring by bringing together the most meaningful data in a central location.
Functionally Organized Alarm System

The value of the functionally organized alarm system for detection and monitoring lies in focusing attention on the most significant alarms. Therefore, data overload is reduced. The alarm system removes redundant or less meaningful alarms from the set of alarms that are activated.

[Figure 5 - Mapping of HSI Resources to Operator Decision-Making Model. The figure links the HSI resources (alarm system, WPIS, plant information system, QDPS, computerized procedures, and dedicated/soft controls) to the decision-making elements: Alert, Observe, and Identify State (detection and monitoring/situation awareness); Implications of State, Goal Selection, Select Success Path, and Formulate Actions (interpretation/planning); Execute Actions (control); and Verify Action, Monitor State, and Monitor Goal Achievement (feedback). The graphic itself is not reproduced here.]
Plant Information System and QDPS

The functional and physical displays support operators in monitoring plant data. The functional and physical displays provide detailed information by allowing access to any parameters through a network of displays that can be obtained by the operator. These displays provide indication of data quality (such as failed or unreliable sensors) and context for plant data by linking the physical views with the functional views. They also support the monitoring of automated systems.

The remaining major HSI resources, dedicated and soft controls, and the procedures are not tied to supporting detection and monitoring.
5.3.2 Interpretation and Planning

Functionally Organized Alarm System

The alarm system aids the operators in selecting appropriate views of the plant and appropriate procedures for mitigating the abnormal event. The alarm system reduces confusion by subordinating alarms that are misleading or secondary to the primary disturbance. It also cues operators to multiple fault situations and/or situations where multiple safety goals are compromised.
Plant Information System and QDPS
The functional and physical displays aid situation assessment and planning by encouraging operators to take a functional view of the plant that is tied to the physical view. The functional view presents information about the current goal, goal violations, processes required to satisfy the goal, and potential side effects. The intent is to provide a tool for planning activities that reduces the likelihood that the operator loses sight of the larger picture when engaged in control activities.
Computerized Procedures

The procedures created for MCR operators formalize the set of appropriate control actions that are available to achieve safety and availability goals. These are the set of actions operators should take. An important operator cognitive activity in using procedures is selecting the appropriate procedure and periodically evaluating its appropriateness. The procedures aid the operators in making these decisions.
Wall Panel Information System (WPIS)
The WPIS maintains a high-level view of safety and availability goals so that operators can assess how well the current response plan is achieving its goal. The WPIS reflects significant changes in plant status that are tied to the appropriateness of the procedure. This overview system also lets crew members in the main control area share information about current goals and responses.
The remaining major HSI resources, dedicated and soft controls, are not strongly tied to supporting interpretation and planning.
5.3.3 Controlling Plant State

Dedicated and Soft Controls

The control devices clearly communicate to operators the available control actions. They also provide feedback to the operator indicating that a control action is successfully performed.
For example, a control should provide a visual, auditory, or tactual cue to indicate a change in setting. Operators should not become confused when locating, selecting, or executing a control action.
Computerized Procedures

The procedures are the specific instructions for execution of the control activities. These are clear and concise, avoiding confusion or underspecification of control actions or their criteria.
The procedures also clearly indicate their intent so that operators can more easily determine the procedure's appropriateness.
Wall Panel Information System (WPIS)
The WPIS provides an overview of plant status to control room personnel in the main control area, including those with no access to a compact workstation. It provides a source of information on the control activities of other operators to support crew communication and coordination. It also supports feedback on control actions, particularly at the level of monitoring plant state and goal achievement.
Plant Information System

The plant information system provides a means for each operator to view the activities of other operators involved in coordinated or related control activities. The value of this viewing is related to error detection, control action timing, and feedback on the effects of multiple control actions.
The remaining major HSI resources, the alarm system and QDPS, provide feedback about the success of control actions.
5.4 HUMAN PERFORMANCE EVALUATION ISSUES

A set of human performance evaluation issues is derived based on consideration of the major classes of operator activity, the HSI resources intended to support each operator activity class, and the analysis of dimensions of complexity that could arise to increase performance demands. This subsection presents the results of applying this analysis.
The evaluations are organized into three main groups corresponding to the three major classes of operator activity. Within each group an attempt is made to start with issues that examine the role of a single HSI resource and then progress to issues that assess the joint effect of multiple HSI resources. A second theme in defining the set of issues is to start with studies that test the ability of the HSI to support operator performance on straightforward tasks and then to progressively test the ability of the HSI to support operator performance in cognitively complex situations.
In addition to these groups, two additional evaluations are specified. The first is conformance to human engineering design guidelines, included to address existing design guidelines that apply to control rooms. This corresponds to the HFE verification task described in the Programmatic Level Description of the AP600 Human Factors Verification and Validation Plan (Ref. 2). The second is validation of the integrated HSI, included to address the requirements for a validation of a fully integrated HSI. This corresponds to the integrated system validation task described in the Programmatic Level Description of the AP600 Human Factors Verification and Validation Plan (Ref. 2). These five groups and corresponding evaluation issues are shown in Table 1.
The AP600 V&V plan includes three additional V&V tasks: task support verification, issue resolution verification, and final plant HFE verification. Additional descriptions of these V&V tasks are provided in Reference 2.
Table 1 Major Evaluation Issues

Operator Activity: Detection and Monitoring

Issue 1: Do the WPIS and the workstation summary and overview displays support the operator in maintaining an awareness of plant status and system availability without needing to search actively through the workstation displays?
Issue 2: Does the WPIS support the operator in getting more detail about plant status and system availability by directed search of the workstation functional and physical displays?
Issue 3: Do the HSI features support efficient navigation to locate specific information?
Issue 4: Do the HSI features effectively support crew awareness of plant condition?

Operator Activity: Interpretation and Planning

Issue 5: Does the alarm system convey information in a way that enhances operator awareness and understanding of plant condition?
Issue 6: Does the physical and functional organization of plant information on the workstation displays enhance diagnosis of plant condition and the planning/selection of recovery paths?
Issue 7: Does the integration of alarms, WPIS, workstation, and procedures support the operator in responding to single-fault events?
Issue 8: Does the integration of alarms, WPIS, workstation, and procedures support the operator in interpretation and planning during multiple-fault events?
Issue 9: Does the integration of alarms, WPIS, workstation, and procedures support the crew in interpretation and planning during multiple-fault events?
Issue 10: Does the integration of alarms, WPIS, workstation, and procedures support the crew in interpretation and planning during severe accidents?

Operator Activity: Controlling Plant State

Issue 11: Do the HSI features support the operator in performing simple, operator-paced control tasks?
Issue 12: Do the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and post-conditions?
Issue 13: Do the HSI features support the operator in performing control tasks that require multiple procedures?
Issue 14: Do the HSI features support the operator in performing event-paced control tasks?
Issue 15: Do the HSI features support the operator in performing control tasks that require coordination among crew members?

Conformance to HFE Design Guidelines

Issue 16: Do the HSI components satisfy relevant HFE design guidelines?

Validation of Integrated HSI

Issue 17: Does the integration of HSI components satisfy requirements for validation of control room functions and integrated performance capabilities?
6 PHASE 2: TEST DEVELOPMENT

This section discusses the development of tests to examine the human performance issues identified in Phase 1. Test development involves developing:

Testable hypotheses and performance requirements from the evaluation issues
Approaches for conducting each evaluation
Requirements for conducting each evaluation
Written descriptions of each evaluation
Each of these activities is depicted in the lower portion of Figure 2. They are described in the following subsections.
6.1 TESTABLE HYPOTHESES AND PERFORMANCE REQUIREMENTS

The first activity of test development is to develop testable hypotheses and performance requirements from the human performance evaluation issues. The evaluation issues defined in Phase 1 describe characteristics of operator performance that are critical to safe and efficient operation of the AP600. For each issue, the following are developed:

A testable hypothesis
Performance requirements

Each testable hypothesis and performance requirement specifies how the HSI resources are expected to support operator performance with respect to a particular evaluation issue. A testable hypothesis is simply a statement that guides the evaluation of design concepts.
Quantitative and qualitative data are collected to determine whether a design concept supports the hypothesis, and if so, how well it supports the hypothesis. This process is used to:
Explore and clarify human performance issues associated with specific design concepts
Contribute to the development of functional requirements for the HSI
Contribute to the development of criteria for human performance requirements of the HSI

Performance requirements are statements of man-machine system behavior that the HSI supports to provide safe and efficient operation of the AP600. For each performance requirement, performance measures need to be developed. These are objective, measurable dimensions by which performance is evaluated.
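To illustrate how an evaluation issue decomposes into a hypothesis, a performance requirement, and performance measures, the following sketch may help (Python; the structure, field names, and example wording are hypothetical and are not taken from the AP600 test plans).

    # Hypothetical sketch: recording the hypothesis, requirement, and
    # measures derived from one evaluation issue.
    from dataclasses import dataclass, field

    @dataclass
    class EvaluationIssue:
        issue_id: int
        question: str      # the evaluation issue, as stated in Table 1
        hypothesis: str    # testable statement of expected HSI support
        requirement: str   # required man-machine system behavior
        measures: list = field(default_factory=list)  # objective, measurable dimensions

    issue3 = EvaluationIssue(
        issue_id=3,
        question="Do the HSI features support efficient navigation to locate specific information?",
        hypothesis="The workstation display system supports efficient location of requested parameters.",
        requirement="Operators locate a requested parameter and return to the starting display.",
        measures=["displays accessed", "navigation path", "task completion time"],
    )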
6.2 EVALUATION APPROACH

The second activity of test development is to define the evaluation approach. For each evaluation issue, an evaluation approach is defined to guide the development of concept testing. The following factors are considered:
Dimensions of task performance to be addressed, including types of scenarios and dimensions of task complexity
Types of performance measures to be collected, including errors, response time, and operator understanding of plant condition
Evaluation method to be used, including expert review, walk-through, simulation, and decision tracing
Evaluation criteria, including absolute and relative measures of performance
Implications of the results, including selection of design alternatives, clarification of performance issues, refinement of functional requirements and HSI design criteria
Section 7.0 provides an overview description of the proposed evaluation approach for each of the HSI evaluation issues identified in Table 1.
6.3 EVALUATION REQUIREMENTS

The third activity of test development depicted in Figure 2 is to define evaluation requirements. For each evaluation a set of requirements is defined, including test bed fidelity requirements, the required stage of development of the HSI, and subject characteristics. Test bed fidelity is discussed in subsection 4.2.
The concept and V&V testing is coordinated with the HSI design process. Concept testing is performed with breadboard designs that demonstrate relevant aspects of the design concept. The evaluation hypotheses and approach determine the required level of test bed fidelity. The level of test bed fidelity determines the required stage of development of the HSI design. For example, the concept testing of Evaluation 3 is not conducted until design concepts exist for display formats, navigation aids, and display selection mechanisms. Also, preliminary decisions must be made regarding the display system hardware. V&V testing is performed using production prototypes or equipment that emulates production prototypes.
The role of test participants during the concept testing evaluations is to represent characteristics of plant operators. While plant operations experience is desirable, it is not required for concept testing. Test participants may include nuclear operations training instructors, engineers, and designers. The individuals who participate in V&V testing evaluations are nuclear power plant operators trained in the use of the AP600 HSI. Test participant requirements, including training and experience, are defined for each concept and V&V test.
6.4 EVALUATION DESCRIPTIONS

The last activity of test development in Figure 2 is to document the evaluation descriptions.
Overview descriptions of the general test approach are presented in Section 7.0. The test implementation details are documented in individual test implementation plans that are prepared for each concept and V&V test.
6.5 DATA ANALYSIS AND FEEDBACK TO THE DESIGN PROCESS

This subsection describes the data collection and analysis approach that is used to analyze data from the Man-in-the-Loop tests conducted during the concept testing phase and the integrated system validation phase of the V&V program.
The methodology is adapted from a method that was introduced by Hollnagel, Pedersen, and Rasmussen (Ref. 28) and extended and refined in several empirical studies, including a Man-in-the-Loop evaluation of safety parameter display systems (Ref. 25). Other discussions of methods for empirical evaluation studies are included in References 6, 7, 12, 14, 19, and 29.
There are two key elements of the methodology. The first is to use multiple, convergent performance data to develop a description of performance and the factors that contribute to that performance. The second is to use conceptual models to focus data collection and analysis activities, and to enable aggregation and generalization across specific cases.

The main steps in the methodology are presented in Figure 6. The first step is to collect multiple, convergent performance data. The objective is to chart not only what actions are taken, but also the decision process and context that led to the actions. The emphasis is on recording process measures as well as outcome measures of performance. Outcome measures include the accuracy and completeness of the response made and the time to respond. Process measures address how the operator reached the outcome.
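As a purely illustrative sketch of what a record of multiple, convergent performance data for a single trial might look like (Python; all field names are hypothetical and are not drawn from the test implementation plans):

    # Hypothetical sketch: one trial's record combining outcome measures
    # (what was achieved) with process measures (how the operator got there).
    from dataclasses import dataclass, field

    @dataclass
    class TrialRecord:
        # Outcome measures
        response_accuracy: float      # fraction of required actions performed correctly
        response_complete: bool       # was the response complete?
        response_time_s: float        # time to respond, in seconds
        # Process measures (convergent data sources)
        errors: list = field(default_factory=list)               # categorized errors
        hsi_interaction_log: list = field(default_factory=list)  # displays/controls used
        decision_trace: list = field(default_factory=list)       # verbalized reasoning steps
        crew_communications: list = field(default_factory=list)  # recorded crew talk
        debriefing_notes: str = ""

Keeping outcome and process measures in one record supports the cross-referencing of evidence described later in this subsection.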
[Figure 6 - Data Collection and Analysis Process. The figure depicts the progression from context-specific data (multiple convergent performance data for specific cases: accuracy and completeness of response, response time, errors, record of interaction with the HSI, decision-trace protocol, record of crew communication, and debriefing interviews) through analysis to a formal performance description based on the human performance model and model of support; through aggregation and generalization to a description of prototypical performance; and through interpretation to context-independent general conclusions about the effectiveness of the HSI in supporting human performance requirements, sources of human performance difficulty, and enhancements to the HSI required to improve human performance. The graphic itself is not reproduced here.]
Data sources include:
Direct observation of participant behavior
Records of interaction with the HSI
Traces of actions taken on the underlying process
Records of the dynamic behavior of critical process variables
Records of verbal communication among team members or via formal communication media
Verbal reports made during debriefing interviews following the performance
Measures of workload and situation awareness
Commentaries on operator performance made by other knowledgeable observers

The specific data collected for each evaluation is listed as part of the individual evaluation descriptions in Section 7.0.
Data on how the operator reaches the outcome provides crucial information about why a particular interface helps or fails to help user performance. Outcome measures alone are not powerful enough to determine which of the multiple potential factors active in any evaluation of interface systems contribute to a specific outcome result. In testing a display concept, it is important to differentiate the effects due to fundamental elements of the concept from the effects due to incidental details of the implementation. For example, if performance problems are detected with a new display concept that uses flow path coding (such as energy flow or material flow), it is important to establish whether the performance problems are due to details of the implementation (display legibility or the specific flow coding technique) and whether performance would improve if implementation details were changed.
Data on the background and context of a user problem can help localize what factors contribute to successful or unsuccessful human performance. It can therefore help identify the location and severity of problems identified during the performance testing, guiding potential improvements or additions to the interface.
In the next step, the multiple data sources are correlated and combined into a description of performance for a specific individual (or crew) in a specific case. The experimenter actively cross-references the different lines of evidence regarding participant behavior and cognitive activities. This cross-checking and integration can help support the validity of the description of actual performance generated from the data.
According to the Reference 28 methodology, the description of actual performance is followed by successive stages of progressively more context-independent levels of analysis. Conceptual models are used to produce descriptions that are more context-independent, making it possible to generalize across specific individuals and cases. For example, concepts about human error have characterized specific instances of human performance errors as "slips" and "mistakes" (Ref. 30). This allows diverse instances to be grouped. The human performance model adapted from Rasmussen and the model of support described in Section 5.0 are used to produce formal descriptions of operator performance in particular cases and to aggregate and generalize across cases.

Once performance is described in formal terms, it becomes possible to identify patterns across individuals and cases. For example, multiple instances of similar problems may be identified and aggregated. The result of the aggregation is a description of prototypical performance, including common performance difficulties, confusions, and errors.

The description of prototypical performance is then used to draw conclusions about the effectiveness of particular HSI resources in supporting human performance, the factors that contribute to human performance difficulties, and the enhancements to the HSI required to improve human performance.
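A minimal sketch of the aggregation step, assuming trial records of the kind sketched earlier in this subsection (Python; the error labels and the threshold are hypothetical):

    # Hypothetical sketch: aggregating formally categorized errors across
    # cases to surface candidate prototypical performance difficulties.
    from collections import Counter

    def prototypical_difficulties(trials, min_cases=3):
        """Report error categories recurring in at least `min_cases` cases."""
        counts = Counter()
        for trial in trials:
            counts.update(set(trial.errors))  # count each category once per case
        return {err: n for err, n in counts.items() if n >= min_cases}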
7 EVALUATION ISSUES AND DESCRIPTIONS

This section describes the general test approach used to address each of the 17 major human performance evaluation issues defined in Phase 1 and presented in Table 1.
The 17 evaluation issues are organized under five headings:
Evaluations for detection and monitoring
Evaluations for interpretation and planning
Evaluations for controlling plant state
Evaluations of conformance to human factors engineering (HFE) design guidelines
Evaluations for validation of the integrated HSI

The first 15 evaluations are grouped into the first three headings. Each of these 15 evaluations addresses a human performance evaluation issue.
Evaluations 16 and 17 describe evaluations that are performed as part of the AP600 V&V.
A more complete description of the V&V tests is provided in Reference 2.
The evaluation issue descriptions provided in subsections 7.1 through 7.5 are intended to provide a description of the testing approach and requirements for addressing each of the major evaluation issues. The actual number and content of tests that are performed depends on the schedule of development of individual HSI resources and the availability of rapid prototypes and simulations to serve as test bed platforms. It is possible to address more than one evaluation issue in a single concept test. Conversely, several concept tests may be performed that address different aspects of a single evaluation issue. Human performance evaluation issues that are not addressed by concept tests are addressed in the final integrated system validation. Additional information on the concept tests that are planned to be performed as part of the AP600 HSI design process is provided in Reference 1.
The test implementation details are documented in individual test implementation plans that are prepared for each concept and V&V test.
7.1 EVALUATIONS FOR DETECTION AND MONITORING

The purpose of evaluations in this subsection is to provide confidence that the design of the HSI supports the operators in maintaining an awareness of plant condition. It includes periodic active and passive monitoring by operators to determine current status and availability of plant systems; periodic monitoring needed to detect malfunctions or trends that are too small to activate an alarm; the more proceduralized monitoring that accompanies shift turnover; and monitoring directed by queries about specific plant parameters. These issues are relevant to individual operators as well as to crews of operators.

The following set of evaluations is designed to test the ability of the HSI to support four categories of detection and monitoring. These categories increase in complexity regarding the level of detail of plant data and the degree of interaction between operators. The first category, addressed in Evaluation 1, tests the ability of the operator to obtain a high-level understanding of plant condition from the wall panel information station and the workstation without excessive manipulation of the HSI to retrieve data.
The second category, which is addressed in Evaluation 2, tests the ability of the operator to use the cues provided by the wall panel information station to obtain more detailed plant data from the workstation. This evaluation tests the coordination of data presented between the wall panel information station and the workstation.
The third category, addressed in Evaluation 3, tests the ability of a single operator to obtain detailed plant data from the workstation based on a request from a supervisor or from a procedure. This evaluation tests the ability to use the navigation aids of the displays presented on the workstation to find detailed data.
The fourth category, addressed in Evaluation 4, tests the ability of operators to share information to maintain crew awareness of plant condition. Three situations are examined: the informal transfer of information to a new person entering the control room, the formal transfer of information to a new crew entering the control room during shift turnover, and the coordination of information among crew members during ongoing detection and monitoring.

Evaluation issues are the following:
Issue 1:
Do the wall panel information station and the workstation summary and overview displays support the operator in maintaining an awareness of plant status and system availability without needing to search actively through the workstation displays?

Issue 2:
Does the wall panel information station support the operator in getting more detail about plant status and system availability by directed search of the workstation functional and physical displays?

Issue 3:
Do the HSI features support efficient navigation to locate specific information?

Issue 4:
Do the HSI features effectively support crew awareness of plant condition?
7.1.1 Evaluation Issue 1: Passive Monitoring of WPIS and Workstation Displays

Do the wall panel information station and the workstation summary/overview displays support the operator in maintaining an awareness of plant status and system availability without needing to search actively through the workstation displays?
Relevant HSI Resources:

WPIS (plant parameter data and alarm data)
Workstation summary displays and display navigation features

Specific Concerns:

Do the WPIS and the workstation summary displays present sufficient information about plant state and system availability?
Do the overview displays effectively call more attention to more important information?
Do the WPIS and workstation summary displays help reduce the likelihood of omitting critical information in plant state assessment?
Approach

The WPIS and workstation summary displays provide plant condition overview information to operators. This overview information is used by the operators to ascertain plant state and current status of operating equipment, to anticipate alarms and disturbances, to identify plant systems or components that have become unavailable for use, and generally to stay "in touch" with the plant conditions. The WPIS display and workstation summary displays must be complete, correct, and well-designed to depict an overview of the plant. This is necessary to allow operators to maintain an awareness of plant status and system availability. Ideally, operators can obtain this overview from a passive monitoring of these displays. That is, the operators should not have to select and browse within a set of workstation displays.
Concept Testing

Hypothesis

The WPIS and workstation summary displays provide operators with an accurate overall understanding of plant state and system availability.
Experimental Manipulations

This evaluation includes reviews of the display content of the WPIS and default workstation displays to determine whether they contain sufficient information to allow operators to assess overall plant condition. Participants are shown static views of these displays and asked to infer the condition of the plant. These reviews occur early in the design process with low-fidelity test beds to refine functional requirements regarding the types of data and data format that must be provided for various plant modes.
Next, the effectiveness of these displays is evaluated empirically. Participants are shown overview displays for a brief period, and then the displays are removed. Participants are then asked to describe current plant state and conditions as thoroughly as possible.
Participants are asked to describe the implications of plant conditions, including potential future problems and parameters that are approaching alarm conditions. Following this "free recall" session, participants are asked to reconstruct, either verbally or with sketches, the arrangement of plant data from the workstation and WPIS displays. Well-designed displays organize plant data in meaningful groups that facilitate operator understanding and recall.
Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs consisting of static drawings and computer-based rapid display prototypes. The evaluation investigates human factors issues related to the ability of operators to extract summary-level information about plant conditions from overview displays. During the display reconstruction task, evaluators analyze which groups of plant data the participants are able to recall easily and which they have difficulty recalling.
This leads to better understanding of effective data display formats. Protocol analysis and debriefings are used to identify characteristics of the design concepts that support operator understanding as well as characteristics that lead to confusion and errors by the subject.
Objective measures collected during the review and reconstruction task may include:
Number of plant conditions correctly identified
Correct identification of implications of plant conditions
Time required to complete the task

These measures provide performance baselines for comparing alternatives and for evaluating the benefits of display modifications.
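As an illustration of how such measures might be scored from a free-recall session, consider the following sketch (Python; the scoring scheme and all names are hypothetical and not part of the test plans):

    # Hypothetical sketch: scoring a free-recall session against the plant
    # conditions actually present in the scenario.
    def score_recall(conditions_present, conditions_recalled,
                     implications_expected, implications_stated, task_time_s):
        """All condition/implication arguments are sets of labels."""
        return {
            "conditions_correct": len(conditions_present & conditions_recalled),
            "conditions_missed": len(conditions_present - conditions_recalled),
            "implications_correct": len(implications_expected & implications_stated),
            "task_time_s": task_time_s,
        }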
Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of overview displays for the WPIS and workstation. Results are used to assess and refine functional requirements for information content. The display reconstruction task is used to identify display arrangements that support operator understanding. The results support development of general guidelines for grouping and highlighting data in the overview displays.
Stage of Development of the HSI

This test is conducted during the functional requirements phase of the WPIS design. The HSI design needs to be at a phase where the information content for the WPIS and/or workstation overview displays is defined.
Test Bed Requirements:
Physical form - Test beds may be drawings or computer-based rapid display prototypes. Displays have formats that are representative of alternative display concepts for the plant HSI.
Information content - Information developed for displays is sufficient to assess plant conditions for a number of normal, abnormal, and emergency states. Displays contain realistic, meaningful values, and not random values. The display parameters and values do not have to be AP600-specific.
Dynamics - Static displays may be used. If rapid display prototypes are used, display animation, such as blinking and flashing, may be used.
Participant Characteristics

Participants may include personnel who are familiar with important pressurized water reactor (PWR) plant operating parameters, including operator trainers, operators, and knowledgeable engineers and designers.
7.1.2 Evaluation Issue 2: Directed Search for Information Within the Workstation Displays Based on WPIS Displays

Does the WPIS support the operator in getting more detail about plant status and system availability by directed search of the workstation displays?
Relevant HSI Resources:
WPIS
Workstation displays and display navigation features
Specific Concerns:

When given a WPIS cue for more detailed monitoring, how accurately and efficiently can operators locate and select this information from the workstation displays?
What types of confusions are caused by the WPIS displays when they indicate the need for active search of the workstation?
What types of navigation errors are made with the workstation displays?
Approach

The WPIS and the workstation displays work together to support the operator in actively obtaining a picture of the plant condition. This active monitoring is driven by cues on the WPIS that indicate a potential problem, such as a plant parameter trending toward an alarm condition. From this cue, the operator navigates through the displays to locate more detailed information about the status of a particular process or parameter. (Operator response to plant alarms is addressed by a set of experiments under evaluations for interpretation and planning, SSAR subsection 18.11.)

The intent of this experiment is to test operators' ability to do this display navigation and selection efficiently with the AP600 display systems. Participants are given a plant state scenario that indicates the need for more detailed monitoring. They are asked to use the workstation to find the functional or physical display(s) that are most useful for more detailed monitoring.
Concept Testing

Hypothesis

The WPIS display and workstation display system support the operator in efficiently locating and selecting the display(s) that contain greater detail about plant parameters required for maintaining awareness of plant state.
Experimental Manipulations

This experiment addresses a majority of plant conditions that require or cue the operator to obtain additional information from the workstation displays. These conditions may include:
Improvement or deterioration of power generation goals as indicated by the WPIS
Improvement or deterioration of plant safety as indicated by the WPIS
Changes in the operating status of plant equipment (such as activation/deactivation of automatically controlled systems) as indicated by the WPIS or other information sources in the MCR

Experimental manipulations address the full range of HSI display devices (with the exception of alarms) that cue operators to seek additional information about plant state.
Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate human factors issues related to directed search through large sets of displays. Qualitative information is gathered through protocol analysis or debriefing discussions with the participants. The intention is to identify characteristics of the design concepts that lead to confusion, errors, and slow or awkward actions by the subject.
Objective dependent measures may include:

How many displays are accessed before selecting the correct display or displays?
Which displays are selected and in what order (the navigation path)?
The degree to which the relevant information is located
The success in returning from search to a designated location
Time required to complete the task

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of display navigation aids that support the operator in information gathering.
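To make the dependent measures listed above concrete, a minimal illustration of how they might be computed from a logged display-selection sequence is sketched below (Python; all names are hypothetical and not part of the test plans).

    # Hypothetical sketch: deriving objective navigation measures from the
    # ordered list of displays a participant selected during one search task.
    def navigation_measures(path, target, home, found_info, elapsed_s):
        """`path` is the ordered list of display names selected."""
        try:
            displays_before_target = path.index(target)  # displays tried before the correct one
        except ValueError:
            displays_before_target = len(path)           # correct display never reached
        return {
            "displays_before_target": displays_before_target,
            "navigation_path": path,
            "information_located": found_info,
            "returned_to_start": bool(path) and path[-1] == home,
            "task_time_s": elapsed_s,
        }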
The qualitative information gathered through protocol analysis or debriefing discussions is analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance of the display navigation task, including display navigation aids that:

Inform the operator via the WPIS that detailed information can be retrieved from the workstation
Direct the operator via the WPIS to relevant categories of information or display space locations in the WPIS
Support the operator in scanning through potential information fields and selecting the required data
Quantitative measures, such as navigation errors and task completion time, are used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process. The HSI design needs to be at a phase where design concepts exist for display formats, navigation aids, and display selection mechanisms. A set of detailed workstation displays is needed to provide meaningful display navigation tasks. Preliminary decisions regarding workstation display system hardware need to be made prior to these evaluations.
Test Bed Requirements:
Physical form - The displays are representative of the style used in the AP600 HSI in terms of appearance, including display format and use of windows. The workstation displays are computer-based.

Information content - Information developed for displays is sufficient to generate a substantial set of displays to establish a fair test. The displays do not have to be AP600-specific. The values presented for plant parameters do not have to be realistic because participants locate but do not interpret the data.

Dynamics - Static displays may be used. In some cases, display animation, such as blinking and flashing, may be used. The workstation display selection mechanisms need to be operational.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the WPIS and the workstation displays.
7.1.3 Evaluation Issue 3: Directed Search for Information Within the Workstation Displays Based on a Request

Do the workstation displays support efficient navigation to locate specific information?
Relevant HSI Resources:
Workstation displays and display navigation features

Specific Concerns:
How accurately and efficiently can operators, when given a specific request, locate and select the correct workstation display?
What types of navigation errors are made with the workstation displays?
Approach

The workstation functional and physical displays are intended to support the operator in searching for specific parameter values and other indicators of plant status that are not part of the default displays. This is the case of directed search of the workstation displays. In many cases, this search is directed by a request from a supervisor (or other technical staff) or by a procedure. From this request, the operator navigates through the displays to determine the status of the requested process or parameter. This directed search must be efficient and not detract from other duties. The intent of this experiment is to test operators' ability to use the workstation display system to efficiently perform the display navigation and selection task.

Participants are given a parameter or process name and asked to use the workstation displays to determine the current value and then return to the display from which they began.
Concept Testing

Hypothesis

The workstation display system supports the operator in efficiently determining the current value of plant parameters and processes not represented in the default displays.
Experimental Manipulations

Manipulations involve the complexity of navigating through the display system. In some cases, the required navigation is brief; in other cases, the most complex navigation is required.
Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate human factors issues related to navigation through large sets of displays. Qualitative information is gathered through protocol analysis or debriefing discussions with the participants. The intention is to identify characteristics of the design concepts that lead to confusion, errors, and slow or awkward actions by the subject.
Objective dependent measures may include:
How many displays are accessed before selecting the correct display or displays?
Which displays are selected and in what order (the navigation path)?
The degree to which the relevant information is located
The success in returning from search to a designated location
Time required to complete the task

Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the design of display navigation aids. These aids are for the workstation displays, including display hierarchy/network structure, menu design, cross-references between displays, audit trails of display navigation paths, display space "landmarks" and orientation aids, content overlap of related displays, and user interface mechanisms for display selection.
The qualitative information gathered through protocol analysis or debriefing discussions is analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Functional requirements are developed to address those design characteristics that have significant effects on the participants' performance of the display navigation task.
The quantitative measures of the participants' performance may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.
Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process. The HSI design is at a phase where design concepts exist for display formats, navigation aids, and display selection mechanisms. Preliminary decisions regarding display system hardware need to be made prior to conducting this evaluation.
Test Bed Requirements:

Physical form - The displays are representative of the style used in the AP600 HSI in terms of appearance, including display format and use of windows. The workstation displays are computer-based.
Information content - Information developed for displays is sufficient to generate a significant number of workstation displays, since participants must navigate through a substantial set of displays to establish a fair test of the display system. The plant parameters presented on the displays do not have to be AP600-specific or realistic.
Dynamics - Static displays may be used. In some cases, display animation, such as blinking and flashing, may be used. The workstation display selection mechanisms need to be operational.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation displays.
7.1.4 Evaluation Issue 4: Maintaining Crew Awareness of Plant Condition
Do the HSI features effectively support crew awareness of plant conditions?
Relevant HSI Resources:
WPIS
Workstation displays
Paper-based/computer-based operating and administrative procedures

Specific Concerns:
Does the HSI:
Support the operating crew in maintaining awareness of plant conditions and their implications?
Support the crew in maintaining awareness of each other's actions, intents, and information needs?
Support effective and efficient shift turnover?
Support new personnel entering the MCR in developing an awareness of plant conditions and their implications?
Approach
This evaluation addresses three situations for crew awareness:
Orientation of a new person entering the MCR
Shift turnover
Ongoing detection and monitoring by the crew

Ongoing detection and monitoring by the crew requires that the crew members maintain awareness of plant conditions and their implications for operational goals. It also requires that crew members be aware of information that is relevant to other operators' responsibilities. The design of the HSI supports each operator in:
Detecting and monitoring parameters relevant to his own task
Identifying parameters that are relevant to other crew members
Checking that those parameters relevant to other crew members are being addressed

In this evaluation, participants carry out the activities associated with a new person entering the MCR, shift turnover, and ongoing detection and monitoring using defined plant scenarios.
Concept Testing

Hypothesis
The HSI supports the crew in maintaining awareness of the plant condition.
Experimental Manipulations
Tests are conducted for normal, abnormal, and emergency plant states using defined scenarios. Plant conditions include:
Normal states, plant maneuver in progress
Normal states, with certain equipment indicated as unavailable
Normal states, with regular changes in actuation and termination of automated systems
Normal states, with parameters trending toward abnormal
Outage state, for tag outs or tests in progress
Abnormal states
Emergency states

Participants walk through these scenarios using a test bed that may consist of static displays and mockups of the workstation consoles and the WPIS. Alternative design concepts may be tested for displays, workstation console, the WPIS, and the relative position of these components in the main control area. Factors that affect crew awareness include:
Display content and format
Operator's view of the WPIS
Operator's view of other operators and their workstations
Operator's ability to communicate data verbally
Operator's ability to communicate data by other means

Arrangements of workstation consoles and the WPIS may affect the operators' ability to view the WPIS, other operators, and their workstations and to communicate verbally. The effect of these arrangements on crew awareness of the plant is evaluated. The effect of alternative display concepts on crew awareness of the plant also may be evaluated.
Dependent Measures and Evaluation Criteria
Qualitative information is gathered using protocol analysis or debriefing discussions with the participants. The intent is to identify characteristics of the design concepts that lead to confusion, errors, and slow or awkward use. In addition, participants assess plant conditions at the end of each trial to evaluate their degree of understanding of plant condition.
Additional measures to collect for the tests of shift turnover may include:
Time to complete shift turnover
Number of required plant parameters addressed
Number and types of omission errors made
Accuracy errors made in reviewing plant parameters

Additional measures to collect for ongoing detection and monitoring may include:
Identification of plant parameters relevant to the operator's responsibilities
Identification of plant parameters relevant to the responsibilities of other crew members
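As a minimal sketch of how the shift turnover measures listed above might be scored against a scenario checklist, consider the following. The data structures, field names, and tolerance are hypothetical, not part of this test plan.

```python
# Hypothetical scoring of shift turnover measures against a scenario checklist.
from dataclasses import dataclass

@dataclass
class TurnoverTrial:
    duration_s: float         # time to complete shift turnover
    parameters_reviewed: set  # parameters addressed during the briefing
    values_reported: dict     # parameter -> value stated by the operator

def score_turnover(trial: TurnoverTrial, required: set,
                   true_values: dict, tolerance: float = 0.02) -> dict:
    omissions = required - trial.parameters_reviewed
    accuracy_errors = [
        p for p, v in trial.values_reported.items()
        if p in true_values
        and abs(v - true_values[p]) > tolerance * max(abs(true_values[p]), 1.0)
    ]
    return {
        "completion_time_s": trial.duration_s,
        "parameters_addressed": len(required & trial.parameters_reviewed),
        "omission_count": len(omissions),
        "omitted_parameters": sorted(omissions),
        "accuracy_error_count": len(accuracy_errors),
    }

# Example: one required parameter omitted, one value misreported.
trial = TurnoverTrial(540.0, {"pzr_level", "rcs_tavg"},
                      {"pzr_level": 58.0, "rcs_tavg": 577.0})
print(score_turnover(trial, required={"pzr_level", "rcs_tavg", "sg_level"},
                     true_values={"pzr_level": 52.0, "rcs_tavg": 577.2}))
```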
Implications of Results
The primary purpose of this evaluation is to contribute to the development of functional requirements for:
The design of the workstation and WPIS displays
The design and layout of the workstation consoles and the WPIS
Functional requirements related to the design of the workstation and WPIS displays address the organization and format of plant data for supporting crew awareness of plant state. These functional requirements address the design of the summary and default displays and the presentation order of the data for shift turnover. Functional requirements related to the design and layout of the workstation consoles and the WPIS address design characteristics that support the communication of information. These functional requirements address operator headphones to facilitate communication, design of workstation consoles for use by two or more people, and mechanisms for coordinating views of the data among operators.
This evaluation also contributes to the development of functional requirements for the operating and administrative procedures and logs that contribute to coordinating information among crew members.
Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process. The HSI design needs to be at a phase where design concepts exist for display format and content, workstation console design, WPIS design, and MCR layout. Preliminary decisions regarding display system hardware need to be made prior to conducting this evaluation.

Test Bed Requirements:
Physical form - The displays are representative of the style used in the plant HSI in terms of appearance, including display format and use of windows. The workstation displays are computer-based.
Information content - Information developed for displays needs to be sufficient to generate a significant number of workstation displays. The plant parameters presented on the displays do not have to be AP600-specific. The values presented for the plant parameters must be realistic.
Dynamics - A dynamic simulation of plant behavior is not required. Static displays may be used. For tests of ongoing plant monitoring, a series of static displays is used. Display animation, such as blinking and flashing, may be used. The workstation display selection mechanisms need to be operational.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation functional and physical displays.
7.2 EVALUATIONS FOR INTERPRETATION AND PLANNING
The purpose of evaluations in this subsection is to provide confidence that the HSI design supports situation assessment and response planning. The focus is on situations that require a response to plant disturbances and significant plant accidents. Responding to plant disturbances covers the stages of cognitive processes. The emphasis of this set of evaluations is on identifying plant disturbances, assessing their implications for plant functions and goals, and selecting, evaluating, and, if necessary, adapting a recovery procedure.
The following set of evaluations is designed to test whether the HSI features, individually and in combination, support operator response to single-fault, multiple-fault, and severe accident events. They test the ability of the HSI to support both rule-based and knowledge-based performance, including supervisory control of automated systems during emergencies. In addition, they address the ability of the HSI to support crew problem solving and coordination during plant disturbances.
Evaluation issues are the following:
Issue 5: Does the alarm system convey information in a way that enhances operator awareness and understanding of plant conditions?
Issue 6: Does the physical and functional organization of plant information on the workstation displays enhance diagnosis of plant condition and the planning/selection of recovery paths?
Issue 7: Does the integration of the alarms, WPIS, workstation and procedures support the operator in responding to single-fault events?
Issue 8: Does the integration of the alarms, WPIS, workstation and procedures support the operator in interpretation and planning during multiple-fault events?
Issue 9: Does the integration of the alarms, WPIS, workstation and procedures support the crew in interpretation and planning during multiple-fault events?
Evaluation Issues and Descriptions m:\\3638w.wpf-1b-050697 May 1997 Rnision 1
Issue 10: Does the integration of the alarms, WPIS, workstation and procedures support the crew in interpretation and planning during severe accidents?
7.2.1 Evaluation Issue 5: Detecting and Understanding Disturbances Using Alarms
Does the alarm system convey information in a way that enhances operator awareness and understanding of plant condition?
Relevant HSI Resources:
Overview alarm system
Alarm support displays

Specific Concerns:
Does the alarm system overview organize alarm messages in a way that facilitates the operator's understanding of the alarm state and its implications for the plant's operational goals?
Does the presentation format, including visual coding techniques intended to establish relative salience (that is, salience coding), enable rapid detection and interpretation of alarm messages?
Does the alarm system prioritization scheme facilitate the operator's understanding of the relative importance of alarm conditions?
Does the alarm system enable operators to identify and interpret the implications of lower priority alarms?
Approach
The assumption of this evaluation is that a well-structured alarm system presents the most important alarm messages and organizes alarms in a way that is meaningful to the operator.
Redundant and less important messages do not appear. Participants are able to perceive the alarm messages in patterns related to types of plant faults, recognize high-priority goal violations, and are aware of the number and general content of lower priority alarms.
A time-step sequence of alarm patterns, corresponding to an evolution of an accident scenario, is presented to a subject. The subject is asked to indicate the alarm messages that are presented and their priority for response (from most to least important). The subject is also asked to describe the implications of the alarms for plant safety and productivity goals and any causal interrelationships among alarms. Next, the subject is asked to identify other alarm conditions that may have existed but were not displayed because they were not shown by the prioritization scheme. Lastly, the display of lower priority alarms is presented, and the subject is asked to describe the implications of these alarms.
Concept Testing

Hypothesis
The alarm system supports identification and prioritization of alarm messages.
Experimental Manipulations
In the concept testing phase, the alarm pattern for each time step is presented for a fixed length of time and then removed. The premise is that if the alarm system is well organized, it results in meaningful alarm patterns. An individual rapidly identifies the alarms present and recalls them once they are removed.
Underlying plant upsets vary in severity from single-malfunction events to multiple-failure events. Alarm messages vary in number and level of abstraction (such as from equipment state to goal state). Upsets include:
Cases where a single fault leads to a cascade of alarms, where the objective is to determine whether the subject can correctly assess the interrelation between the original fault and the consequent disturbances
Multiple-fault cases, where the objective is to determine the ability of the subject to identify, prioritize, and track the implications of multiple, functionally unrelated alarms
Cases where lower priority alarm queues of varying types and number exist
Dependent Measures and Evaluation Criteria
This evaluation uses breadboard designs to investigate the effectiveness of the alarm organization and prioritization scheme in supporting operators in identifying, prioritizing, and assessing the implications of alarms. The intention is to identify characteristics of the design concepts that lead to confusion, missed alarms, misinterpretation of alarm messages, misinterpretation of interrelation among alarms, or incorrect/incomplete understanding of alarm priorities.
This is assessed through objective performance measures, as well as the participant's subjective assessment obtained during debriefing interviews.
Objective dependent measures may include:
The number of alarms correctly identified
The subject's assessment of alarm priorities compared to the priorities assigned during the development of the test scenario
The extent to which the implications of the alarms for present and future plant state are correctly assessed
The extent to which the causal interrelation among alarms is recognized
The ability to infer lower priority alarms
The ability to interpret displayed and lower priority alarms

Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the alarm system. Quantitative measures (such as the number of alarm messages identified and successful identification of alarm implications) may be used to evaluate and refine alarm organization and prioritization concepts. Issues include organization of alarms, number of slots available in parallel for alarm messages, alarm prioritization rules used, and meaning of alarm messages. Participants' comments regarding lower priority alarm messages may be used to assess the alarm prioritization schemes. For example, if operators indicate that the lower priority alarms contain important information that needs to be more available, then the alarm prioritization scheme may be revised. Qualitative results (such as comments regarding salience coding, alarm message format, and lower priority alarm queue format) may also be used to refine display format functional requirements.
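The quantitative tallies described above might be computed for each time step as in the sketch below. The scoring function, field names, and example alarm messages are illustrative assumptions, not scenario content from this program.

```python
# Hypothetical scoring of one time step of the alarm identification task
# against scenario ground truth. Names and structures are illustrative only.
def score_alarm_step(reported: list, truth: list) -> dict:
    """reported/truth: alarm messages ordered from most to least important."""
    reported_set, truth_set = set(reported), set(truth)
    common = [a for a in truth if a in reported_set]
    # Pairwise agreement between the subject's ranking and the scenario ranking.
    pairs = [(common[i], common[j])
             for i in range(len(common)) for j in range(i + 1, len(common))]
    concordant = sum(1 for a, b in pairs if reported.index(a) < reported.index(b))
    return {
        "alarms_identified": len(reported_set & truth_set),
        "alarms_missed": sorted(truth_set - reported_set),
        "false_reports": sorted(reported_set - truth_set),
        "priority_agreement": concordant / len(pairs) if pairs else None,
    }

# Example: the subject misses one alarm and swaps the priority of two others.
print(score_alarm_step(
    reported=["low SG level", "high containment pressure"],
    truth=["high containment pressure", "low SG level", "condensate pump trip"]))
```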
Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where preliminary design concepts, with respect to alarm system organization and alarm priority rules, exist.
Test Bed Requirements:
Physical form - The alarm display test bed accurately reflects the spatial organization of alarm messages, message wording, and alarm prioritization rules. The alarm display may be presented on cathode ray tubes (CRTs), plasma panels, or on paper.
Information content - The information content is essential for evaluating the usefulness of the alarm messages and prioritization schemes. Only a subset of the alarm messages is required for this test to be performed. However, a complete set of alarm messages must exist for each plant upset condition tested. The information content need not be from the AP600, but should be representative of the AP600.
Dynamics - A series of static displays corresponding to discrete time steps during an accident scenario may be used. A dynamic plant simulation is not needed to drive the alarm system.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the alarm system concept.
7.2.2 Evaluation Issue 6: Interpretation and Planning Using Workstation Displays
Does the physical and functional organization of plant information on the workstation displays enhance interpretation of plant condition and the planning/selection of recovery paths?
Relevant HSI Resources:
Physical and functional displays of operator workstation
Specific Concerns:
Do the functional displays support the operator in assessing goal satisfaction?
Do the functional displays support the operator in assessing whether currently active processes are performing correctly?
Do the functional displays support the operator in assessing whether automated systems are performing correctly?
Are the implications of plant state for operational goals conveyed effectively via functional displays?
Do the displays support the operator in identifying the plant condition that caused an alarm?
Do the displays support the operator in understanding interrelations among systems and processes?
Do the displays support operator understanding of interrelations among observed disturbances due to process interactions?
Do the displays support the operator in assessing validity of data?
Is equipment status conveyed effectively via the physical displays?
Do the functional displays support the operator in assessing the availability of alternative processes for achieving a given goal (success path monitoring)?
Do the functional displays support the operator in making choices among alternative processes (success path choice)?
Do the displays support the operator in assessing the effect of the selected recovery path on other plant goals (side effects)?
Are operators able to effectively coordinate physical and functional displays?
Approach:
The purpose of this evaluation is to determine whether operators can efficiently extract the necessary information from the physical and functional displays of the operator workstation.
An individual is asked to interpret, track, and indicate a response strategy for an evolving plant upset by examining workstation displays. In the concept testing phase, the displays may be static representations that correspond to discrete time steps through the evolving upset. A set of probe questions is used to test the ability of participants to extract information from the displays.
Initial plant conditions are described to the subject. The subject is then presented with an alarm message, either verbally or via a static display. The subject is then taken through a series of discrete time steps through the evolving plant upset. At each time step, the subject accesses physical and functional displays to answer a set of questions about plant state and its implications.
Concept Testing

Hypothesis
The functional and physical displays support the operator in interpreting plant state and planning recovery action.
Experimental Manipulations
Underlying plant upsets vary in severity from single-fault events, for which diagnosis and planning are straightforward, to multiple-fault events, for which diagnosis and recovery planning are complex.
Upsets involving complex diagnosis include multiple-failure mode accidents in which important plant indications are disguised or obscured. Upsets involving complex recovery path planning require monitoring of side effects to evaluate undesirable effects on other parts of the plant (conflicting goals). Cases include sensor failures and invalid data, as well as automated system failures that require decisions regarding manual intervention. Alternative display concepts may be tested and compared.
Dependent Measures and Evaluation Criteria
This evaluation uses breadboard designs to investigate the ability of the physical and functional displays to support interpretation and planning. Qualitative information is gathered through protocol analysis of participants' comments during the testing and debriefing interviews following the test. Responses to questions about the plant state and status of operational goals are also evaluated.
Questions to participants may include their perceptions of:
Existing plant disturbances
Causes and interrelations among these disturbances
Consequences of these disturbances for plant operational goals
Alternative processes available to achieve plant operational goals
Status of automated systems and whether manual intervention is required
Competing goals that need to be satisfied (such as side effects)
Appropriate recovery actions that must be taken

The participants' responses are compared to a predefined set of correct responses.
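This comparison to a predefined answer key could be operationalized as in the sketch below. The probe categories and the concepts in the answer key are illustrative placeholders, not actual scenario content from this program.

```python
# Hypothetical comparison of coded probe responses to a predefined answer key.
# Probe categories and required concepts are illustrative placeholders.
ANSWER_KEY = {
    "disturbances": {"sg_tube_rupture", "condenser_vacuum_loss"},
    "consequences": {"heat_sink_threat"},
    "recovery_actions": {"isolate_faulted_sg"},
}

def score_probe_responses(coded_responses: dict) -> dict:
    """Per probe, the fraction of required concepts the participant mentioned."""
    return {
        probe: len(coded_responses.get(probe, set()) & required) / len(required)
        for probe, required in ANSWER_KEY.items()
    }

# Example: both disturbances identified, but the side effect missed entirely.
print(score_probe_responses({
    "disturbances": {"sg_tube_rupture", "condenser_vacuum_loss"},
    "recovery_actions": {"isolate_faulted_sg"},
}))
```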
Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the workstation functional and physical displays. The philosophy behind the functional and physical display system is based on Rasmussen's abstraction hierarchy. A major goal of the display system is to support functional reasoning and knowledge-based decision-making.
The objective is to reduce errors such as fixation effects, in which operators concentrate on one set of symptoms to the exclusion of other more relevant symptoms, and missing side effects, in which operators fail to notice that their chosen recovery strategy may have negative consequences for other plant systems. A primary focus of concept testing is to provide feedback on the effectiveness of the display system in fostering a broad view and supporting knowledge-based reasoning. Specific attention is paid to cases where operators make errors in plant state assessment (such as fixation errors); cases where operators fail to understand the status of automated systems and/or anticipate automatic system action; and cases where they make planning errors (such as missing side effects). These are used to test and revise the functional requirements for the display system.
Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where preliminary designs for content and layout of the functional and physical displays are available.
Test Bed Requirements:
Physical form - The plant displays are generated on a VDU screen using rapid display prototyping software. Some animation is displayed. However, other static media, such as color drawings, could be used. Display representations convey design features such as salience coding and grouping of data.
Information content - Displays for the plant functions and systems that are relevant to the plant upsets used in the study need to be prepared. For the issues being tested, the displays need not be AP600-specific.
Dynamics - Because plant upsets may be presented as a series of discrete time steps, a near full-scale simulation of plant dynamics is not required.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation functional and physical displays.
7.2.3 Evaluation Issue 7: Interpretation and Planning During Single-Fault Events Using Alarms, Workstation, WPIS, and Procedures
Does the integration of the alarms, WPIS, workstation, and procedures support the operator in responding to single-fault events?
Relevant HSI Resources:
WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures
Specific Concerns:
Does the integration of alarms, displays, controls, and procedures support the operator in:
Obtaining detailed information concerning alarm messages
Retrieving the appropriate procedure in response to plant condition
Performing actions indicated in procedures
Assessing goal threats and achievement

Approach
The purpose of this test is to provide confidence that operators can use the alarms, procedures, displays, and controls as intended to respond to straightforward, single-fault plant upsets. Operator response is primarily procedure-based. In Rasmussen's terminology, this corresponds to rule-based behavior. This experiment focuses on the performance of individual operators (such as a reactor operator) and does not focus on the interaction of multiple operators. The study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling emergency events. The subject is presented with an alarm or set of alarms. Then the HSI is used to select the appropriate procedure, select the appropriate plant displays, and execute the procedure.
Concept Testing

Hypothesis
The integrated HSI supports operators in handling single-fault events.
Experimental Manipulations
Test scenarios are based on a variety of plant faults for a variety of plant conditions (normal, abnormal, and emergency).
Dependent Measures and Evaluation Criteria
This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting operator response to single-fault plant upsets. This is assessed through objective performance measures, as well as the participant's subjective assessment obtained during debriefing interviews.
Subject decisions and actions are analyzed using decision tracing and analysis of task completion time. The evaluation focuses on errors of intent and execution for both control/display navigation and plant control.
The participants' performance in responding to the plant upset is compared to an ideal response path defined by experts. Performance may be assessed in terms of:
Successful task completion (such as selection of proper procedures and displays, and the proper execution of procedures)
Task completion time
Errors (such as incorrect intentions and incorrect execution of actions)
Inefficiencies (such as delays or wasted actions, including excessive transitions between displays, induced by HSI design)
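As one possible operationalization of the decision tracing and ideal-path comparison described above, the sketch below aligns a subject's logged action sequence against the expert-defined ideal path. The action names and log format are hypothetical assumptions.

```python
# Hypothetical decision-trace comparison: a subject's logged action sequence
# versus an expert-defined ideal response path. Action names are illustrative.
from difflib import SequenceMatcher

def trace_metrics(subject: list, ideal: list,
                  t_start: float, t_end: float) -> dict:
    """Summarize deviations from the ideal path and task completion time."""
    sm = SequenceMatcher(a=ideal, b=subject)
    matched = sum(block.size for block in sm.get_matching_blocks())
    return {
        "completion_time_s": t_end - t_start,
        "ideal_steps_performed": matched,
        "omitted_steps": len(ideal) - matched,
        "extra_actions": len(subject) - matched,  # wasted or erroneous actions
        "path_similarity": sm.ratio(),
    }

# Example: one extra display transition relative to the ideal path.
ideal = ["ack_alarm", "open_proc_E-0", "select_display_RCS", "close_valve_V101"]
subject = ["ack_alarm", "open_proc_E-0", "select_display_CVS",
           "select_display_RCS", "close_valve_V101"]
print(trace_metrics(subject, ideal, t_start=0.0, t_end=147.5))
```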
Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements to support the integration of the different HSI features. The focus is on the points of interface among the different HSI features and how effectively they work in combination to support rule-based performance.
The results are analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Functional requirements are developed to address those design characteristics that have significant effects on the subject's performance.
Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays, and procedures.
Test Bed Requirements:
Physical form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (for example, mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale; for example, the WPIS and alarm system could be simulated on a VDU.
Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.
Dynamics - Static displays may be used. The displays need not be AP600-specific.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.
Performance Testing

Verification
Design features of the hardware and displays are examined and evaluated against functional requirements using a checklist-type procedure. This evaluation focuses on the functional requirements defined to support integration of HSI features, especially those developed during the concept testing phase of this evaluation. This test is conducted with equipment that emulates production prototype hardware for the workstation. Deviations from the functional requirements are documented and then evaluated.
Validation
This test is a validation that the integrated HSI supports trained operators in responding to single-fault events.
Requirement: Prompt and correct interpretation of alarm messages
Measures:
Operator report of fault and implications
Task completion time

Requirement: Prompt retrieval of detailed information from workstation regarding alarm messages
Measures:
Successful retrieval of required information
Information retrieval time

Requirement: Prompt and correct selection of procedure
Measures:
Successful retrieval of procedure
Procedure selection time

Requirement: Prompt and correct selection of controls and displays
Measures:
Successful retrieval of controls and displays
Control and display selection time

Requirement: Prompt and correct assessment of goal threats and goal achievement
Measures:
Operator assessment of goal threats and goal achievement
Task completion time
For each scenario used in the validation study, a description is created of how the operator must respond to the event. It includes a description of the alarms that are identified, how they are interpreted, what workstation displays are accessed, what conclusions about plant state and implications for operational goals are drawn, what procedures must be accessed, and what control actions are taken. Operator performance is compared against this description. The performance criterion is the correct response. Criteria for task completion time are determined at a later point.
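A minimal sketch of how the requirement/measure pairs above might be encoded so that trial results can be checked programmatically is shown below. The requirement keys and measure names are hypothetical; since criteria for task completion time are determined at a later point, only the correctness measures are checked here.

```python
# Hypothetical encoding of the validation requirement/measure pairs above.
# Requirement keys and measure names are illustrative, not from the test plan.
VALIDATION_CRITERIA = {
    "alarm_interpretation":      ["fault_report_correct", "task_time_s"],
    "information_retrieval":     ["retrieval_successful", "retrieval_time_s"],
    "procedure_selection":       ["procedure_correct", "selection_time_s"],
    "control_display_selection": ["selection_correct", "selection_time_s"],
    "goal_assessment":           ["assessment_correct", "task_time_s"],
}

def unmet_requirements(trial: dict) -> list:
    """List requirements whose correctness measure was not satisfied in a trial.
    Only the first (correctness) measure of each pair is checked; time
    criteria are deferred."""
    return [req for req, measures in VALIDATION_CRITERIA.items()
            if not trial.get(measures[0], False)]

trial = {"fault_report_correct": True, "retrieval_successful": True,
         "procedure_correct": False, "selection_correct": True,
         "assessment_correct": True, "task_time_s": 210.0}
print(unmet_requirements(trial))  # -> ['procedure_selection']
```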
Experimental Manipulations
The types of plant upsets presented are the same as in the concept testing phase.
Stage of Development of the HSI
This test is conducted after the design of the WPIS, alarm system, and workstation hardware, software, and information content has been completed. This test is conducted using a near full-scope simulator consisting of equipment that emulates the HSI hardware.
Test Bed Requirements:
Physical form - The hardware emulates HSI equipment in the relevant respects.
Information content - The information content of the HSI is representative of AP600 interfaces in content and format.
Dynamics - A high-fidelity, near full-scope AP600 MCR simulator is used.
Participant Characteristics (Validation)
Participants are experienced operators who have a basic understanding of the AP600 requirements. They also have familiarity with the operation of the AP600 HSI.
7.2.4 Evaluation Issue 8: Interpretation and Planning During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures
Does the integration of alarms, WPIS, workstation and procedures support the operator in interpretation and planning during multiple-fault events?
Relevant HSI Resources:
WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures
Specific Concerns:
Does the integration of alarms, displays, and procedures support the operator in:
Diagnosing multiple-fault plant conditions
Planning/selecting the most appropriate recovery path when multiple recovery goals need to be considered
Assessing the effect of the selected recovery path on other plant goals (side effects)
Approach The purpose of this test is to provide confidence that operators can use the l path in multiple-fault situations. The test assesses: procedure a arms, System understanding for diagnostically complex cases Success path planrung for cases where the recovery path is complex Operator response is guided by emergency response procedures althoug skills are required for interpreting plant status indications and for evalu owledge-based effectiveness of the current procedure.of automatic control system performance s, and the This test focuses on the performance of individual operators (such as react does not focus on the interaction of multiple operators. The study is perfo or operator) and The subject (s) is presented with a complex alarm e using a crew y events.
appropriate response using procedures and displays of the HSI.ecutes the Evaluation Issues and Descriptions m:\\3638w.wpf:1b450697 May 1997 Rnision 1
Concept Testing

Hypothesis
The integrated HSI supports operators in handling multiple-fault events.
Experimental Manipulations
A variety of multiple-fault plant conditions are included to test:
System understanding in diagnostically complex cases (such as masked symptoms and obscured evidence)
Success path planning in complex cases (such as complex constraints, side effects, and conflicting goals)
Ability to provide supervisory control of automatic control systems, to assess when intervention is required, and to take over effectively

Dependent Measures and Evaluation Criteria
This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting operator response to multiple-fault plant upsets. Particular attention is focused on the ability of the integrated HSI to support knowledge-based reasoning. This is assessed through objective performance measures, think-aloud protocol during task performance, and the participant's subjective assessment obtained during debriefing interviews.
Subject decisions and actions are analyzed using decision tracing and analysis of task completion time.
The subject's performance in responding to the plant upset is compared to an ideal response path defined by experts. Performance may be assessed in terms of:
Successful task completion (such as selection of proper procedures and displays, and proper execution of the procedure)
Task completion time
Errors (such as incorrect intentions and incorrect execution of actions)
Inefficiencies (such as delays or wasted actions, including excessive transitions between displays, induced by HSI design)
Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the integrated HSI to support operator response to multiple-fault events. The focus is on the points of interface among the different HSI features and how effectively they work in combination to support knowledge-based performance.
The results are analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Particular attention is paid to the interpretation of plant state and response planning by the participants. Instances of errors of intention are analyzed in detail to determine HSI characteristics that might have contributed to the error and improvements that could be made to the HSI to reduce this type of error.
This evaluation leads to the following types of recommendations:
Ways of presenting alarm and procedure information that assist operators in determining the appropriate priority of multiple alarm messages
Ways of presenting information on the WPIS, the workstation displays, and in the procedures to reduce the likelihood of operator fixation on a single fault
Ways of presenting information on physical and functional plant displays, the WPIS, and in procedures to assist operators in determining the cause and consequences of plant component malfunctions
Ways of presenting alarm and procedure information that assist operators in determining appropriate goals for plant recovery
Ways of presenting information on physical and functional plant displays, the WPIS, and in procedures to maintain operator awareness of side effects (consequences of a plant recovery path that may violate other safety goals)
Ways of presenting information on physical and functional plant displays, the WPIS, and in procedures to support operator supervisory control of automated systems, and to assist operators in identifying when manual intervention is required
Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays and procedures.
Test Bed Requirements:
Physical form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (such as mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale. For example, the WPIS and alarm system could be simulated on a VDU.
Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.
Dynamics - Static displays may be used. The displays need not be AP600-specific.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.
7.2.5 Evaluation Issue 9: Interpretation and Planning by Crew During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures
Does the integration of alarms, WPIS, workstation and procedures support the crew in interpretation and planning during multiple-fault events?
Relevant HSI Resources:
WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures
Specific Concerns:
Does the integration of alarms, displays, and procedures support the crew in:
Communicating relevant plant state information
Developing and maintaining a shared understanding of plant state
Allocating and coordinating goals and responsibilities
Maintaining awareness of the goals and activities of other crew members
Detecting performance errors of other crew members
Engaging in group problem solving
Maintaining successful role separation (that is, the supervisor is able to maintain a broad view while leaving the detailed monitoring and control activities to control operators)
Approach
The purpose of this test is to provide confidence that the integrated HSI supports crew communication and coordination in responding to multiple-fault situations. This study examines crew performance on the same types of plant upsets described in Evaluation Issue 8. The difference is that the emphasis of this study is on crew interaction and joint problem solving. The study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling emergency events. The subject(s) is presented with a complex alarm condition and selects and executes the appropriate response using procedures and displays of the HSI.
Participants are presented with complex multiple-fault events to test several facets of crew communication and coordination. The primary experimental manipulations are:
The type of event presented
Whether each of the individuals forming a crew is a "participant," or whether only one is the subject and the others are confederates whose actions are determined by scripts
Whether the crews are observed responding to the event uninterrupted, or whether the simulation is frozen at specified points and the participants are asked questions relating to their knowledge of plant state, the activities of the other individuals, and the implications for safety goal achievement

Concept Testing

Hypothesis
The integrated HSI supports crew communication and coordination during multiple-fault events.
Experimental Manipulations
During concept testing, two experimental conditions are used. In one condition, multiple individuals participate as a crew in the study, and their interaction and coordination are observed. This condition is more realistic. The second condition is more controlled. In the second condition, one individual is the subject of the study. One or more additional individuals are used to complete the crew, but these additional individuals are part of the experiment team (that is, experiment confederates). Their actions are determined by a script designed to create critical crew interaction situations with the individual who is serving as the subject. For example, a confederate might fail to take an action or may take an action that is incorrect. In this second condition, the question of interest is whether the subject detects the error, brings it to the attention of the confederate, and attempts to resolve the situation.
At various points in the event, the simulation is frozen, and the participants in the study are asked a series of questions designed to assess:
Awareness of plant state
Awareness of the response plan being followed
Awareness of the activities of the other operator(s)
Awareness of the goals and activities of other crew members
Awareness of the impact of the activities of the other operator(s) on their activity, and vice versa

Dependent Measures and Evaluation Criteria
This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting crew communication and coordination for interpretation and planning.
This is assessed through objective performance measures, think-aloud protocol during task performance, and the participant's subjective assessment obtained during debriefing interviews.
Subject decisions and actions are analyzed using decision tracing and analysis of task completion time. The subject's performance in responding to the plant upset is compared to an ideal response path defined by experts. Particular attention is focused on analysis of crew communication and coordination activities.
Objective dependent measures may include:
Whether relevant plant status information was communicated
Whether participants maintained a shared understanding of plant state
Whether participants successfully allocated and coordinated goals and responsibilities
Whether participants successfully maintained role separation
Whether participants were able to detect performance errors made by other crew members (such as errors intentionally made by confederates)
Whether participants engaged in group problem solving and obtained consensus on interpretations and planned decisions
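A minimal sketch of how these crew measures might be tallied from a coded event log follows, assuming analysts code each communication act into categories. The event schema and category names are hypothetical assumptions, not part of the test plan.

```python
# Hypothetical tally of crew-communication measures from a coded event log.
from collections import Counter

# Each logged event: (time_s, speaker_role, code), where code is one of the
# analyst-assigned categories below.
CODES = {"state_info", "goal_allocation", "error_detection", "consensus"}

def crew_measures(log: list, seeded_errors: int) -> dict:
    counts = Counter(code for _, _, code in log if code in CODES)
    return {
        "state_info_exchanges": counts["state_info"],
        "goal_allocations": counts["goal_allocation"],
        # fraction of confederate-seeded errors the subject caught
        "error_detection_rate": (counts["error_detection"] / seeded_errors
                                 if seeded_errors else None),
        "consensus_events": counts["consensus"],
    }

# Example: the subject catches one of two scripted confederate errors.
log = [(12.0, "RO", "state_info"), (35.5, "SRO", "goal_allocation"),
       (61.2, "RO", "error_detection")]
print(crew_measures(log, seeded_errors=2))
```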
Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the integrated HSI to support crew communication and coordination for interpretation and planning.
The results are analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Particular attention is paid to the ability of the integrated HSI to support development of a shared plant state interpretation, efficient task allocation and coordination, effective role separation, and group problem solving and decision-making. Instances of breakdowns in communication or task coordination are analyzed in detail to determine HSI characteristics that might have contributed to the error and improvements that could be made to the HSI to reduce this type of error.
Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays and procedures.
Test Bed Requirements:
Physical form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (such as mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale. For example, the WPIS and alarm system could be simulated on a VDU.
Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.
Dynamics - A dynamic simulation is required to drive the HSI. The plant simulation and displays need not be AP600-specific.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.
7.2.6 Evaluation Issue 10: Interpretation and Planning by Crew During Severe Accidents Using the Technical Support Center, Alarms, Workstation, WPIS, and Procedures
Does the integration of the alarms, WPIS, workstation and procedures support the crew in interpretation and planning during severe accidents?
Relevant HSI Resources:
WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures
Specific Concerns:
Does the HSI present information in ways that support interpretation of plant state under degraded plant information conditions?
Does the HSI enable the crew to assess data quality and recognize when plant parameter measures are unreliable?
Does the integrated HSI support the formulation of a response strategy in cases where procedural guidance is not available?
Does the HSI encourage efficient use of information found inside and outside the MCR?
Does the HSI support effective communication between crew members and personnel located outside the MCR (such as the technical support center)?
Does the HSI support effective group decision-making?
Approach
The purpose of this test is to provide confidence that the HSI design supports response to severe accident events. Severe accidents place increased cognitive demands in several respects. First, because plant sensors can become unreliable, interpreting plant state is more difficult. Second, conditions may arise beyond the scope of emergency operating procedures (EOPs), requiring a response strategy to be developed. In Rasmussen's terminology, this means that there is greater emphasis on knowledge-based performance during severe accidents. A third complication is a greater need for communication and coordination with a variety of personnel outside the MCR, including personnel in the technical support center and the offsite emergency response facility.
This study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling severe accident events.
The participants are presented with a severe accident scenario. Additional personnel resources (such as the technical support center or the offsite emergency response facility) are added, based on the time-frame and manning assumptions for the AP600. Decision-trace methodology is used to trace the information access activities, communication, goal formulation, response strategy planning, and decision-making activities of the MCR crew and outside support personnel.
Concept Testing

Hypothesis
The integrated HSI supports the coordination of people and information required for plant state interpretation and response strategy planning activities during severe plant accidents.
Experimental Manipulations
A variety of severe accident conditions are included to test:
System understanding in diagnostically complex cases (for example, masked symptoms and obscured evidence due to degraded sensors)
Success path planning in complex cases (such as complex constraints, side effects, and conflicting goals)
Communication and coordination between the MCR staff, the technical support center, and offsite emergency center staff
Dependent Measures and Evaluation Criteria
This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting interpretation and planning during severe accidents. Particular attention is focused on the ability of the integrated HSI to support knowledge-based reasoning. This is assessed through objective performance measures, think-aloud protocol during task performance, and the participant's subjective assessment obtained during debriefing interviews.
Subject decisions and actions are analyzed using decision tracing and analysis of task completion time. The subject's performance in responding to the plant upset is compared to an ideal response path defined by experts.
Objective dependent measures may include:
Successful task completion (such as the selection of proper procedures and displays, and the proper execution of the procedure)
Task completion time
Errors (such as incorrect intentions and incorrect execution of actions)
Inefficiencies (such as delays or wasted actions, including excessive transitions between displays, induced by the HSI design)

Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the integrated HSI to support severe accident management. The study identifies errors and inefficiencies induced by the design of the HSI that may affect emergency response during severe accidents. This evaluation leads to the following types of recommendations:
Ways of presenting information that promote the efficient formation and testing of hypotheses regarding plant state
Ways for verifying that multiple information sources are used effectively
Ways for promoting effective communication between crew members and personnel located outside the MCR
Ways of enhancing group problem solving, including understanding plant conditions, planning, and coordinating actions
Ways of promoting effective group decision-making

Stage of Development of the HSI
This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays and procedures. In addition, preliminary concepts for the technical support center and offsite emergency response center manning, responsibilities and resources need to be available.
Test Bed Requirements:
Physical form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (such as mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale. For example, the WPIS and alarm system could be simulated on a VDU.
Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.
Dynamics - A dynamic plant simulation is required to drive the HSI. The plant simulation and displays need not be AP600-specific.
Participant Characteristics
Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.
7.3 EVALUATIONS FOR CONTROLLING PLANT STATE
The purpose of evaluations in this subsection is to provide confidence that the HSI supports the operator in making changes in the plant state, including:
Control activities that are operator-paced
Control tasks that require coordination of multiple procedures
Control activities that are event-paced
Control activities that require coordination among multiple individuals
Control activities that require consideration of preconditions, side effects, and post-conditions of control actions

The controlling plant state class of evaluation issues includes the following:
Issue 11: Do the HSI features support the operator in performing simple, operator-paced control tasks?
Issue 12: Do the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and post-conditions?
Issue 13: Do the HSI features support the operator in performing control tasks that require multiple procedures?
Issue 14: Do the HSI features support the operator in performing event-paced control tasks?
Issue 15: Do the HSI features support the operator in performing control tasks that require coordination among crew members?
7.3.1 Evaluation Issue 11: Simple Operator-Paced Control Tasks
Do the HSI features support the operator in performing simple, operator-paced control tasks?
Relevant HSI Resources:
Workstation displays and display navigation features
Soft controls
Computer-based and/or paper-based procedures

Specific Concerns:
Are the procedures well-coordinated with the workstation displays to allow efficient location and execution of control actions?
Do the workstation displays support the operator in efficiently locating relevant displays and executing control actions?
Are the soft controls provided in the workstation adequate for supporting operator execution of control actions (including providing adequate feedback on actuation of control action)?
Approach
Control maneuvers (such as taking systems out of operation or switching systems) represent a primary activity operators perform during normal and abnormal operations. The purpose of this test is to verify that the AP600 HSI can support operators in performing straightforward control maneuvers (that is, maneuvers that can be accomplished by a single operator, are operator-paced, and do not involve consideration of preconditions, side effects, or post-conditions). The HSI is evaluated by recording a number of performance measures while participants attempt to perform a series of straightforward control tasks.
Concept Testing

Hypothesis
The workstation displays and the soft controls provided in the workstation support operator execution of control actions. These controls minimize errors, provide appropriate feedback on control actuation, and allow the operator to quickly correct actions identified as erroneous.
Experimental Manipulations
Mechanisms for VDU-based (soft) controls are tested for various control actions (such as initiation/termination, tuning, or mode selection) and under varying task conditions, including the presence of time pressure and task distractions. Also, the coordination of controls with the displays that provide feedback for control actions is tested.
Dependent Measures and Evaluation Criteria
This evaluation uses breadboard designs to investigate human factors issues related to the selection of displays and controls and the execution of soft controls. Qualitative information is gathered through protocol analysis or debriefing of participants. The intention is to identify characteristics of the design concepts that lead to confusion, errors, and slow responses by participants in attempting to make an appropriate control action.
Objective dependent measures may include:
Efficiency of navigation - The number of displays traversed to locate a relevant display is compared to the ideal navigation path specified by design engineers.

Degree of coordination of displays and procedures - The number of shifts in displays required to accomplish a procedure (particularly shifts back and forth between sets of displays - display thrashing) is recorded. This is compared to an optimal standard (such as: each display supports several procedure steps; display shifts follow a logical progression and occur at logical breaks in procedure step grouping; and there is no display thrashing). A scoring sketch for these two measures follows this list.

Number, type, and severity (that is, plant control versus navigation) of execution errors - Ideally, the number and type of execution errors observed with the AP600 HSI are compared with the number and type of execution errors observed for identical control tasks in a typical MCR (under identical conditions).
Ability to correct control actions not executed correctly.
Anthropometric problems with the soft controls (if any). Any problems locating, activating, or obtaining feedback on soft control activation are recorded.
Time required to complete task.
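The first two measures above can be computed mechanically from a logged sequence of display selections. The following Python sketch is illustrative only: the log format, the ideal path supplied by design engineers, and the operational definition of thrashing (an immediate return to the display viewed two selections earlier) are assumptions for illustration, not requirements of the plan.

```python
# Illustrative scoring of two Issue 11 measures from a logged display sequence.
# Assumptions: each trial log is a list of display IDs in the order visited;
# design engineers supply the ideal navigation path for the same task.

def navigation_efficiency(visited, ideal_path):
    """Ratio of ideal path length to displays actually traversed (1.0 = ideal)."""
    return len(ideal_path) / len(visited) if visited else 0.0

def thrashing_count(visited):
    """Count immediate returns to the display viewed two selections earlier,
    a simple operational definition of back-and-forth display thrashing."""
    return sum(1 for i in range(2, len(visited)) if visited[i] == visited[i - 2])

# Hypothetical trial: the participant backtracks between two system displays.
log = ["overview", "rcs", "cvs", "rcs", "cvs", "cvs-makeup"]
ideal = ["overview", "cvs", "cvs-makeup"]

print(f"efficiency = {navigation_efficiency(log, ideal):.2f}")  # 0.50
print(f"thrashing  = {thrashing_count(log)}")                   # 2
```

An efficiency near 1.0 with a thrashing count of zero would indicate navigation close to the design engineers' ideal path; low efficiency or high thrashing flags trials for protocol analysis.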
Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the physical and functional displays and the soft controls embedded within them. Specifically, the results may guide the design of the display navigation scheme, control representation, screen interaction devices, and control selection and actuation.
The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, errors, and slowness. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance on the control task.

The quantitative measures may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.
Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process. The HSI design needs to be at a phase where design concepts exist for display formats, navigation aids, screen interaction devices, and soft control mechanisms, before preliminary decisions regarding display system hardware are made.

Test Bed Requirements:
Physical form - Computer-based displays are used to simulate the workstation displays. This includes soft controls that have high physical fidelity.

Information content - A set of workstation displays is developed to cover the set of control tasks tested, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - Static displays may be adequate (that is, no changes in parameter values over time). The only dynamics required are those needed to provide feedback of soft control actuation (that is, that the control was actuated and that the desired change in plant state took place).
Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants have familiarity with the operation of the workstation displays.

7.3.2 Evaluation Issue 12: Conditional Operator-Paced Control Tasks

Do the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and post-conditions?

Relevant HSI Resources:
Workstation displays
WPIS
Computer-based and/or paper-based procedures
Soft controls

Specific Concerns:
Do the WPIS and workstation displays support the operator in identifying violations in preconditions, side effects of control actions, and post-conditions that result from control actions?
Are the plant specifications (computer-based or paper-based) well coordinated with the workstation displays to allow efficient identification of violations of preconditions, side effects, and post-conditions that result from control actions?
Approach

Control maneuvers can become complicated when operators need to consider action preconditions, subtle side effects, and necessary post-conditions (such as an action that results in a violation of plant specifications). The purpose of this test is to verify that the AP600 HSI can support operators in performing control maneuvers where preconditions, side effects, and/or post-conditions need to be considered. The tagging out of plant components is an example of this type of situation.
The proposed approach to test these issues is to record a number of performance measures while participants attempt to perform a series of control maneuvers that require consideration of preconditions, side effects and post-conditions.
Concept Testing

Hypothesis

The WPIS and workstation displays support the operator in identifying violations in preconditions, side effects of control actions, and post-conditions that result from control actions. Further, the plant Technical Specifications (TS) (computer-based or paper-based) are well coordinated with the workstation displays to support this same function.

Experimental Manipulations
This evaluation uses breadboard designs of the workstation displays and control displays to investigate human factors issues related to completing control tasks that can lead to violations. Participants are given control tasks such as tagging out a component.
Participants are not told whether a violation of plant specifications (or other violation) is likely. They have the WPIS displays, which provide them with an overview of plant status, the workstation displays, and the plant specifications.

The evaluation manipulates the complexity of the control task in terms of the number of displays that must be accessed. The control tasks presented to participants include the following situations:

No violations occur from completion of the control task
Preconditions for performing the task are violated
Negative side effects for other ongoing activities occur through completion of the control task
Completion of the control task results in plant systems being unavailable and/or operational goals being violated
Completion of the task results in plant TS violations
Dependent Measures and Evaluation Criteria

The subject is asked to perform each of a series of control tasks. Objective dependent measures may include whether:

There are any violations of preconditions for performing the task
Performing the task results in negative side effects for other ongoing activities
Completion of the task results in plant systems being unavailable and/or operational goals being violated
Completion of the task results in plant TS violations

The primary dependent measures are the subject's ability to detect and, if possible, take action to avoid violations of any kind. Subject comments and reactions to the breadboard HSI features are also solicited during the debriefing following completion of the tasks. The intention is to identify characteristics of the design concepts that led to confusion, difficulty, errors, or slowness. Other measures are the following (a record-keeping sketch follows this list):

Task completion time
Display navigation paths
Number of inappropriate displays selected
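A simple per-trial record is one way to organize these measures for analysis. The following sketch is a hypothetical illustration; the field names, condition labels, and summary statistic are assumptions, not specified by the plan.

```python
# Illustrative record-keeping for Issue 12 trials.
from dataclasses import dataclass

# Condition labels mirror the five situations listed above.
CONDITIONS = ("none", "precondition", "side_effect", "unavailability", "ts_violation")

@dataclass
class Trial:
    condition: str            # which violation (if any) was embedded in the task
    detected: bool            # subject noticed the violation
    avoided: bool             # subject took action to avoid it
    completion_time_s: float
    displays_visited: int
    inappropriate_displays: int

def detection_rate(trials, condition):
    """Fraction of trials with the given embedded condition that were detected."""
    relevant = [t for t in trials if t.condition == condition]
    return sum(t.detected for t in relevant) / len(relevant) if relevant else None

trials = [
    Trial("ts_violation", True, True, 312.0, 9, 1),
    Trial("ts_violation", False, False, 287.0, 12, 3),
    Trial("precondition", True, False, 198.0, 6, 0),
]
for c in CONDITIONS:
    rate = detection_rate(trials, c)
    if rate is not None:
        print(c, rate)   # ts_violation 0.5, precondition 1.0
```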
Implications of Results
The purpose of this evaluation is to contribute to the development of functional requirements for the design of the WPIS, the physical and functional displays, the controls and control displays and the plant TS. Specifically, the results may guide development of functional requirements to enable operators to maintain a broad perspective of the plant and the interrelations between their actions and other ongoing activities.
The qualitative information gathered from concept testing is analyzed to identify design
features that lead participants to miss preconditions, side effects, or post-conditions associated with their control task. Functional requirements are developed to address those design characteristics that have significant effects on the participants' performance on the control task.
The quantitative measures are used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.
Stage of Development of the HSI This test is conducted during the functional requirements phase of the HSI design process.
The HSI design needs to be at a phase where design concepts exist for the WPIS displays, workstation displays, navigation aids, plant TS, rudimentary procedures, and soft control mechanisms.
Test Bed Requirements:
Physical form - Computer-based displays are used to simulate the WPIS displays.
Computer-based displays are also used to simulate the process data displays.
Information content - A set of workstation displays is developed to cover the set of control tasks tested, as well as to provide a set of realistic displays to test the adequacy of navigation.
Dynamics - Static displays may be adequate (that is, no changes in parameter values over time). The only dynamic characteristics required are changes in the WPIS and workstation displays needed to indicate consequences of control actions on plant state.
These can be simulated by replacing one static display with another. For example, after a control action, a new static WPIS display is presented that provides revised indications of plant state. The display need not be AP600-specific.
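The display-swap technique just described amounts to a lookup from a (current display, control action) pair to a replacement static display. A minimal sketch, assuming hypothetical display and action names:

```python
# Minimal sketch of the static-display-swap technique described above: a
# lookup table maps (current display, control action) to the replacement
# static display that shows the revised plant state. Names are hypothetical.

SWAP_TABLE = {
    ("wpis-overview", "tag-out-pump-1a"): "wpis-overview-pump-1a-tagged",
    ("rhr-system", "close-valve-v23"): "rhr-system-v23-closed",
}

def next_display(current, action):
    """Return the static display to present after a control action."""
    return SWAP_TABLE.get((current, action), current)  # unchanged if no entry

print(next_display("wpis-overview", "tag-out-pump-1a"))
# wpis-overview-pump-1a-tagged
```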
Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants must have familiarity with the operation of the workstation displays.
7.3.3 Evaluation Issue 13: Control Using Multiple, Simultaneous Procedures
Do the HSI features support the operator in performing control tasks that require multiple procedures?
Relevant HSI Resources:
Workstation displays
WPIS
Computer-based and/or paper-based procedures

Specific Concerns:
Does the design of the procedure display interfaces prevent operators from getting lost in nested procedures?

Does the design of display devices support the concurrent use of multiple independent (not nested) procedures?
Does the coordination of procedure displays with physical and functional displays allow effective use of plant displays during concurrent use of multiple procedures?
Approach

Operators may be required to access more than one procedure at a time. There are typically two general cases of multiple-procedure use. One case is the use of independent, concurrent procedures. For example, an operator may be involved in both an operating procedure and a maintenance procedure. The second case is the use of nested procedures, where the first procedure refers the operator to a second procedure. In this case, the operator typically completes the second procedure and then returns to complete the first procedure. Given that these cases exist, the design of the procedure display interface must allow operators to accomplish several feats during control tasks:
Perform steps of procedures with minimal disruptions due to manipulations and adjustments of other procedures and corresponding plant displays

Maintain an awareness of the dependent nature of nested procedures
Maintain an awareness of the procedures that are "open" and which steps remain to be completed
Therefore, success in these tasks requires considerable coordination between displays and procedures. The intent of this evaluation is to test an operator's ability to use multiple procedures in the two general cases described. Participants are given each multiple-procedure case and asked to work through the procedures. Performance is evaluated in terms of the subject's ability to maintain an awareness of the status of these procedures and their implications to plant state.
Concept Testing

Hypothesis

The procedures and workstation displays support the operator's ability to access and use multiple (independent and nested) procedures and to maintain an awareness of the status of these procedures and their implications to plant state.
Experimental Manipulations

This evaluation uses breadboard designs to investigate alternative procedure display selection concepts. Important characteristics may include bookmarks and other navigation aids; logical branch displays to identify open procedures and steps; control logic for accessing corresponding plant displays; and windowing features for depicting open procedures.
The approach for testing the case of multiple independent procedures is to have participants work through a plant procedure(s) (such as a normal operating procedure and maintenance or surveillance procedures) that is unrelated to the procedure currently accessed. For example, while the operator is executing a procedure for a change in plant power, he is asked to perform a surveillance procedure that requires a plant system to be realigned. The subject is allowed some flexibility for determining which procedure steps to perform first (that is, determining when to switch from one procedure to the other). As in the previous case, the subject is asked, at various predefined points, to identify which procedures are open and the implications of these open procedures to plant state. Performance is evaluated in terms of correct access of procedures and correct response to questions regarding open procedures. Inefficiencies in procedure and display use (such as excessive search and manipulation of displays) are noted.
The approach for testing the nested procedure case involves having participants work through a scenario that requires the use of nested procedures. The subject accesses the required procedure and a corresponding set of plant displays. Using the procedure and plant displays, the subject explains how each procedure step would be executed. The subject accesses other nested procedures as required. At various predefined points in the scenario, the subject is asked to identify which procedures are open and the implications of these open procedures to plant state. The participant's actions and comments are recorded on videotape. Performance is evaluated in terms of correct access of nested procedures and correct response to questions regarding open procedures. Inefficiencies in procedure and display use (such as excessive search and manipulation of displays) are noted.
After each test condition, the subject is debriefed to identify features of the procedure display system that made the task difficult. In the case of multiple independent procedures, the subject is questioned to determine whether constraints of the procedure display system affected the order in which the independent procedures were performed.
Dependent Measures and Evaluation Criteria

Qualitative results include assessment of difficulties, delays, and inefficiencies induced by the design or performance of the procedure display system. Qualitative results also include participants' comments concerning characteristics of the display perceived to make the task difficult, and comments regarding how the design of the display system affected the way independent procedures were executed.
Objective dependent measures may include:
Number of errors made in procedure and display navigation and selection
Correct identification of "in-progress" procedures and steps awaiting completion
Subject responses are compared to a predefined set of most correct responses. Performance is analyzed using protocol analysis to identify errors of intent (such as the subject identified the wrong procedure but retrieved it correctly) and errors of execution (such as the subject identified the correct procedure but made a mistake while retrieving it). The subject's stated intentions and actual behavior are recorded and analyzed to identify the causes of these errors.
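The intent/execution distinction above can be expressed as a small decision rule over three observations per selection: the procedure the subject stated they intended to open, the procedure actually retrieved, and the predefined correct procedure. The sketch below is one plausible reading of that scheme, not a prescribed scoring method; procedure names are hypothetical.

```python
# Illustrative classifier for the intent/execution distinction described above.
# Inputs: the procedure the subject said they intended to open (from the
# verbal protocol), the procedure actually retrieved, and the predefined
# correct procedure for that point in the scenario.

def classify_error(intended, retrieved, correct):
    if intended == correct and retrieved == correct:
        return "correct"
    if intended != correct and retrieved == intended:
        return "error of intent"      # wrong procedure chosen, retrieved as chosen
    if intended == correct and retrieved != correct:
        return "error of execution"   # right procedure chosen, retrieval slip
    return "compound error"           # wrong choice plus a retrieval slip

print(classify_error("OP-5", "OP-5", "OP-5"))  # correct
print(classify_error("OP-3", "OP-3", "OP-5"))  # error of intent
print(classify_error("OP-5", "OP-3", "OP-5"))  # error of execution
```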
Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the procedures, particularly the computer-based procedures, and the physical and functional displays. The performance measures may be used to assess the relative merits of different procedure display concepts. The results may be used to identify procedure display concepts that lead to fewer navigation errors and a better awareness/understanding of the status of active procedure steps.
The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, difficulties, errors, and slowness. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance on the control task.
The quantitative measures are used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.
Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process. Procedures need to be defined for the specific scenarios addressed. Operator actions for accessing additional procedures need to be defined. Plant displays corresponding to the procedure steps need to be established.
Test Bed Requirements:
Physical form - Procedures, procedure selection, and retrieval aids included in the test bed are of high fidelity. Wording of text, labels, and titles is accurate. Character sizes and salience coding are well thought out and well executed. Menus are structured well. Procedures, procedure selection information, and plant displays are displayed in the same mode as in the MCR, either on VDUs or plasma panels.
Information Content - Scenarios for these tests are well defined and credible.
Procedures for these scenarios are complete, including full, properly formatted text for procedure steps. Clear criteria for referring the subject to other procedures are very important. Data values for plant parameters are credible, but need not be actual values derived from a simulation.
Dynamics - Computerized plant procedures must have as many operational properties as possible (such as scrolling, bookmarking, or electronic links between procedures). Individual static displays of the plant may be used. The human-machine interfaces for retrieving and displaying procedures and plant displays need to be operational. The displays need not be AP600-specific.
Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants have familiarity with the operation of the workstation displays.
7.3.4 Evaluation Issue 14: Event-Paced Control Tasks

Do the HSI features support the operator in performing event-paced control tasks?

Relevant HSI Resources:

Workstation displays
Soft controls
WPIS
Computer-based and/or paper-based procedures

Specific Concerns:
Do the workstation displays support the operator in locating relevant displays and executing control actions at a rate that allows the operator to keep pace with the event?
Are the computer-based (or paper-based) control procedures well coordinated with the workstation displays to allow the operator to keep pace with the event?
In cases where event dynamics are slow (that is, long response time to reach desired state for a step in the procedure), do the displays and computer-based control procedures allow the operator to go on to perform the next steps (that is, suspend a step) and return to complete the pending step at the appropriate time?
Do the soft controls support the operator in execution of control actions and evaluation of feedback in pace with the event?
Approach
Many operator activities performed during normal and abnormal operation involve dynamic control tasks (such as plant startup, plant mode changes, and load changes) where the operator activity keeps pace with plant dynamics. Keeping pace often refers to the operator's ability to execute actions quickly enough to stay ahead of the dynamics of a plant state progression. However, problems can also arise when the event moves slowly. In this case, the operator may have to suspend one procedural step (such as wait for a parameter value to reach a threshold) but not other steps subsequent to it. The operator may continue to complete steps subsequent to the suspended step. This decision creates a need for the operator to remember to complete a step that is no longer cued by the procedures. Therefore, when the condition is satisfied (such as when the value reaches the threshold), the operator returns to the step and executes it.
The purpose of this test is to verify that the AP600 HSI can support operators in performing event-paced control tasks. The proposed approach to test these issues is to have participants attempt to perform a series of event-paced control tasks and to record a number of performance measures.
Concept Testing

Hypothesis

The workstation displays, the computer-based procedures, and soft controls support the operator in efficiently locating relevant displays and executing control actions in pace with process dynamics. In cases where event dynamics are slow, the operator is able to perform subsequent steps (that is, suspend a step) and return to complete the pending step at the appropriate time.
Experimental Manipulations

The proposed approach to test these issues is to have participants attempt to perform a series of event-paced control tasks. Two types of control tasks are used. The first type of control task involves rapid process dynamics (such as manual feedwater control during startup), requiring skilled operator response to keep up with process dynamics. The second type of control task involves slow dynamics (processes with a long response time to reach the desired state) so that operators are required to initiate processes, go on to other activities, and then return to confirm that the process goal states are achieved and complete pending steps.
Dependent Measures and Evaluation Criteria

Qualitative results include assessment of difficulties, delays, and inefficiencies induced by the design or performance of the control and display systems. Qualitative results also include comments from the participants concerning characteristics of the displays or controls that were perceived to make the task difficult, and comments regarding how the design of the display system affected the way independent procedures were executed.
Objective dependent measures may include:
Ability of the operator to keep up with process dynamics (Are process goal states reached in adequate time? Are process limits exceeded?). Evaluation criteria require determination of time limits within which control tasks must be completed, and control boundaries that are not to be exceeded (such as trip setpoints); a scoring sketch follows this list
Number and types of errors of execution (Are steps omitted? Does the operator fail to return to complete pending steps? Are boundary limits exceeded?)
Whether operators detect and correct errors of execution when they occur
Anthropometric problems with the soft controls (if any)
Degree of coordination of displays and procedures. The number of shifts in displays required to accomplish a procedure (particularly shifts back and forth between sets of displays - display thrashing) is recorded
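The time-limit and boundary criteria above lend themselves to an automated check against a recorded parameter trace. The following sketch assumes a hypothetical trace format, goal band, time limit, and trip setpoint; actual values would come from the task analysis for each scenario.

```python
# Illustrative check of the event-paced criteria above against a recorded
# parameter trace. The trace is a list of (time_s, value) samples for the
# controlled parameter; goal band, time limit, and setpoint are hypothetical.

def score_event_paced(trace, goal_band, time_limit_s, trip_setpoint):
    lo, hi = goal_band
    limit_exceeded = any(v >= trip_setpoint for _, v in trace)
    reached = [t for t, v in trace if lo <= v <= hi]
    goal_met = bool(reached) and reached[0] <= time_limit_s
    return {"goal_met_in_time": goal_met, "limit_exceeded": limit_exceeded}

# Hypothetical steam generator level trace during manual feedwater control
# (percent of span, sampled once per minute).
trace = [(0, 30.0), (60, 38.0), (120, 46.0), (180, 51.0), (240, 50.5)]
print(score_event_paced(trace, goal_band=(48.0, 52.0),
                        time_limit_s=200, trip_setpoint=80.0))
# {'goal_met_in_time': True, 'limit_exceeded': False}
```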
Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the controls and the physical and functional displays. The performance measures may be used to assess the relative merits of different control and display concepts, including the following:
Information on the ease of locating relevant displays, controls, and procedures in pace with process dynamics, and improvements that may be needed

Information on whether the displays and procedures are well coordinated, allowing operators to keep pace with process dynamics, and improvements that may be needed
Effect of the HSI on errors of execution and the ability to detect and correct errors of execution, and improvements that may be needed
Adequacy of anthropometric characteristics of soft controls (such as size, shape, or saliency) for supporting event-paced control activities, and improvements that may be needed
The qualitative information gathered from concept testing is analyzed to identify design
features that lead to confusion, difficulties, errors, and slowness. Functional requirements are developed to address those design characteristics that are found to have significant effects on the participants' performance on the control task.
The quantitative measures may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.
Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.
The following components need to be available:
Workstation displays for the event-paced control tasks selected; a dynamic prototype of the workstation that includes a set of realistic displays and navigation mechanisms to test the adequacy of navigation

A dynamic prototype of the workstation that includes soft controls that have high physical form fidelity (that is, size, shape, saliency, actuation feedback characteristics)

Procedures (either paper-based or computer-based)
A high-fidelity plant simulation that models the plant dynamics for the event-paced control tasks selected is necessary to drive the displays. The displays need not be AP600-specific.
Test Bed Requirements:
Physical form - Computer-based dynamic displays are used to simulate the workstation display. This includes soft controls that have high physical fidelity.
Information content - A set of workstation displays is developed to cover the set of control tasks tested, as well as to provide a set of realistic displays to test the adequacy of navigation.
Dynamics - A dynamic plant simulation is required to drive the workstation displays to simulate the plant dynamics involved in the event-paced control tasks selected. The displays need not be AP600-specific.
Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation displays.
7.3.5 Evaluation Issue 15: Control Tasks Requiring Crew Coordination

Do the HSI features support the operator in performing control tasks that require coordination among crew members?
Relevant HSI Resources:
Workstation displays
Soft controls
WPIS
Computer-based and/or paper-based procedures

Specific Concerns:
Do the WPIS displays, workstation displays, and procedures allow an operator to:
Maintain awareness of control actions of other personnel working in parallel
Provide a common frame of reference and promote common mental models of plant state

Anticipate the consequences of the control actions of other personnel working in parallel

Coordinate activities with other personnel working in parallel

Develop control strategies that take into account the control actions of other personnel working in parallel (that is, build on the activities of the other personnel rather than work at cross-purposes)

Monitor performance of others to verify actions and identify and correct errors

Allocate tasks among crew members as plant conditions change to improve efficiency of performance, provide assistance, and/or avoid reaching undesirable plant states

Approach

The purpose of this test is to verify that the AP600 HSI can support operators in performing control tasks that require coordination among multiple individuals. Coordination supports increased error checking, more efficient use of human resources, and better response to changing plant conditions.
Participants are placed in control tasks requiring coordination to test several facets of crew coordination. The primary experimental manipulations are:
Type of control task presented, including simultaneous but related and simultaneous but unrelated procedures
Whether each of the operators forming a crew is a participant, or whether only one is the subject and the others are confederates whose actions are determined by scripts

Whether the crews are observed performing the control task uninterrupted, or whether the simulation is frozen at specified points and the operators are asked questions relating to their knowledge of the activities of the other operators and their consequences
Concept Testing

Hypothesis

The WPIS displays, workstation displays, and the computer-based procedures support crew coordination in control tasks by enabling crew members to:
l they provide a common frame of reference and promote common mental models of l
plant state)
Anticipate the consequences of the control actions of other personnel working in
{
j parallel i
l Coordinate activities with other personnel working in parallel.
j l.
Develop control strategies that take into account the control actions of other personnel I
l working in parallel (that is, build on the activities of the other personnel rather than work at cross-purposes)
Allocate tasks flexibly (dynamically) among themselves in order to improve efficiency -
of performance and/or avoid reaching undesirable plant states.
Experimental Manipulations This evaluation uses concept designs of the workstation, the WPIS, and the procedures to test the human factors issues related to crew coordination in control tasks. The approach is to have participants attempt to perform a series of control maneuvers that require coordination among multiple individuals, and examine whether operators are able to maintain awareness of the activities of others and coordinate with them effectively. This requires setting up a f
dynamic test bed that enables multiple operators to interact with plant processes l
simultaneously. Two experimental conditions are proposed. In one, multiple operators participate as a crew in the study and their interaction and coordination is observed. This f
condition is more realistic. In the second condition, which is more controlled, one operator is l
Evaluation issues and Descriptions May 1997
(
m:\\3638w.wpf.It>450697 Revision 1 I
l l
l o
^
i 7-56 the subject of the study. One or more operators are used to complete the crew required to perform the control task, but these additional individuals are part of the experimental team
{
(that is, experiment confederates). Their actions are determined by a script designed to create critical crew interaction situations with the operator who is seving as the subject. For l
example, the confederate might take an action that has undesirable effects on the process l
controlled by the subject.~ The subject detects this error, brings it to the other operator's attention, and attempts to resolve the situation.
I At various points in the control maneuver, the simulation is frozen, and the operators participating in the study are asked a series of questions designed to assess:
- Awareness of the activities of the other operator (s) e.
Awareness of the activities of the other operator(s)

Awareness of the impact of the activities of the other operator(s) on their activity, and vice versa

Ability to anticipate the future consequences of their activities on the activities of the other operator(s), and vice versa

Ability to formulate coordination strategies that build on the activities of the other operators rather than working at cross-purposes

Dependent Measures and Evaluation Criteria

Qualitative results include assessment of difficulties, delays, and inefficiencies induced by the design or performance of the control and display systems. Qualitative results also include comments from the participants concerning characteristics of the displays or controls that they felt made the task difficult, and comments regarding any effect that the design of the display system had on the way they executed independent procedures.
\\
The adequacy of performance on the control task (that is, the ability and time to I
- achieve goal states, and to limit violations)
)
l I
'i The degree of crew communication and coordination (that is, using decision-trace
~ methodology) l 4
j The responses to questions relating to operator awareness of activities of the other l
operators and their consequences i
1 4
Evaluation issues and Descriptions May 1997 nu\\3638w.wpf:1t>050697 Revision 1 5
7-57 1
Criteria may be set for adequate response (such as maximum time to achieve goal state,.or no f
limit violations). Another criterion is the subject's success in detecting and describing the actions of crew members that conflict with his actions.
i Implications of Results The purpose of this evaluation is to contribute to the development of functional requirements for the design of the controls, the WPIS, and the physical and functional displays. The performance measures may be used to assess the relative merits of different control and display concepts.
The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, difficulties, errors, and slowness. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance on the control task.
The quantitative measures may be used as baselines to compare altemative designs and i
evaluate performance benefits achieved through subsequent refinements of design concepts.
I I
Stage of Development of the HSI This test is conducted during the functional requirements phase of the HSI design process.
The following components need to be available: multiple workstations to support multiple operators working in parallel, workstation displays for the control tasks selected, and procedures (either paper-based or computer-based).
A high-fidelity plant simulation that models the plant dynamics for the event-paced control I
tasks selected is necessary to drive the displays.
- Test Bed Requirements:
Physical form - Multiple workstations are high-fidelity with respect to physical form and layout in the MCR. A WPIS is high-fidelity in physical form (such as size, location relative to workstations, and display characteristics).
Information content - A set of WPIS and workstation displays is developed to cover the set of control tasks tested.
Dynamics - A dynamic plant simulation is required to drive the WPIS and workstation displays to simulate the plant dynamics involved in the operator coordination control tasks selected. The plant simulation need not be AP600-specific.
Evaluation Issues and Descriptions May 1997 m:\\3638w.wpf-1b450697 Revision 1
Participant Characteristics

Participants can include designers, engineers, operator trainers, and operators. Participants have familiarity with the operation of the workstation displays.
7.4 EVALUATIONS FOR CONFORMANCE TO HUMAN FACTORS ENGINEERING DESIGN GUIDELINES
The purpose of these evaluations is to provide confidence that the HSI features satisfy relevant HFE design guidelines and operator requirements for comfort and ease of use.
The HFE evaluation issue follows:
Issue 16: Do the HSI components satisfy relevant HFE criteria for acceptability?
7.4.1 Evaluation Issue 16: Conformance to HFE Guidelines

Do the HSI components satisfy relevant HFE criteria for acceptability?
Approach

This evaluation corresponds to the HFE design verification task described in Reference 2.
The objective of the HFE design verification is to verify that all aspects of the HSI (for example, controls, displays, procedures, and data processing) are consistent with accepted HFE guidelines, standards, and principles. Reference 2 provides a description of the activities performed as part of the HFE design verification task.
7.5 EVALUATIONS FOR VALIDATION OF INTEGRATED HSI

This evaluation corresponds to the integrated system validation described in Reference 2.
The purpose of this evaluation is to provide confidence that the integration of the HSI features satisfies the design mission of supporting safe and efficient operation of the AP600 in a variety of plant conditions.
The evaluation issue for validation of the integrated man-machine interface system (M-MIS) follows:
Issue 17: Does the integration of HSI components satisfy requirements for validation of MCR functions and integrated performance capabilities?
7.5.1 Evaluation Issue 17: Validation of Integrated HSI

Does the integration of HSI components satisfy requirements for validation of MCR functions and integrated performance capabilities?
Relevant HSI Resources:
Plant Information System (including functional and physical displays of plant processes)
Alarm system
Computerized procedure system and/or paper-based procedures
Dedicated and soft (computer-based) controls
WPIS
Qualified data processing system

Specific Concerns:
Does the integration of HSI components in the MCR support operator performance requirements for normal, abnormal, and emergency conditions?
Approach

This evaluation corresponds to the integrated system validation task described in Reference 2.
The objective of integrated system validation is to ensure that the functions and tasks allocated to the plant personnel can be accomplished with the HSI design implementation.
Explicitly included in the integrated system validation is validation of the AP600 EOPs.
An implementation plan is developed that specifies a methodology for integrated system validation prior to test performance.
8 REFERENCES

1. Kerch, S., Roth, E. M., & Mumaw, R. J., Man-in-the-Loop Test Plan Description, WCAP-14396, Rev. 2, 1996.

2. Roth, E. M. & Kerch, S., Programmatic Level Description of the AP600 Human Factors Verification and Validation Plan, WCAP-14401, 1995.

3. O'Hara, J. M. and Wachtel, J., 1991, "Advanced Control Room Evaluation: General Approach and Rationale" in "Proceedings of the Human Factors Society 35th Annual Meeting," pp. 1243-1247, (Santa Monica, CA, Human Factors Society).

4. Woods, D. D. and Roth, E. M., 1988, "Cognitive Systems Engineering" in Helander, M. (ed.), "Handbook of Human-Computer Interaction," pp. 3-43, (New York, NY, Elsevier Science Publishing Co., Inc.).

5. Helander, M. (ed.), 1988, "Handbook of Human-Computer Interaction," (New York, NY, Elsevier Science Publishing Co., Inc.).

6. Woods, D. D., 1992, "Process Tracing Methods for the Study of Cognition Outside of the Experimental Psychology Laboratory" in "Decision Making in Action: Models and Methods," Klein, G., Calderwood, R., and Orasanu, J. (eds.), (Norwood, NJ, Ablex).

7. Woods, D. D. and Sarter, N. B., 1992, "Evaluating the Impact of New Technology on Human-Machine Cooperation," Wise, J. A., Hopkins, V. D., & Stager, P. (eds.), pp. 133-158, (Berlin, Germany, Springer-Verlag).

8. Stubler, W. F., Roth, E. M., & Mumaw, R., 1992, "Integrating Verification and Validation with the Design of Complex Man-Machine Systems," Wise, J. A., Hopkins, V. D., & Stager, P. (eds.), pp. 159-172, (Berlin, Germany, Springer-Verlag).

9. Meister, D., 1987, "Systems Design, Development and Testing" in "Handbook of Human Factors," Salvendy, G. (ed.), pp. 17-42 and pp. 1271-1297, (New York, NY, John Wiley & Sons).

10. Rasmussen, J., 1986, "Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering," (New York, North-Holland).

11. Woods, D. D., Wise, J. A., and Hanes, L. F., 1982, "Evaluation of Safety Parameter Display Concepts," NP-2239, (Palo Alto, CA, Electric Power Research Institute).

12. Woods, D. D. and Roth, E. M., 1982 (unpublished study, Proprietary), "Operator Performance in Simulated Process Control Emergencies," (Pittsburgh, PA, Westinghouse Science and Technology Center).

13. Woods, D. D. and Roth, E. M., 1986, "The Role of Cognitive Modeling in Nuclear Power Plant Personnel Activities," NUREG-CR-4532, Volume 1, (Washington, DC, U.S. Nuclear Regulatory Commission).

14. Roth, E. M., Mumaw, R. J., and Lewis, P. M., 1994, "An Empirical Investigation of Operator Performance in Cognitively Demanding Simulated Emergencies," NUREG/CR-6208, (Washington, DC, U.S. Nuclear Regulatory Commission).

15. Roth, E. M. and Woods, D. D., 1988, "Aiding Human Performance: I. Cognitive Analysis" in "Le Travail Humain," Volume 51 (1), pp. 39-64.

16. Woods, D. D. and Hollnagel, E., 1987, "Mapping Cognitive Demands in Complex Problem Solving Worlds" in "International Journal of Man-Machine Studies," Volume 26, pp. 257-275, (New York, Academic Press).

17. Woods, D. D., 1988, "Coping with Complexity: The Psychology of Human Behavior in Complex Systems" in Goodstein, L. P., Andersen, H. B., and Olsen, S. E. (eds.), "Tasks, Errors, and Mental Models," (London, UK, Taylor & Francis).

18. Mumaw, R. J., Swatzler, D., Roth, E. M., and Thomas, W. A., 1994, "Cognitive Skill Training for Decision Making," NUREG/CR-6126, (Washington, DC, U.S. Nuclear Regulatory Commission).

19. Vicente, K. J., Burns, C. M., Mumaw, R. J., and Roth, E. M., 1996, "How Do Operators Monitor a Nuclear Power Plant? A Field Study," Proceedings of the 1996 American Nuclear Society International Topical Meeting on Nuclear Plant Instrumentation, Control and Human-Machine Interface Technologies (NPIC&HMIT '96), pp. 1127-1134, (La Grange Park, Illinois, American Nuclear Society).

20. Woods, D. D., Roth, E. M., and Pople, H. E., Jr., 1987, "Cognitive Environment Simulation: An Artificial Intelligence System for Human Performance Assessment," NUREG-CR-4862, (Washington, DC, U.S. Nuclear Regulatory Commission).

21. Reason, J. T., 1990, "Human Error," (Cambridge, UK, Cambridge University Press).

22. Taylor, J. H., O'Hara, J., Luckas, W. J., Parry, G. W., Cooper, S. E., Roth, E. M., Bley, D. C., and Wreathall, J., 1996, "Frame-of-Reference Manual for ATHEANA: A Technique for Human Error Analysis," Tech. Rep. L-2415/96-1, (Upton, New York, Brookhaven National Laboratory).

23. Stubler, W. F., Roth, E. M., and Mumaw, R. J., 1991, "Evaluation Issues for Computer-Based Control Rooms" in "Proceedings of the Human Factors Society 35th Annual Meeting," pp. 383-387, (Santa Monica, CA, Human Factors Society).

24. Mumaw, R. J., Roth, E. M., and Stubler, W. F., 1991, "An Analytic Technique for Framing Control Room Evaluation Issues" in "Proceedings of the IEEE International Conference on Systems, Man and Cybernetics," pp. 1355-1360, (Charlottesville, VA, The Institute of Electrical and Electronic Engineers).

25. Woods, D. D., 1982, "Application of Safety Parameter Display Evaluation Project to Design of Westinghouse SPDS," Appendix E to "Emergency Response Facilities Design and V&V Process," WCAP-10170 (Non-Proprietary), submitted to the U.S. Nuclear Regulatory Commission in support of their review of the Westinghouse Generic Safety Parameter Display System, (Pittsburgh, PA, Westinghouse Electric Corp.).

26. Roth, E. M., "Description of the Operator Decision-Making Model and Function Based Task Analysis Methodology," WCAP-14695, 1996.

27. Endsley, M. R., 1995, "Towards a Theory of Situation Awareness in Dynamic Systems," Human Factors, 37, 65-84.

28. Hollnagel, E., Pedersen, O. M., and Rasmussen, J., 1981, "Notes on Human Performance Analysis," Tech. Rep. RISO-M-2285, (Roskilde, Denmark, RISO National Laboratory).

29. Roth, E. M., Bennett, K. B., and Woods, D. D., 1987, "Human Interaction with an Intelligent Machine" in "International Journal of Man-Machine Studies," Volume 27, pp. 479-525.

30. Norman, D. A., 1981, "Categorization of Action Slips" in "Psychological Review," Volume 88, pp. 1-15.