ML20117H034

Methodology & Results of Defining Evaluation Issues for AP600 Human System Interface Design Test Program
Person / Time
Site: 05200003
Issue date: 08/27/1996
From: Roth E, Vijuk R
WESTINGHOUSE ELECTRIC COMPANY, DIV OF CBS CORP.
To:
Shared Package
ML20117G995 List:
References
WCAP-14701, NUDOCS 9609090093
Download: ML20117H034 (108)


Text


Westinghouse Non-Proprietary Class 3

WCAP-14701

Methodology and Results of Defining Evaluation Issues for the AP600
Human System Interface Design Test Program

Westinghouse Energy Systems

AP600 DOCUMENT COVER SHEET Form 58202G(5/94) AP600 CENTRAL FILE USE ONLY:

RFS#: RFS ITEM #:
AP600 DOCUMENT NO.: OCS-GEH-031
REVISION NO.: 0
ASSIGNED TO: Robin Nydes
Page 1
ALTERNATE DOCUMENT NUMBER: WCAP-14701
WORK BREAKDOWN #: 3.3.2.4.15
DESIGN AGENT ORGANIZATION:
PROJECT: AP600
TITLE: Methodology and Results of Defining Evaluation Issues for the AP600 Human System Interface Design Test Program
ATTACHMENT: DCP #/REV. INCORPORATED IN THIS DOCUMENT REVISION:

CALCULATION / ANALYSIS REFERENCE:
ELECTRONIC FILENAME / ELECTRONIC FILE FORMAT / ELECTRONIC FILE DESCRIPTION
(C) WESTINGHOUSE ELECTRIC CORPORATION 1996

O WESTINGHOUSE PROPRIETARY CLASS 2: This document contains information proprietary to Westinghouse Electric Corporation; it is submitted in confidence and is to be used solely for the purpose for which it is furnished and returned upon request. This document and such information is not to be reproduced, transmitted, disclosed or used otherwise in whole or in part without prior written authorization of Westinghouse Electric Corporation, Energy Systems Business Unit, subject to the legends contained hereof.

O WESTINGHOUSE PROPRIETARY CLASS 2C: This document is the property of and contains proprietary information owned by Westinghouse Electric Corporation and/or its subcontractors and suppliers. It is transmitted to you in confidence and trust, and you agree to treat this document in strict accordance with the terms and conditions of the agreement under which it was provided to you.

@ WESTINGHOUSE CLASS 3 (NON-PROPRIETARY)

COMPLETE 1 IF WORK PERFORMED UNDER DESIGN CERTIFICATION OR COMPLETE 2 IF WORK PERFORMED UNDER FOAKE.

1 DOE DESIGN CERTIFICATION PROGRAM - GOVERNMENT LIMITED RIGHTS STATEMENT [See page 2]
Copyright statement: A license is reserved to the U.S. Government under contract DE-AC03-90SF18495.

O DOE CONTRACT DELIVERABLES (DELIVERED DATA): Subject to specified exceptions, disclosure of this data is restricted until September 30, 1995 or Design Certification under DE-AC03-90SF18495, whichever is later.

EPRI CONFIDENTIAL: NOTICE: 1 2 3 4 5  CATEGORY: A B C D E F

2 ARC FOAKE PROGRAM - ARC LIMITED RIGHTS STATEMENT [See page 2]
Copyright statement: A license is reserved to the U.S. Government under contract DE-FC02-NE34267 and subcontract ARC-93-3-SC-001.

O ARC CONTRACT DELIVERABLES (CONTRACT DATA): Subject to specified exceptions, disclosure of this data is restricted under ARC Subcontract ARC-93-3-SC-001.

ORIGINATOR SIGNATURE/DATE: Emilie Roth
AP600 RESPONSIBLE MANAGER SIGNATURE/APPROVAL DATE*: Robert Vijuk

* Approval of the responsible manager signifies that the document is complete, all required reviews are complete, the electronic file is attached, and the document is released for use.

AP600 DOCUMENT COVER SHEET Page 2 Form 58202G(5/94)

LIMITED RIGHTS STATEMENTS

DOE GOVERNMENT LIMITED RIGHTS STATEMENT
(A) These data are submitted with limited rights under government contract No. DE-AC03-90SF18495. These data may be reproduced and used by the government with the express limitation that they will not, without written permission of the contractor, be used for purposes of manufacture nor disclosed outside the government; except that the government may disclose these data outside the government for the following purposes, if any, provided that the government makes such disclosure subject to prohibition against further use and disclosure:
(I) This 'Proprietary Data' may be disclosed for evaluation purposes under the restrictions above.
(II) The 'Proprietary Data' may be disclosed to the Electric Power Research Institute (EPRI), electric utility representatives and their direct consultants, excluding direct commercial competitors, and the DOE National Laboratories under the prohibitions and restrictions above.
(B) This notice shall be marked on any reproduction of these data, in whole or in part.

ARC LIMITED RIGHTS STATEMENT:
This proprietary data, furnished under Subcontract Number ARC-93-3-SC-001 with ARC, may be duplicated and used by the government and ARC, subject to the limitations of Article H-17.F. of that subcontract, with the express limitations that the proprietary data may not be disclosed outside the government or ARC, or ARC's Class 1 & 3 members or EPRI, or be used for purposes of manufacture without prior permission of the Subcontractor, except that further disclosure or use may be made solely for the following purposes: This proprietary data may be disclosed to other than commercial competitors of the Subcontractor for evaluation purposes of this subcontract under the restriction that the proprietary data be retained in confidence and not be further disclosed, and subject to the terms of a non-disclosure agreement between the Subcontractor and that organization, excluding DOE and its contractors.

DEFINITIONS

CONTRACT / DELIVERED DATA - Consists of documents (e.g., specifications, drawings, reports) which are generated under the DOE or ARC contracts and which contain no background proprietary data.

EPRI CONFIDENTIALITY / OBLIGATION NOTICES

NOTICE 1: The data in this document is subject to no confidentiality obligations.

NOTICE 2: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is forwarded to recipient under an obligation of Confidence and Trust for limited purposes only. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited except as agreed to in advance by the Electric Power Research Institute (EPRI) and Westinghouse Electric Corporation. Recipient of this data has a duty to inquire of EPRI and/or Westinghouse as to the uses of the information contained herein that are permitted.

NOTICE 3: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is forwarded to recipient under an obligation of Confidence and Trust for use only in evaluation tasks specifically authorized by the Electric Power Research Institute (EPRI). Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited except as agreed to in advance by EPRI and Westinghouse Electric Corporation. Recipient of this data has a duty to inquire of EPRI and/or Westinghouse as to the uses of the information contained herein that are permitted. This document and any copies or excerpts thereof that may have been generated are to be returned to Westinghouse, directly or through EPRI, when requested to do so.

NOTICE 4: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is being revealed in confidence and trust only to employees of EPRI and to certain contractors of EPRI for limited evaluation tasks authorized by EPRI. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited. This document and any copies or excerpts thereof that may have been generated are to be returned to Westinghouse, directly or through EPRI, when requested to do so.

NOTICE 5: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. Access to this data is given in Confidence and Trust only at Westinghouse facilities for limited evaluation tasks assigned by EPRI. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited. Neither this document nor any excerpts therefrom are to be removed from Westinghouse facilities.

EPRI CONFIDENTIALITY / OBLIGATION CATEGORIES

CATEGORY 'A' - (See Delivered Data) Consists of CONTRACTOR Foreground Data that is contained in an issued report.
CATEGORY 'B' - (See Delivered Data) Consists of CONTRACTOR Foreground Data that is not contained in an issued report, except for computer programs.
CATEGORY 'C' - Consists of CONTRACTOR Background Data except for computer programs.
CATEGORY 'D' - Consists of computer programs developed in the course of performing the Work.
CATEGORY 'E' - Consists of computer programs developed prior to the Effective Date or after the Effective Date but outside the scope of the Work.
CATEGORY 'F' - Consists of administrative plans and administrative reports.

WESTINGHOUSE NON-PROPRIETARY CLASS 3

WCAP-14701

Methodology and Results of Defining Evaluation Issues for the AP600 Human System Interface Design Test Program

Emilie Roth
Science & Technology Center
August 1996

Approved: Robert Vijuk, Project Manager, AP600 Design Certification

Westinghouse Electric Corporation
P.O. Box 355
Pittsburgh, PA 15230-4355

(C) 1996 Westinghouse Electric Corporation
All Rights Reserved

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
LIST OF ACRONYMS AND ABBREVIATIONS

1 INTRODUCTION
2 GOALS OF HSI TEST PROGRAM
3 EVALUATION SCOPE
4 FRAMEWORK FOR DEVELOPING THE HSI DESIGN TEST PLAN
  4.1 INTEGRATION OF THE HSI DESIGN TEST PROGRAM IN THE HSI DESIGN PROCESS
  4.2 MODEL OF TEST BED FIDELITY
  4.3 TESTING DIFFERENT LEVELS OF STAFF INTERACTION
5 PHASE 1: ISSUE DEFINITION
  5.1 HUMAN PERFORMANCE MODEL
    5.1.1 Detection and Monitoring / Situation Awareness
    5.1.2 Interpretation and Planning
    5.1.3 Control
    5.1.4 Feedback
  5.2 MAJOR CLASSES OF OPERATOR ACTIVITIES
    5.2.1 Detection and Monitoring / Situation Awareness
    5.2.2 Interpretation and Planning
    5.2.3 Control Plant State
  5.3 MAPPING OF HSI RESOURCES TO OPERATOR ACTIVITIES (MODEL OF SUPPORT)
    5.3.1 Detection and Monitoring / Situation Awareness
    5.3.2 Interpretation and Planning
    5.3.3 Controlling Plant State
  5.4 HUMAN PERFORMANCE EVALUATION ISSUES
6 PHASE 2: TEST DEVELOPMENT
  6.1 TESTABLE HYPOTHESES AND PERFORMANCE REQUIREMENTS
  6.2 EVALUATION APPROACH
  6.3 EVALUATION REQUIREMENTS
  6.4 EVALUATION DESCRIPTIONS
  6.5 DATA ANALYSIS AND FEEDBACK TO THE DESIGN PROCESS
7 EVALUATION ISSUES AND DESCRIPTIONS
  7.1 EVALUATIONS FOR DETECTION AND MONITORING
    7.1.1 Evaluation Issue 1: Passive Monitoring of WPIS and Workstation Displays
    7.1.2 Evaluation Issue 2: Directed Search for Information Within the Workstation Displays Based on WPIS Displays
    7.1.3 Evaluation Issue 3: Directed Search for Information Within the Workstation Displays Based on a Request
    7.1.4 Evaluation Issue 4: Maintaining Crew Awareness of Plant Condition
  7.2 EVALUATIONS FOR INTERPRETATION AND PLANNING
    7.2.1 Evaluation Issue 5: Detecting and Understanding Disturbances Using Alarms
    7.2.2 Evaluation Issue 6: Interpretation and Planning Using Workstation Displays
    7.2.3 Evaluation Issue 7: Interpretation and Planning During Single-Fault Events Using Alarms, Workstation, WPIS, and Procedures
    7.2.4 Evaluation Issue 8: Interpretation and Planning During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures
    7.2.5 Evaluation Issue 9: Interpretation and Planning by Crew During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures
    7.2.6 Evaluation Issue 10: Interpretation and Planning by Crew During Severe Accidents Using the Technical Support Center, Alarms, Workstation, WPIS, and Procedures
  7.3 EVALUATIONS FOR CONTROLLING PLANT STATE
    7.3.1 Evaluation Issue 11: Simple Operator-Paced Control Tasks
    7.3.2 Evaluation Issue 12: Conditional Operator-Paced Control Tasks
    7.3.3 Evaluation Issue 13: Control Using Multiple, Simultaneous Procedures
    7.3.4 Evaluation Issue 14: Event-Paced Control Tasks
    7.3.5 Evaluation Issue 15: Control Tasks Requiring Crew Coordination
  7.4 EVALUATIONS FOR CONFORMANCE TO HUMAN FACTORS ENGINEERING DESIGN GUIDELINES
    7.4.1 Evaluation Issue 16: Conformance to HFE Guidelines
  7.5 EVALUATIONS FOR VALIDATION OF INTEGRATED HSI
    7.5.1 Evaluation Issue 17: Validation of Integrated HSI
8 REFERENCES

LIST OF TABLES

Table 1  Major Evaluation Issues

LIST OF FIGURES

Figure 1  AP600 Concept Testing and Verification and Validation Activities
Figure 2  Methodology for Developing Verification and Validation Plan
Figure 3  Integration of the Verification and Validation Test Program in the HSI Design Process
Figure 4  Testbed Fidelity Dimensions and Evaluation Issues
Figure 5  Mapping of HSI Resources to Operator Decision-Making Model
Figure 6  Data Collection and Analysis Process

LIST OF ACRONYMS AND ABBREVIATIONS

CPS    Computerized Procedure System
CRT    Cathode Ray Tube
EOP    Emergency Operating Procedure
HFE    Human Factors Engineering
HSI    Human System Interface
MCR    Main Control Room
M-MIS  Man-Machine Interface System
PWR    Pressurized Water Reactor
QDPS   Qualified Data Processing System
TS     Technical Specifications
V&V    Verification & Validation
VDU    Visual Display Unit
WPIS   Wall Panel Information System

1 INTRODUCTION

This document describes the methodology, analysis, and results of the process used to define the AP600 human system interface (HSI) design test program.

The AP600 HSI design test program consists of two parts:

• Concept tests to be performed as part of the HSI design process
• Verification and Validation (V&V) tests to be performed at the completion of the AP600 design process

The AP600 HSI design test program is integrated with the HSI design. Figure 1 summarizes the major elements of the AP600 HSI design test program and their relation to the HSI design process.

As described in the AP600 Standard Safety Analysis Report (SSAR) subsection 18.8.1, concept testing is performed as part of the HSI design process. During the functional design phase, the core conceptual design for an HSI resource and corresponding functional requirements are developed. An integral part of this phase is rapid prototyping and testing of design concepts. Concept testing during the functional design phase serves two purposes:

• It provides input to aid designers in resolving design issues that have no well-established human factors guidance.
• It establishes the adequacy of the design concept and functional requirements that are produced in the functional design stage. A main objective of concept testing is to establish that the conceptual design is adequate to support operator performance in the range of situations that are anticipated.

This document provides an overview of the human performance evaluation issues addressed as part of the AP600 concept testing. The process by which these issues are selected and the general approach to testing these issues is also described. Reference 1 describes the concept tests planned as part of the AP600 HSI design process.


[Figure 1: AP600 Concept Testing and Verification and Validation Activities. The figure relates the HSI design process (HSI functional design, design concepts, and the integrated HSI) to the test program: concept tests (man-in-the-loop tests of concrete examples of the functional design using rapid prototypes and part-task simulations, to resolve design issues and establish the adequacy of the design concept and functional requirements) and the HFE verification and validation activities (HSI task support verification, HFE design verification, integrated system validation on a training simulator, issue resolution verification, and final plant HFE verification, including simulator and plant acceptance tests).]
The AP600 human factors engineering (HFE) V&V program is performed at the completion of the HSI design when hardware prototypes of the HSI resources are available. The AP600 HFE V&V includes:

• HSI task support verification
• HFE design verification
• Integrated system validation
• Issue resolution verification
• Final plant HFE design verification

This document provides a description of the human performance issues addressed as part of the AP600 HSI test program, and the general evaluation approach used. A programmatic-level description of the activities conducted as part of the V&V program is presented in Reference 2.


2 GOALS OF HSI TEST PROGRAM

The goals of the HSI test program are to:

• Systematically evaluate human factors concerns that affect plant performance
• Conduct these evaluations so that test results may be incorporated into the design of the HSI

The HSI test program plan:

• Describes the process for conducting human factors tests and incorporating results into the HSI design process
• Identifies human performance issues related to the HSI that are important to safe and efficient operation of the plant
• Describes the general test approach for the concept and V&V test phases of the HSI design process

3 EVALUATION SCOPE

The following items are addressed in the human factors test program for HSI:

• Plant Facilities - Facilities included in the scope of the AP600 test program are the main control room (MCR), the technical support center, the remote shutdown facility, and local control stations.
• Plant Staff Activities - Activities required to operate under normal, abnormal, and emergency conditions are included.

The test program for the AP600 HSI focuses on the following HSI resources:

• Plant information system (including functional and physical displays of plant processes)
• Alarm system
• Computerized procedure system (CPS)
• Dedicated and soft (computer-based) controls
• Wall panel information system (WPIS)
• Qualified data processing system (QDPS)

In the test descriptions that follow, displays that appear on the control room workstation visual display units (VDUs) (for example, plant information system displays) are referred to as workstation displays.

The passive safety features of the AP600 affect operator decision-making by affecting the type of information available to the operator, the alternatives the operators have for responding to plant upsets, and the time requirements for operator response (Ref. 3). These features pose requirements that are of the same type as for traditional plants, but may need some modification for the design of the AP600 HSI to support operator decisions. Consideration of these plant design features is important in the development of scenarios for those evaluations that require simulation of plant dynamics. In addition to evaluations of the control room, this plan addresses the application of human factors design guidelines to the HSI of the remote shutdown room and other operations and control centers.


Control room personnel addressed by this evaluation include the occupant of the supervisor's console (shift foreman, senior reactor operator license), the reactor operator(s) located at the control workstations, and any additional staff specified as part of control room staffing assumptions for a particular plant mode or condition.


4 FRAMEWORK FOR DEVELOPING THE HSI DESIGN TEST PLAN

A two-phase process is used to define tests, as illustrated in Figure 2. Phase 1 is issue definition. The purpose of this phase is to integrate major operator activities with the HSI resources that support the operator activities in order to establish a set of human performance evaluation issues. Phase 2 addresses test development. The purpose of this phase is to develop testing plans for each of the evaluation issues identified in Phase 1. A detailed description of the Phase 1 and 2 processes is presented in Sections 5.0 and 6.0, respectively.

Phase 2 involves development of test implementation plans. Section 7.0 provides a general description of the test approach to address each evaluation issue. The test implementation details are documented in individual test implementation plans that are prepared for each concept and V&V test near the point when the test is scheduled to be performed.

4.1 INTEGRATION OF THE HSI DESIGN TEST PROGRAM IN THE HSI DESIGN PROCESS

Figure 3 depicts the relationship of the HSI human factors test program to the HSI design process. The figure organizes information in six horizontal rows. The second row displays an abbreviated version of the HSI design process. The design process starts with a mission statement that defines the purpose and goals of the HSI resource. This leads to the establishment of human performance requirements and design bases, which include operator cognitive activities and behaviors that are supported by the HSI resource to achieve the mission statement. Functional requirements are developed to guide the development of the HSI design to support the human performance requirements. These functional requirements are implemented in the design of HSI components. HSI components are built as prototypes. First, they exist as individual and partially integrated HSI prototypes. Finally, they exist as an integrated HSI hardware prototype after the components have been assembled and interfaced. The design process includes intermediate steps that are not depicted in the figure.

The row above the HSI design process represents those points in which human factors and cognitive psychology theory are applied to the design process. The human performance requirements are derived from a review of operating experience, a model of human performance (subsection 5.1), an analysis of major classes of operator activities (subsection 5.2), an analysis of the impact of changes in technology on performance, and a model of support (subsection 5.3). Next, human factors and cognitive science theory are applied to the development of functional requirements. Inputs include HSI design principles and guidelines that are obtained from the human factors and cognitive science disciplines.

Examples include results of research on human-computer interaction and human-centered design requirements found in References 4 and 5, and internal design guidelines such as the display design guidelines.

[Figure 2: Methodology for Developing Verification and Validation Plan. Phase 1, Issue Definition: identify major classes of operator activities (employing a human performance model), map HSI resources to operator activities (model of support), and define evaluation issues as links between HSI resources and operator activities. Phase 2, Test Development: develop each evaluation issue into testable hypotheses and performance requirements, define the evaluation approach and evaluation requirements for concept testing and performance (verification and validation) testing, and document evaluation descriptions.]

[Figure 3: Integration of the Verification and Validation Test Program in the HSI Design Process. The figure relates human factors/cognitive psychology theory (operating experience review, model of human performance, impact of changes in technology, and major classes of operator activities with their cognitive demands and sources of error) to the HSI design process (mission statement; human performance requirements and design basis; functional requirements; individual and integrated HSI hardware prototypes) and, for each stage, identifies the evaluation tests (research to guide HSI development, concept tests to refine HSI concepts, verification of functional requirements and human factors guidelines, and validation of human performance requirements in man-in-the-loop tests), the evaluation test beds (HSI breadboard designs through full-scope simulators), the evaluation type (concept testing versus V&V testing), and the evaluation criteria (presence of significant human performance problems, benefits of alternative concepts, and conformance to functional requirements and human factors guidelines). Notes: Test beds for concept testing range in fidelity from static drawings to rapid display prototypes to a high-fidelity simulator for a similar plant; representation of plant dynamics ranges from scripted scenarios to dynamic plant simulations. Performance testing is performed using production prototype components in a full-scale, full-size plant simulator; factory acceptance testing is also performed at this time.]

Functional requirements development is guided by Man-in-the-Loop studies designed to test HSI design concepts. The design and analysis of these Man-in-the-Loop concept tests are guided by human factors and cognitive science methods described in References 3, 6, 7, and 8.

Human factors and cognitive science theory are applied to the design and analysis of V&V tests, which use integrated HSI hardware prototypes in a near full-scope, high-fidelity simulator (see Ref. 9).

The third row depicts the types of evaluation tests in the evaluation program, including concept and V&V testing. Concept testing clarifies human performance issues and refines functional requirements for the HSI. V&V testing verifies that the functional requirements have been satisfied in the design and provides evidence that the design satisfies the human performance goals.

Concept testing is conducted during the functional requirements and design phase of the HSI design process. Concept testing involves specific Man-in-the-Loop tests of functional designs of HSI resources. These design examples are referred to as "HSI Breadboard Designs" in Figure 3. Breadboard designs include design concepts represented through static drawings, rapid display prototypes, part-task simulations, mockups, or actual HSIs being developed for plants that have similar properties to the proposed AP600 HSI functional design.

The purposes of concept tests are to:

• Explore and clarify human performance issues associated with specific design concepts
• Contribute to the development of functional requirements for the HSI
• Contribute to the development of criteria for human performance requirements of the HSI

Qualitative information gathered through debriefing, discussions, or other means is analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the subjects. These performance problems are evaluated in terms of their effect on the successful completion of a task. Functional requirements can then be developed to address those design characteristics that have significant effects on system performance. The intention is to understand the mental burdens that specific design features impose on the users with respect to perception, attention, and memory and to develop functional requirements to systematically address these demands. Quantitative measures of performance are judiciously used as baselines to compare alternative designs and to evaluate performance benefits achieved through refinements of design concepts.

At the completion of the HSI design, V&V tests are conducted. The V&V tests include:

• HSI Task Support Verification - Verifies that the HSI design provides the necessary alarms, displays, and controls to support plant personnel tasks
• HFE Design Verification - Verifies that the HSI design conforms to HFE principles, guidelines, and standards
• Integrated System Validation - Validates that the HSI design can be effectively operated by personnel
• Issue Resolution Verification - Verifies that the HSI design resolves the HFE issues identified in the tracking system
• Final Plant HFE Design Verification - Verifies that the final "as-built" product conforms to the verified and validated design

A detailed description of the V&V tests that are part of the AP600 HSI design test program is presented in Reference 2.

4.2 MODEL OF TEST BED FIDELITY

Many types of evaluation do not require near full-scope, high-fidelity representations of the control room. A set of principles guides the specification of HSI test bed fidelity when defining evaluation requirements. A model of test bed fidelity provides this guidance. Figure 4 provides a graphic summary of this model.

The following is a discussion of terminology:

Prototype characteristic consists of two parts: realism and completeness.

• Realism refers to the degree to which the prototype resembles (looks and behaves like) the actual system.
• Completeness refers to the degree to which the prototype represents the total system.

[Figure 4: Testbed Fidelity Dimensions and Evaluation Issues. The fidelity dimensions are realism and completeness; realism comprises physical form, information content, and dynamics. Physical form ranges from abstract (e.g., drawing) through representative (e.g., mockup) to actual (e.g., prototype); information content from low (e.g., random characters) through medium (e.g., sample data) to high (e.g., complete data); dynamics from static through static-discrete to dynamic (slow, fast, or real-time); completeness from part-task simulation to full simulation. Corresponding evaluation issues range from physical fit (e.g., physical fatigue, strength, comfort), through perception and decision-making (e.g., diagnosis, navigation, usability, vigilance), to dynamics and integration (e.g., response to plant dynamics and satisfaction of the mission statement (validation)).]

A part-task simulator is a prototype that has limited completeness (it represents a small portion of the entire system) but often has a high degree of realism. Realism can be further broken down into the following two components:

• Physical fidelity refers to the degree to which the physical form of the test bed looks and feels like the actual system.
• Functional fidelity refers to the degree to which the test bed behaves like the actual system.

Physical form can be characterized by three categories:

• Abstract - A representation that has little resemblance to the actual system (such as a drawing)
• Representative - Some relevant physical characteristics are presented (such as a three-dimensional mockup of a console that is constructed with foam core)
• Actual - Actual hardware (such as production prototype equipment)

Functional fidelity has two characteristics: information content and dynamics. Information content pertains to the data and text provided in the HSI test bed. For example, a display system test bed can contain names of actual plant components and realistic values or just strings of random alphanumerics. The fidelity of information content can be characterized in three levels:

• Low - Random data or characters are used as placeholders to fill the data fields of interest. Data are neither accurate nor complete. This level of fidelity is used for tests of legibility.
• Medium - Relevant data fields do not contain accurate and complete data. Data fields are partially filled. Data is random or fictitious. This level of fidelity is used for studies of display space navigation. Subjects use menu headings and other aids to locate a specific position in the display space.
• High - Relevant data fields contain accurate and complete data. This level of fidelity is important for evaluations that address complex decision-making.

Dynamics refers to the behavior of the HSI as represented in the test bed. At least four levels of representation are possible, as follows:

• Individual static presentation
• Sequential static representation (sometimes called a slide show)
• Continuous dynamic, not real-time (such as slow or fast)
• Continuous dynamic, real-time

Tasks that require physical skills such as reach and dexterity require a high degree of physical fidelity in the prototype. For example, operation of soft controls requires dexterity, speed, and accuracy. Evaluation of alternative soft control methods (such as mouse-driven, poke points, touch screens, and keyboard commands) requires high physical fidelity.

Functional fidelity (that is, how it actually operates) is less important in this instance.

Cognitively demanding tasks require a high degree of functional fidelity to provide a valid test case for operator decisions. Important considerations include provisions for a sufficient data set, so the operator's problem is represented, as well as a data set updated at a sufficient rate to simulate system dynamics and time constraints.
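The fidelity dimensions above lend themselves to a compact structured representation. The following Python fragment is a minimal illustrative sketch, not part of this report: the class names, enum values, and screening rules are assumptions chosen to mirror the physical-skill and cognitive-task guidance just described.

    from dataclasses import dataclass
    from enum import Enum

    class PhysicalForm(Enum):          # realism: physical fidelity
        ABSTRACT = 1                   # e.g., a drawing
        REPRESENTATIVE = 2             # e.g., a foam-core mockup
        ACTUAL = 3                     # e.g., production prototype hardware

    class InfoContent(Enum):           # realism: functional fidelity (content)
        LOW = 1                        # random placeholder data (legibility tests)
        MEDIUM = 2                     # partial or fictitious data (navigation studies)
        HIGH = 3                       # accurate, complete data (decision-making)

    class Dynamics(Enum):              # realism: functional fidelity (behavior)
        STATIC = 1                     # individual static presentation
        STATIC_SEQUENTIAL = 2          # "slide show"
        DYNAMIC_NON_REALTIME = 3       # continuous, slow or fast
        DYNAMIC_REALTIME = 4           # continuous, real time

    @dataclass
    class TestBed:
        physical_form: PhysicalForm
        info_content: InfoContent
        dynamics: Dynamics
        completeness: float            # 0.0 (narrow part-task) to 1.0 (full simulation)

    def adequate_for(bed: TestBed, issue_kind: str) -> bool:
        # Screening rules paraphrasing subsection 4.2: physical-skill issues need
        # physical fidelity; cognitively demanding issues need functional fidelity
        # (sufficient data content and non-static update behavior).
        if issue_kind == "physical":
            return bed.physical_form is PhysicalForm.ACTUAL
        if issue_kind == "cognitive":
            return (bed.info_content is InfoContent.HIGH
                    and bed.dynamics is not Dynamics.STATIC)
        return False

    # Example: a rapid display prototype with sample data and slide-show dynamics
    rapid_prototype = TestBed(PhysicalForm.ABSTRACT, InfoContent.MEDIUM,
                              Dynamics.STATIC_SEQUENTIAL, completeness=0.1)
    print(adequate_for(rapid_prototype, "cognitive"))   # False: needs more functional fidelity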

4.3 TESTING DIFFERENT LEVELS OF STAFF INTERACTION

Three levels of staff interaction are considered in the HSI test program: individual, crew, and plant. Evaluation issues at the individual level are concerned with the demands that the HSI imposes on basic human capabilities (such as workload, perception, decision-making, or anthropometrics). Issues at the crew level include these considerations as well as the flow of information and the coordination of work between crew members. Issues at the plant level include coordination of control room tasks with tasks performed in other parts of the plant (such as local equipment panels and the technical support center). Evaluations of issues at the crew and plant level are performed during the later stages of the design process because a higher level of plant design detail and prototype fidelity are required. A sketch of this scheduling rule follows.
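In the same illustrative style as the fidelity sketch above (the names and the early/late staging strings are assumptions, not program commitments), the interaction levels and the scheduling rule can be rendered as:

    from enum import IntEnum

    class InteractionLevel(IntEnum):
        INDIVIDUAL = 1    # demands on basic human capabilities
        CREW = 2          # adds information flow and coordination among crew members
        PLANT = 3         # adds coordination with tasks outside the control room

    def earliest_stage(level: InteractionLevel) -> str:
        # Individual-level issues can be examined early on low-fidelity test beds;
        # crew- and plant-level issues wait for higher design detail and fidelity.
        if level is InteractionLevel.INDIVIDUAL:
            return "early design stages (low-fidelity test beds)"
        return "later design stages (high-fidelity, integrated test beds)"

    print(earliest_stage(InteractionLevel.CREW))   # later design stages ...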

5 PHASE 1: ISSUE DEFINITION

The objective of Phase 1 of the HSI test plan development methodology is to identify the major evaluation issues to be tested. This involves several activities, as shown in Figure 2.

First, the main HSI resources to be included in the evaluation are identified. These are used as a starting point to define how the HSI is intended to support operator performance and to bound the evaluation issues considered. Next, a human performance model (adapted from Reference 10) is specified. Reference 26 provides a description of the operator's decision-making model as adapted for the AP600 HFE program. Based on the model, three major classes of operator activities are defined:

• Detection and monitoring / situation awareness
• Interpretation and planning
• Controlling plant state

For each major class of operator activity, the types of conditions that can increase task complexity, the cognitive demands posed by these situations, and the potential types of human errors that can result are identified. This analysis draws on operating experience reviews, including analyses of operator performance during actual and simulated emergencies described in References 11 through 14; cognitive task analyses of nuclear power plant operator performance discussed in References 15 through 19; and models of human decision-making in complex systems and human error (Refs. 10, 20, and 21).

The analysis of operator activities and cognitive demands defines:

• The major classes of operator activities that the HSI needs to support
• The types of complex situations that need to be sampled in evaluating the effectiveness of the HSI in supporting each of these three classes of operator activity

The HSI resources intended to support each of these operator activities are then identified.

This defines the model of support evaluated as part of the HSI test plan.

The set of issues to be tested is derived based on joint consideration of the HSI resources intended to support each operator activity class and the dimensions of complexity that can arise (Refs. 23 and 24).
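One way to picture this derivation is as an enumeration over the model of support and the sampled dimensions of complexity. The sketch below is illustrative only: the resource and activity names are drawn from Sections 3 and 5.2, but the function, the complexity list, and the data layout are assumptions rather than the program's actual tooling.

    from itertools import product

    # Operator activity classes (Section 5.2) and an illustrative complexity sample
    activities = ["detection/monitoring", "interpretation/planning", "control"]
    complexity = ["single fault", "multiple faults", "crew coordination"]

    def candidate_issues(model_of_support):
        # model_of_support: dict mapping an activity class to the HSI resources
        # intended to support it (the "model of support" of subsection 5.3).
        # Each yielded triple is a candidate evaluation issue linking an activity,
        # one supporting resource, and one complexity condition to be sampled.
        for activity, level in product(activities, complexity):
            for resource in model_of_support.get(activity, []):
                yield (activity, resource, level)

    support = {"detection/monitoring": ["WPIS", "alarm system"],
               "interpretation/planning": ["alarm system", "CPS"]}
    for issue in candidate_issues(support):
        print(issue)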

The final set of test issues is organized into three categories corresponding to the three major classes of operator activity. Within each class, an attempt is made to start with issues that examine the role of a single HSI resource and then progress to test issues that assess the joint effect of multiple HSI features. A second theme in defining the set of test issues is to start with studies that test the ability of the HSI to support operator performance on straightforward tasks and then to progressively test the ability of the HSI to support operator performance in cognitively complex situations.

The elements of the issue definition process are described in subsections 5.1 to 5.3, and include:

• Human performance model used (subsection 5.1)
• Major classes of operator activities identified and the cognitive processes that are involved in performing these activities (subsection 5.2)
• Mapping of HSI resources to operator activities (i.e., how the HSI resources are intended to support the cognitive processes involved in performing the operator activities identified) (subsection 5.3)

Subsection 5.4 presents the list of human performance evaluation issues that results from applying this process to the AP600 HSI design.

5.1 HUMAN PERFORMANCE MODEL

The operator decision-making model (Refs. 25 and 26), which is adapted from the model of operator decision-making developed by Rasmussen (Ref. 10), is used to support the design and evaluation of the AP600 HSI.

The human performance model provides a high-level description of the operators' decision-making tasks. The model helps identify a number of performance issues that establish the bounds of the human-machine evaluation.

The model identifies four major cognitive activities to be supported: detection and monitoring / situation awareness, interpretation and planning, control, and feedback. The major cognitive activities defined by this model are discussed in the following subsections.

5.1.1 Detection and Monitoring / Situation Awareness

Operators monitor plant parameters to understand the plant state (Ref. 19).

In emergency or abnormal situations, operators are alerted to (detect) a disturbance that leads to monitoring of plant parameters to identify what is abnormal. Operators may try to answer questions such as:

• Where is the mass in the system?
• Where is the energy in the system?
• What is the reactivity?
• Where is the radiation?
• What critical safety functions have been violated?

A second concern in this stage of decision-making is data quality. The reliability and validity of plant state indications are assessed.

This description of detection and monitoring is oriented primarily toward emergency operations. That is, detection and monitoring are initially driven by a cue that something is abnormal. It is important to support detection of abnormal states, but it is also important to maintain an awareness of plant status and system availability under normal and outage conditions.

The model is defined broadly to address detection and monitoring during each plant condition. It includes active monitoring guided by procedures or a supervisor, and monitoring that is passive, such as board scanning. It also includes monitoring to support awareness of the goals and activities of other agents, both people and machines.

Based on the results of these monitoring activities, operators develop an awareness of plant state that is referred to as "situation awareness." Situation awareness has been defined as "the perception of the elements of the environment within a volume of time and space, the comprehension of the meaning, and the projection of their status in the near future" (Ref. 27).

5.1.2 Interpretation and Planning

The most critical components of decision-making are correct situation assessment and identification of the most appropriate response plan (procedure), given the current state of the plant. In some cases identification and procedure selection is straightforward. This corresponds to Rasmussen's rule-based level of performance. In other cases, operators may have to integrate multiple information sources for correct situation assessment and make tradeoffs among operational goals. That is, if multiple failures occur, more than one procedure may be indicated or the standard procedure may need to be adapted to maintain safety functions. The identification of a response plan becomes difficult in the case of multiple failures. It can become even more difficult under severe accident conditions when multiple safety systems may be lost and system data may be unreliable. These cases correspond to Rasmussen's knowledge-based level of performance. The AP600 HSI is designed to support both rule-based and knowledge-based performance.

Coordination between operators, and between operators and automated systems, is considered in this area of decision-making. The need for coordination in carrying out procedures is not explicit in the performance model. The process of initial allocation to human and automated resources, and later coordination of tasks (goals to be addressed), is included in the interpretation and planning area of the model. The HSI model makes explicit the monitoring of goal achievement, which is a means to assess how well each operator or automated system is progressing in achieving goals.

5.1.3 Control

Control involves decisions in the initiation, tuning, and termination of plant processes.

Control is simpler for operators when they control the pace of an event. Control becomes more difficult when multiple individuals or autonomous systems must be coordinated to execute a task.

Controls, indicators, and procedures may exist in software space and are accessed by being called to the screen. This type of access to displays and controls may place a burden on the operator to efficiently find a control or display. While the control area of the model does not explicitly call out the process of locating procedures, controls, and displays, these activities are considered part of this area of the model.

5.1.4 Feedback

Feedback occurs at several levels. Initially, operators need to verify that the control action is implemented. Second, operators need to monitor the state of plant parameters and processes to determine whether the actions are having the intended effect. The final, and most critical, level of feedback is an evaluation of whether the operational goal is achieved. Operators may ask, "Is the goal satisfied?" or "Is the current procedure achieving the desired purpose?"

5.2 MAJOR CLASSES OF OPERATOR ACTIVITIES

Based on the human performance model, three major classes of operator activities that the HSI supports are defined:

Detection and monitoring/situation awareness

Interpretation and planning

Controlling plant state

These classes correspond to the first three major performance areas of the operator decision-making model. Feedback is dealt with as an element of the controlling plant state activity, rather than as a stand-alone activity, because it is primarily associated with control activities.

Because these classes of activities are derived from a human performance model, they are able to characterize a broad range of operator tasks. As a result, they are jointly able to encompass each type of activity that arises during operation.

Analysis of the cognitive demands associated with these activities provides a basis for defining the human performance requirements for plant operation and for maintaining plant safety. This defines the range of activities and situations that the HSI supports.

The following attributes are discussed in this section:

The main characteristics of each class of operator activities

The types of conditions that can arise that increase task complexity

The potential types of human errors

The analysis of the cognitive demands draws on analyses of crew performance during actual and simulated emergencies (Refs. 11 through 14); cognitive analyses of nuclear power plant operator performance (Refs. 15 through 19); and analyses of human decision-making in complex systems and human error (Refs. 10, 20, 21, and 22).

The analysis of the dimensions of task complexity draws on a framework for such complex systems (Ref. 17). This analysis defines sources of task complexity that include:

Characteristics of the task (dynamism, many highly interacting parts, risk, and uncertainty)

Characteristics of the agents (multiple agents, both human and autonomous machines)

Characteristics of the HSI (system functions such as information integration requirements and display access requirements)

The analysis of operator activities and cognitive demands defines the major classes of operator activities that the HSI supports, and the types of complex situations that are sampled in evaluating the effectiveness of the HSI in supporting each of the classes of operator activity. The descriptions of operator activity and potential sources of performance problems, together with the descriptions of how the AP600 HSI features mitigate these performance problems, provide the basis for defining the major evaluation issues to be tested and the range of complex situations that need to be sampled during testing.


5.2.1 Detection and Monitoring/Situation Awareness

This class of operator activities encompasses those activities that are concerned with obtaining information about plant status. It includes the periodic active and passive operator monitoring that determines current status and availability (such as assessing plant status for power level, temperature, pressure, and systems available); monitoring needed to detect malfunctions or trends that are too small to activate an alarm; the proceduralized monitoring that accompanies shift turnover; and monitoring directed by queries about plant parameter values.

A distinction is made between "active" and "passive" monitoring. Active monitoring refers to obtaining information about plant state through active manipulation of the display system interface. Passive monitoring refers to maintaining an awareness of plant state with minimal manipulation/navigation of the display system. It is analogous to the practice of operators of traditional control rooms maintaining situation awareness of changes in plant state by monitoring the control board. In the AP600, the primary HSI resource that supports broad situation awareness is the wall panel information system (WPIS).

Dimensions of Task Complexity

The following factors contribute to the complexity of detection and monitoring:

Many plant indications are available at different levels of abstraction (such as equipment status, process status, function status, and goal status)

Normal parameter values vary with plant conditions

Appropriate parameters for determining plant status vary with plant conditions

Some expected plant parameter behavior is difficult to assess; relevant parameter information needs to be immediately available to be called up

Relevant data may be distributed across individuals

Some goals and status of automated systems are difficult to observe

The following are potential types of human error:

Failure to detect/observe relevant plant parameter values (an error of omission)

Misreading relevant plant parameter values (an error of commission)

Failure to identify or misinterpretation of plant state or the implications of plant state

Failure to identify goals and activities of other agents (person or machine)

Failure to communicate plant state or system information to other personnel (for example, during shift turnover), either an error of omission (not mentioning information) or an error of commission (mentioning incorrect information)

5.2.2 Interpretation and Planning

The interpretation and planning class of operator activities encompasses those activities concerned with situation assessment and response planning. The focus is on situations that require responding to plant disturbances. In exploring this class of activities, the emphasis is on identifying plant disturbances, assessing their implications for plant functions and goals, and selecting/formulating a response plan. The focus is on the cognitive activities underlying intention formation, rather than response execution.

While response execution is an important part of handling emergencies, it is also central to controlling the plant during normal operation. Therefore, response execution is covered as part of the controlling plant state class of activities.

In evaluating the extent to which the HSI supports operator intention formation during plant disturbances, the range of plant disturbances that may arise is considered.

Small Upsets - These disturbances do not lead to a plant trip and can include disturbances that lead to alarm response procedures.

Controllable Upsets - These disturbances lead to a plant trip but are the result of a single malfunction that is recoverable using emergency procedures.

Multiple Fault Accidents - These disturbances require identification of multiple faults that can mask each other and/or require consideration of multiple constraints (side effects) in formulating a recovery strategy.

Severe Accidents - These disturbances are more difficult, in that they require additional personnel to diagnose and handle (that is, a need for coordination of multiple personnel and engineering expertise) and they are not addressed by formalized procedures (therefore, a need for knowledge-based behavior).

The HSI evaluation covers the types of disturbances described above. Included are cases that involve malfunctions in automated systems requiring the operator to identify a need for manual override.


Dimensions of Task Complexity

The following factors contribute to the complexity of this activity:

Multiple faults may produce large numbers of alarms, making the detection of a particular alarm difficult (due to attention overload)

Evidence of plant disturbance may be missing or obscured (that is, masked or altered by another fault)

Changes in plant state may make familiar cues inappropriate (such as sensors that may have different significance under different plant conditions)

Multiple faults may create goal conflict situations requiring tradeoffs among competing goals

Information on the goals and status of automated systems may be difficult to assess

The following are potential types of human error:

Failure to observe or recognize an abnormal plant state or system malfunction

Failure to develop a correct system understanding (perhaps due to a failure to correctly interpret the evidence)

Fixation errors (ignoring evidence that is inconsistent with the hypotheses that are being entertained)

Over-reliance on familiar cues or response plans

Missing negative side effects associated with a response plan; missing goal conflicts

Making inappropriate goal tradeoffs

5.2.3 Controlling Plant State

The controlling plant state class of activities is concerned with making changes in plant state (including tuning plant parameters), in plant mode (such as startup, shutdown, and intermediate power changes), performing switchovers (from one source/train to another), performing surveillance tests, and taking systems out of operation (switching out or tagging out). For this class of activities, the emphasis of the evaluation is on the planning and the execution of responses.


A distinction can be made between operator-paced (procedure-paced) control activities and event-paced (plant dynamics-paced) control activities. Operator-paced activities are activities where the rate at which a maneuver is performed is determined primarily by the operators performing the task. Event-paced control activities are activities where the rate is primarily determined by the process dynamics of the event being controlled.

A second distinction can be made between maneuvers that can be performed by a single individual and maneuvers that require coordination among multiple individuals and/or automatic systems. Manual plant startup is an example of an activity that is both operator-paced and requires coordination of multiple operators. Automatic plant startup is an example of an activity that is event-paced and requires supervisory control of autonomous systems.

The simplest case of control execution occurs when there is ample time, control actions are discrete (all-or-none actions, such as turning on a pump), control actions can occur in any order, little or no coordination is required, and control actions have no side effects that impact other plant processes or plant operability. Complications set in as this simplest case is altered: time becomes short; controls must be used in a fixed order; controls are at physically disparate locations; control actions are continuous and require small tuning adjustments; there are lags between the time a control action is taken and when an indicator reflects the change; or control actions require strict coordination between operators or between an operator and an automated process.

An important aspect of control execution is the need to obtain feedback from the system that the action has been successfully executed. This feedback can occur at several levels. First, there is an indication from the control itself that an action is taken. In the hard-wired environment, a light changes state or a toggle switch changes position. With soft controls, the change may be more transient and less noticeable. Next, there must be an effect on the parameter or display that is manipulated. Time lags may exist that make this detection more difficult. Finally, the plant process or system that the operator is intending to control shows a response to the control action to close the feedback loop.
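To make these three feedback levels concrete, here is a minimal sketch in Python; the data structure, function, and example are hypothetical illustrations of the idea, not part of the AP600 design.

```python
# Illustrative sketch of the three feedback levels described above.
# The names and structure are hypothetical, not part of the AP600 design.

from dataclasses import dataclass

@dataclass
class ControlFeedback:
    control_indication: bool   # level 1: the control itself shows the action was taken
    parameter_response: bool   # level 2: the manipulated parameter/display responds
    process_response: bool     # level 3: the plant process/goal shows the intended effect

def feedback_level_reached(fb: ControlFeedback) -> int:
    """Return the highest consecutive feedback level confirmed (0 to 3)."""
    level = 0
    for confirmed in (fb.control_indication, fb.parameter_response, fb.process_response):
        if not confirmed:
            break
        level += 1
    return level

# Example: a soft-control action whose indication and parameter effect are
# confirmed, but whose process response lags, reports level 2.
print(feedback_level_reached(ControlFeedback(True, True, False)))  # -> 2
```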

With supervisory control of automated systems, there is a need to assess what goal the automated system is attempting to achieve; that is, whether the automated system is performing correctly or whether intervention is required, and if so, what manual actions to take.


Dimensions of Task Complexity

The following factors contribute to the complexity of this activity:

Complex process dynamics (such as rapid process changes or long lags) may impose speed constraints on operators and/or require open-loop responses

Actions of multiple operators may be interdependent, requiring communication/coordination among multiple individuals (such as for assessing plant state, anticipating future plant state, or preventing working at cross-purposes)

Actions may have negative side effects, requiring assessment of preconditions before action is taken, and assessment of post-conditions and execution of additional actions after the original action is taken. For example, when tagging out a train or system, the operator must be cognizant of preconditions that must be satisfied before the train or system is taken out of operation, such as plant specification requirements for plant operability. The operator must also be cognizant of post-conditions that result from taking the system out of service, such as limits on plant operation and constraints on which additional systems can be taken out of service.

Automated systems may malfunction or fail to keep up with process dynamics

The following are potential types of human error:

Failure to check preconditions and to anticipate side effects and post-conditions

Failure of execution (that is, either an error of omission (not taking a required action) or an error of commission (taking the wrong action or taking actions in the wrong sequence))

Failure to observe feedback of actions (that is, monitor that the action was properly executed; monitor that the action had the desired effect on the plant parameter, plant process, and goal hierarchy)

Failure to keep pace with process dynamics

Failure to coordinate and/or communicate with other crew members

Failure to monitor automated systems and intervene manually when required


5.3 MAPPING OF HSI RESOURCES TO OPERATOR ACTIVITIES (MODEL OF SUPPORT)

Subsections 5.2.1 through 5.2.3 describe three classes of operator activities that are supported by the HSI and the major cognitive processing stages that underlie these activities. These subsections identify the scope and boundaries of the tasks to be included in the evaluation of the HSI. They also identify the dimensions of task complexity and human error. This foundation allows one to tie the various HSI resources to tasks. That is, each HSI resource is intended to support human performance in simple and complex tasks and to reduce error.

In this subsection, links are drawn between the HSI resources and the operator activities to show how the HSI resources support control room performance. This supports the development of evaluation issues for testing those relationships. More specifically, the evaluation issues link an activity, one or more HSI resources, and a performance measure. The human performance evaluation issues are discussed in subsection 5.4.

The mapping of AP600 HSI resources to operator activities is accomplished by reviewing the rationale for each HSI resource. An understanding of each resource's rationale provides a means for relating it to the human performance model and to the operator activities. Because the design of the HSI resources is not complete, there are limits on the detail that can be assigned to the model at this time.

Figure 5 identifies the primary mappings between HSI resources and the elements of the operator decision-making model that they are intended to support.

The following subsections describe the mappings between the operator activities and HSI resources that are important for supporting the development of the evaluation issues and the HSI design.

5.3.1 Detection and Monitoring/Situation Awareness

Wall Panel Information System (WPIS)

The WPIS provides high-level information about the status of safety and availability goals, allowing operators to quickly identify violations. The WPIS also indicates plant operating mode and a corresponding set of plant parameters that are important to monitor. This aids operator monitoring by bringing together the most meaningful data in a central location.

Functionally Organized Alarm System

The value of the functionally organized alarm system for detection and monitoring lies in focusing attention on the most significant alarms; data overload is thereby reduced. The alarm system removes redundant or less meaningful alarms from the set of alarms that are activated.

[Figure 5: Mapping of HSI Resources to Operator Decision-Making Model. As recoverable from the figure, detection and monitoring/situation awareness (alert, observe, identify state) is supported by the alarm system, WPIS, plant information system, QDPS, and computerized procedures; interpretation/planning (implications of state, goal selection, plan success path, select/formulate actions) is supported by the computerized procedures, plant information system, and WPIS; control (execute actions) is supported by the soft controls and dedicated controls; and feedback (verify action, monitor state, monitor goal achievement) is supported by the WPIS, soft controls, alarm system, plant information system, and QDPS.]
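Read as a lookup table, the Figure 5 pairings might be sketched as follows in Python. The pairings shown are those recoverable from the figure, and the data structure itself is an illustrative assumption, not how the design documents the mapping.

```python
# Sketch of the Figure 5 mapping as a lookup table; illustrative only.

SUPPORT_MAP = {
    "detection and monitoring/situation awareness": [
        "alarm system", "WPIS", "plant information system", "QDPS",
        "computerized procedures",
    ],
    "interpretation/planning": [
        "computerized procedures", "plant information system", "WPIS",
    ],
    "control": ["soft controls", "dedicated controls"],
    "feedback": [
        "WPIS", "soft controls", "alarm system",
        "plant information system", "QDPS",
    ],
}

def stages_supported_by(resource: str) -> list:
    """Reverse lookup: which decision-making stages a resource supports."""
    return [stage for stage, resources in SUPPORT_MAP.items() if resource in resources]

# Example: the WPIS supports monitoring, planning, and feedback.
print(stages_supported_by("WPIS"))
```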

Plant Information System and QDPS

The functional and physical displays support operators in monitoring plant data. They provide detailed information by allowing the operator to access any parameter through a network of displays. These displays provide indication of data quality (such as failed or unreliable sensors) and context for plant data by linking the physical views with the functional views. They also support the monitoring of automated systems.

The remaining major HSI resources, dedicated and soft controls, and the procedures are not tied to supporting detection and monitoring.

5.3.2 Interpretation and Planning

Functionally Organized Alarm System

The alarm system aids the operators in selecting appropriate views of the plant and appropriate procedures for mitigating the abnormal event. The alarm system reduces confusion by subordinating alarms that are misleading or secondary to the primary disturbance. It also cues operators to multiple fault situations and/or situations where multiple safety goals are compromised.

Plant Information System and QDPS

The functional and physical displays aid situation assessment and planning by encouraging operators to take a functional view of the plant that is tied to the physical view. The functional view presents information about the current goal, goal violations, processes required to satisfy the goal, and potential side effects. The intent is to provide a tool for planning activities that reduces the likelihood that the operator loses sight of the larger picture when engaged in control activities.

Computerized Procedures

The procedures created for MCR operators formalize the set of appropriate control actions that are available to achieve safety and availability goals. These are the set of actions operators should take. An important operator cognitive activity in using procedures is selecting the appropriate procedure and periodically evaluating its appropriateness. The procedures aid the operators in making these decisions.


Wall Panel Information System (WPIS)

The WPIS maintains a high-level view of safety and availability goals so that operators can assess how well the current response plan is achieving its goal. The WPIS reflects significant changes in plant status that are tied to the appropriateness of the procedure. This overview system also lets crew members in the main control area share information about current goals and responses.

The remaining major HSI resources, dedicated and soft controls, are not strongly tied to supporting interpretation and planning.

5.3.3 Controlling Plant State

Dedicated and Soft Controls

The control devices clearly communicate to operators the available control actions. They also provide feedback to the operator indicating that a control action is successfully performed.

For example, a control should provide a visual, auditory, or tactual cue to indicate a change in setting. Operators should not become confused when locating, selecting, or executing a control action.

Computerized Procedures

The procedures are the specific instructions for execution of the control activities. These are clear and concise, avoiding confusion or underspecification of control actions or their criteria.

The procedures also clearly indicate their intent so that operators can more easily determine the procedure's appropriateness.

Wall Panel Information System (WPIS)

The WPIS provides an overview of plant status to control room personnel in the main control area, including those with no access to a compact workstation. It provides a source of information on the control activities of other operators to support crew communication and coordination. It also supports feedback on control actions, particularly at the level of monitoring plant state and goal achievement.

Plant Information System

The plant information system provides a means for each operator to view the activities of other operators involved in coordinated or related control activities. This visibility supports error detection, control action timing, and feedback on the effects of multiple control actions.


The remaining major HSI resources, the alarm system and QDPS, provide feedback about the success of control actions.

5.4 HUMAN PERFORMANCE EVALUATION ISSUES

A set of human performance evaluation issues is derived based on consideration of the major classes of operator activity, the HSI resources intended to support each operator activity class, and the analysis of dimensions of complexity that could arise to increase performance demands. This subsection presents the results of applying this analysis.

The evaluations are organized into three main groups corresponding to the three major classes of operator activity. Within each group, an attempt is made to start with issues that examine the role of a single HSI resource and then progress to issues that assess the joint effect of multiple HSI resources. A second theme in defining the set of issues is to start with studies that test the ability of the HSI to support operator performance on straightforward tasks and then to progressively test the ability of the HSI to support operator performance in cognitively complex situations.

In addition to these groups, two additional evaluations are specified. The first is conformance to human engineering design guidelines, included to address existing design guidelines that apply to control rooms. This corresponds to the HFE verification task described in the Programmatic Level Description of the AP600 Human Factors Verification and Validation Plan (Ref. 2). The second is validation of the integrated HSI, included to address the requirements for a validation of a fully integrated HSI. This corresponds to the integrated system validation task described in the Programmatic Level Description of the AP600 Human Factors Verification and Validation Plan (Ref. 2). These five groups and corresponding evaluation issues are shown in Table 1.

The AP600 V&V plan includes three additional V&V tasks: task support verification, issue resolution verification, and final plant HFE verification. Additional descriptions of these V&V tasks are provided in Reference 2.


Table 1 Major Evaluation Issues

Operator Activity: Detection and Monitoring

Issue 1: Do the WPIS and the workstation summary and overview displays support the operator in maintaining an awareness of plant status and system availability without needing to search actively through the workstation displays?

Issue 2: Does the WPIS support the operator in getting more detail about plant status and system availability by directed search of the workstation functional and physical displays?

Issue 3: Do the HSI features support efficient navigation to locate specific information?

Issue 4: Do the HSI features effectively support crew awareness of plant condition?

Operator Activity: Interpretation and Planning

Issue 5: Does the alarm system convey information in a way that enhances operator awareness and understanding of plant condition?

Issue 6: Does the physical and functional organization of plant information on the workstation displays enhance diagnosis of plant condition and the planning/selection of recovery paths?

Issue 7: Does the integration of alarms, WPIS, workstation, and procedures support the operator in responding to single-fault events?

Issue 8: Does the integration of alarms, WPIS, workstation, and procedures support the operator in interpretation and planning during multiple-fault events?

Issue 9: Does the integration of alarms, WPIS, workstation, and procedures support the crew in interpretation and planning during multiple-fault events?

Issue 10: Does the integration of alarms, WPIS, workstation, and procedures support the crew in interpretation and planning during severe accidents?

Operator Activity: Controlling Plant State

Issue 11: Do the HSI features support the operator in performing simple, operator-paced control tasks?

Issue 12: Do the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and post-conditions?

Issue 13: Do the HSI features support the operator in performing control tasks that require multiple procedures?

Issue 14: Do the HSI features support the operator in performing event-paced control tasks?

Issue 15: Do the HSI features support the operator in performing control tasks that require coordination among crew members?

Conformance to HFE Design Guidelines

Issue 16: Do the HSI components satisfy relevant HFE design guidelines?

Validation of Integrated HSI

Issue 17: Does the integration of HSI components satisfy requirements for validation of control room functions and integrated performance capabilities?

6 PHASE 2: TEST DEVELOPMENT

This section discusses the development of tests to examine the human performance issues identified in Phase 1. Test development involves developing:

Testable hypotheses and performance requirements from the evaluation issues

Approaches for conducting each evaluation

Requirements for conducting each evaluation

Written descriptions of each evaluation

Each of these activities is depicted in the lower portion of Figure 2. They are described in the following subsections.

6.1 TESTABLE HYPOTHESES AND PERFORMANCE REQUIREMENTS

The first activity of test development is to develop testable hypotheses and performance requirements from the human performance evaluation issues. The evaluation issues defined in Phase 1 describe characteristics of operator performance that are critical to safe and efficient operation of the AP600. For each issue, the following are developed:

A testable hypothesis

Performance requirements

Each testable hypothesis and performance requirement specifies how the HSI resources are expected to support operator performance with respect to a particular evaluation issue. A testable hypothesis is simply a statement that guides the evaluation of design concepts.

Quantitative and qualitative data are collected to determine whether a design concept supports the hypothesis and, if so, how well it supports the hypothesis. This process is used to:

Explore and clarify human performance issues associated with specific design concepts

Contribute to the development of functional requirements for the HSI

Contribute to the development of criteria for human performance requirements of the HSI

Performance requirements are statements of man-machine system behavior that the HSI supports to provide safe and efficient operation of the AP600. For each performance requirement, performance measures need to be developed. These are objective, measurable dimensions by which performance is evaluated.

6.2 EVALUATION APPROACH

The second activity of test development is to define the evaluation approach. For each evaluation issue, an evaluation approach is defined to guide the development of concept testing. The following factors are considered:

Dimensions of task performance to be addressed, including types of scenarios and dimensions of task complexity

Types of performance measures to be collected, including errors, response time, and operator understanding of plant condition

Evaluation method to be used, including expert review, walk-through, simulation, and decision tracing

Evaluation criteria, including absolute and relative measures of performance

Implications of the results, including selection of design alternatives, clarification of performance issues, and refinement of functional requirements and HSI design criteria

Section 7.0 provides an overview description of the proposed evaluation approach for each of the HSI evaluation issues identified in Table 1.

6.3 EVALUATION REQUIREMENTS

The third activity of test development depicted in Figure 2 is to define evaluation requirements. For each evaluation, a set of requirements is defined, including test bed fidelity requirements, the required stage of development of the HSI, and subject characteristics. Test bed fidelity is discussed in subsection 4.2.

The concept and V&V testing is coordinated with the HSI design process. Concept testing is performed with breadboard designs that demonstrate relevant aspects of the design concept.

The evaluation hypotheses and approach determine the required level of test bed fidelity.

The level of test bed fidelity determines the required stage of development of the HSI design.

For example, the concept testing of Evaluation 3 is not conducted until design concepts exist for display formats, navigation aids, and display selection mechanisms. Also, preliminary decisions must be made regarding the display system hardware. V&V testing is performed using production prototypes or equipment that emulates production prototypes.


The role of test participants during the concept testing evaluations is to represent characteristics of plant operators. While plant operations experience is desirable, it is not required for concept testing. Test participants may include nuclear operations training instructors, engineers, and designers. The individuals who participate in V&V testing evaluations are nuclear power plant operators trained in the use of the AP600 HSI. Test participant requirements, including training and experience, are defined for each concept and V&V test.

6.4 EVALUATION DESCRIPTIONS

The last activity of test development in Figure 2 is to document the evaluation descriptions.

Overview descriptions of the general test approach are presented in Section 7.0. The test implementation details are documented in individual test implementation plans that are prepared for each concept and V&V test.

6.5 DATA ANALYSIS AND FEEDBACK TO THE DESIGN PROCESS

This subsection describes the data collection and analysis approach that is used to analyze data from the Man-in-the-Loop tests conducted during the concept testing phase and the integrated system validation phase of the V&V program.

The methodology is adapted from a method that was introduced by Hollnagel, Pedersen, and Rasmussen (Ref. 28) and extended and refined in several empirical studies, including a Man-in-the-Loop evaluation of safety parameter display systems (Ref. 25). Other discussions of methods for empirical evaluation studies are included in References 6, 7, 12, 14, 19, and 29.

There are two key elements of the methodology. The first is to use multiple, convergent performance data to develop a description of performance and the factors that contribute to that performance. The second is to use conceptual models to focus data collection and analysis activities, and to enable aggregation and generalization across specific cases.

The main steps in the methodology are presented in Figure 6. The first step is to collect multiple, convergent performance data. The objective is to chart not only what actions are taken, but also the decision process and context that led to the actions. The emphasis is on recording process measures as well as outcome measures of performance. Outcome measures include the accuracy and completeness of the response made and the time to respond. Process measures address how the operator reached the outcome.


[Figure 6: Data Collection and Analysis Process. As recoverable from the figure, the process moves from context-specific data to context-independent conclusions: multiple convergent performance data (accuracy and completeness of response, response time, errors, a record of interaction with the HSI, decision-trace protocols, records of crew communication, and debriefing interviews) are analyzed into a description of actual performance in a specific case; abstraction and formalization, based on the human performance model and the model of support, yield a formal performance description for the specific case; aggregation and generalization across similar patterns of performance yield a prototypical performance description; and interpretation yields general conclusions about the effectiveness of the HSI in supporting human performance requirements, sources of human performance difficulty, and enhancements to the HSI required to improve human performance.]

Data sources include:

Direct observation of participant behavior

Records of interaction with the HSI

Traces of actions taken on the underlying process

Records of the dynamic behavior of critical process variables

Records of verbal communication among team members or via formal communication media

Verbal reports made during debriefing interviews following the performance

Measures of workload and situation awareness

Commentaries on operator performance made by other knowledgeable observers

The specific data collected for each evaluation is listed as part of the individual test descriptions in Section 7.0.
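One possible per-trial record combining these outcome and process measures is sketched below; the field names, scenario, and values are assumptions for illustration, not a specification of the actual data collection forms.

```python
# Hypothetical per-trial record combining outcome and process measures.

from dataclasses import dataclass, field

@dataclass
class TrialRecord:
    participant: str
    scenario: str
    # outcome measures
    response_correct: bool
    response_complete: bool
    response_time_s: float
    # process measures (convergent data sources)
    hsi_interactions: list = field(default_factory=list)  # logged display/control actions
    verbal_protocol: list = field(default_factory=list)   # transcribed communications
    debrief_notes: list = field(default_factory=list)     # post-trial interview remarks
    workload_rating: float = 0.0                          # subjective workload measure

trial = TrialRecord(
    participant="P01",
    scenario="simulated plant disturbance",
    response_correct=True,
    response_complete=False,
    response_time_s=212.0,
    hsi_interactions=["open system overview display", "acknowledge alarm group"],
)
print(trial.response_time_s, trial.hsi_interactions)
```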

Data on how the operator reaches the outcome provides crucial information on where and why a particular interface helps or fails to help user performance. Outcome measures alone are not powerful enough to determine which of the multiple potential factors active in any evaluation of interface systems contribute to a specific outcome result. In testing a display interface concept, it is important to differentiate the effects due to fundamental elements of the concept from the effects due to incidental details of the implementation. For example, if performance problems are detected with a new display concept that uses flow path coding (such as energy flow or material flow), it is important to establish whether there are fundamental problems with the concept (flow coding does not improve performance), or whether the performance problems are due to details of the implementation (display legibility or the specific flow coding technique) and performance would improve if the implementation details were changed.

Data on the background and context of a user problem can help localize the factors that contribute to successful or unsuccessful human performance. Therefore it can help identify potential improvements or additions to interface systems. This is important for providing design feedback during the concept testing phase. It is also important for evaluating the location and severity of problems identified during the performance testing phase.


In the next step, the multiple data sources are correlated and combined to produce a description of performance for a specific individual (or crew) in a specific case. The experimenter actively cross-references the different lines of evidence to establish a trace of participant behavior and cognitive activities. This cross-checking and integration can help support the validity of the description of actual performance generated from the data.

According to the Reference 28 methodology, the description of actual performance is followed by successive stages of progressively more concept-dependent, context-independent levels of analysis. Conceptual models are used to produce a more formal description of performance. By describing performance in formal terms that are context-independent, it becomes more possible to generalize across specific individuals and cases. For example, concepts about human error have characterized specific instances of human performance errors as "slips" and "mistakes" (Ref. 30). This allows diverse cases to be combined and generalized across individuals and situations. In this case, the human performance model adapted from Rasmussen and the model of support described in Section 5.0 are used to produce formal descriptions of operator performance in particular cases and to aggregate and generalize across cases.

Once performance is described in formal terms, it becomes possible to identify similar patterns across individuals and cases. For example, multiple instances of display navigation problems may be identified and aggregated. The result of the aggregation is a description of prototypical performance including common performance difficulties, confusions, and errors.

The description of prototypical performance is then used to draw conclusions about the effectiveness of particular HSI resources in supporting human performance; the factors that contribute to human performance difficulty; and enhancements to the HSI required to improve human performance.
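The aggregation step can be sketched in a few lines: once each observed difficulty carries a formal, context-independent label, similar patterns can be counted across individuals and cases. The labels, cases, and threshold below are invented for illustration.

```python
# Illustrative aggregation of formally labeled difficulties across cases.

from collections import Counter

formal_descriptions = [
    {"case": "P01/scenario-A", "difficulties": ["display navigation problem"]},
    {"case": "P02/scenario-A", "difficulties": ["display navigation problem", "fixation error"]},
    {"case": "P01/scenario-B", "difficulties": ["display navigation problem"]},
]

def prototypical_difficulties(cases, min_count=2):
    """Return difficulty patterns seen in at least min_count cases."""
    counts = Counter(d for c in cases for d in set(c["difficulties"]))
    return [(label, n) for label, n in counts.most_common() if n >= min_count]

print(prototypical_difficulties(formal_descriptions))
# -> [('display navigation problem', 3)]
```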


7 EVALUATION ISSUES AND DESCRIPTIONS

This section describes the general test approach used to address each of the 17 major human performance evaluation issues defined in Phase 1 and presented in Table 1.

The 17 evaluation issues are organized under five headings:

Evaluations for detection and monitoring

Evaluations for interpretation and planning

Evaluations for controlling plant state

Evaluations of conformance to human factors engineering (HFE) design guidelines

Evaluations for validation of the integrated HSI

The first 15 evaluations are grouped into the first three headings. Each of these 15 evaluations addresses a human performance evaluation issue.

Evaluations 16 and 17 describe evaluations that are performed as part of the AP600 V&V.

A more complete description of the V&V tests is provided in Reference 2.

The evaluation issue descriptions provided in subsections 7.1 through 7.5 are intended to provide a description of the testing approach and requirements for addressing each of the major evaluation issues. The actual number and content of tests that are performed depends on the schedule of development of individual HSI resources and the availability of rapid prototypes and simulations to serve as test bed platforms. It is possible to address more than one evaluation issue in a single concept test. Conversely, several concept tests may be performed that address different aspects of a single evaluation issue. Human performance evaluation issues that are not addressed by concept tests are addressed in the final, integrated system validation. Additional information on the concept tests that are planned to be performed as part of the AP600 HSI design process is provided in Reference 1.

The test implementation details are documented in individual test implementation plans that are prepared for each concept and V&V test.

7.1 EVALUATIONS FOR DETECTION AND MONITORING

The purpose of the evaluations in this subsection is to provide confidence that the design of the HSI supports the operators in maintaining an awareness of plant condition. This includes periodic active and passive monitoring by operators to determine current status and availability of plant systems; periodic monitoring needed to detect malfunctions or trends that are too small to activate an alarm; the more proceduralized monitoring that accompanies shift turnover; and monitoring directed by queries about specific plant parameter values.

These issues are relevant to individual operators as well as to crews of operators.

The following set of evaluations is designed to test the ability of the HSI to support four categories of detection and monitoring. These categories increase in complexity in terms of the level of detail of plant data and the degree of interaction between operators. The first category, addressed in Evaluation 1, tests the ability of a single operator to develop a high-level understanding of plant condition from the wall panel information system and workstation without excessive manipulation of the HSI to retrieve data.

The second category, which is addressed in Evaluation 2, tests the ability of a single operator to use the cues provided by the wall panel information system to obtain more detailed plant data from the workstation. This evaluation tests the coordination of data presentation between the wall panel information system and the workstation.

The third category, addressed in Evaluation 3, tests the ability of a single operator to obtain detailed plant data from the workstation based on a request from a supervisor or a procedure. This evaluation tests the ability to use the navigation aids of the displays presented on the workstation to find detailed data.

The fourth category, addressed in Evaluation 4, tests the ability of operators to coordinate information to maintain crew awareness of plant condition. Three situations are addressed: the informal transfer of information to a new person entering the control room, the formal transfer of information to a new crew entering the control room during shift turnover, and the coordination of information among crew members during ongoing detection and monitoring.

Evaluation issues are the following:

Issue 1: Do the wall panel information system and the workstation summary and overview displays support the operator in maintaining an awareness of plant status and system availability without needing to search actively through the workstation displays?

Issue 2: Does the wall panel information system support the operator in getting more detail about plant status and system availability by directed search of the workstation functional and physical displays?

Issue 3: Do the HSI features support efficient navigation to locate specific information?

Issue 4: Do the HSI features effectively support crew awareness of plant condition?


7.1.1 Evaluation Issue 1: Passive Monitoring of WPIS and Workstation Displays

Do the wall panel information system and the workstation summary/overview displays support the operator in maintaining an awareness of plant status and system availability without needing to search actively through the workstation displays?

Relevant HSI Resources:

WPIS (plant parameter data and alarm data)

Workstation summary displays and display navigation features

Specific Concerns:

Do the WPIS and the workstation summary displays present sufficient information about plant state and system availability?

Do the overview displays effectively call more attention to the more important information?

Do the WPIS and workstation summary displays help reduce the likelihood of omitting critical information in plant state assessment?

Approach

The WPIS and workstation summary displays provide plant condition overview information to operators. This overview information is used by the operators to ascertain plant state and current status of operating equipment, to anticipate alarms and disturbances, to identify plant systems or components that have become unavailable for use, and generally to stay "in touch" with the plant conditions. The WPIS display and workstation summary displays must be complete, correct, and well designed to depict an overview of the plant. This is necessary to allow operators to maintain an awareness of plant status and system availability. Ideally, operators can obtain this overview from passive monitoring of these displays. That is, the operators should not have to select and browse within a set of workstation displays.

Concept Testing Hypothesis

The WPIS and workstation summary displays provide operators with an accurate overall understanding of plant state and system availability.


Experimental Manipulations

This evaluation includes reviews of the display content of the WPIS and default workstation displays to determine whether they contain sufficient information to allow operators to assess overall plant condition. Participants are shown static views of these displays and asked to infer the condition of the plant. These reviews occur early in the design process with low-fidelity test beds to refine functional requirements regarding the types of data and data format that must be provided for various plant modes.

Next, the effectiveness of these displays is evaluated empirically. Participants are shown overview displays for a brief period, and then the displays are removed. Participants are then asked to describe current plant state and conditions as thoroughly as possible. Participants are asked to describe the implications of plant conditions, including potential future problems and parameters that are approaching alarm conditions. Following this "free recall" session, participants are asked to reconstruct, either verbally or with sketches, the arrangement of plant data from the workstation and WPIS displays. Well-designed displays organize plant data in meaningful groups that facilitate operator understanding and recall.

Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs consisting of static drawings and computer-based rapid display prototypes. The evaluation investigates human factors issues related to the ability of operators to extract summary-level information about plant conditions from overview displays. During the display reconstruction task, evaluators analyze which groups of plant data the participants are able to recall easily and which they have difficulty recalling. This leads to better understanding of effective data display formats. Protocol analysis and debriefings are used to identify characteristics of the design concepts that support operator understanding as well as characteristics that lead to confusion and errors by the subject.

Objective measures collected during the review and reconstruction task may include:

Number of plant conditions correctly identified

Correct identification of implications of plant conditions

Time required to complete the task

These measures provide performance baselines for comparing alternatives and for evaluating the benefits of display modifications.
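As a sketch of how the first measure might be scored, the fragment below compares the plant conditions a participant reports during free recall against the conditions actually present in the scenario; the condition names are hypothetical.

```python
# Illustrative scoring of the free-recall task; condition names are invented.

def recall_scores(reported, actual):
    """Return (correctly identified, omissions, intrusions)."""
    reported, actual = set(reported), set(actual)
    return (
        len(reported & actual),  # conditions correctly identified
        len(actual - reported),  # present but not recalled
        len(reported - actual),  # recalled but not actually present
    )

actual = {"primary pressure trending high", "one charging pump running", "Tavg above program"}
reported = {"primary pressure trending high", "Tavg above program", "letdown isolated"}
print(recall_scores(reported, actual))  # -> (2, 1, 1)
```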

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of overview displays for the WPIS and workstation. Results are used to assess and refine functional requirements for information content. The display reconstruction task is used to identify display arrangements that support operator understanding. The results support development of general guidelines for grouping and highlighting data in the overview displays.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the WPIS design. The HSI design needs to be at a phase where the information content for the WPIS and/or workstation overview displays is defined.

Test Bed Requirements:

Physical form - Test beds may be drawings or computer-based rapid display prototypes. Displays have formats that are representative of alternative display concepts for the plant HSI.

Information content - Information developed for displays is sufficient to assess plant conditions for a number of normal, abnormal, and emergency states. Displays contain realistic, meaningful values, and not random values. The display parameters and values do not have to be AP600-specific.

Dynamics - Static displays may be used. If rapid display prototypes are used, display animation, such as blinking and flashing, may be used.

Participant Characteristics

Participants may include personnel who are familiar with important pressurized water reactor (PWR) plant operating parameters, including operator trainers, operators, and knowledgeable engineers and designers.

7.1.2 Evaluation Issue 2: Directed Search for Information Within the Workstation Displays Based on WPIS Displays

Does the WPIS support the operator in getting more detail about plant status and system availability by directed search of the workstation displays?

Relevant HSI Resources:

WPIS

Workstation displays and display navigation features


Specific Concerns:

When given a WPIS cue for more detailed monitoring, how accurately and efficiently can operators locate and select this information from the workstation displays?

What types of confusions are caused by the WPIS displays when they indicate the need for active search of the workstation?

What types of navigation errors are made with the workstation displays?

Approach

The WPIS and the workstation displays work together to support the operator in actively obtaining a picture of the plant condition. This active monitoring is driven by cues on the WPIS that indicate a potential problem, such as a plant parameter trending toward an alarm condition. From this cue, the operator navigates through the displays to locate more detailed information about the status of a particular process or parameter. (Operator response to plant alarms is addressed by a set of experiments under evaluations for interpretation and planning, SSAR subsection 18.11.)

The intent of this experiment is to test operators' ability to do this display navigation and selection efficiently with the AP600 display systems. Participants are given a plant state scenario that indicates the need for more detailed monitoring. They are asked to use the workstation to find the functional or physical display(s) that are most useful for more detailed monitoring.

Concept Testing Hypothesis

The WPIS display and workstation display system support the operator in efficiently locating and selecting the display(s) that contain greater detail about plant parameters required for maintaining awareness of plant state.

Experimental Manipulations

This experiment addresses a majority of plant conditions that require or cue the operator to obtain additional information from the workstation displays. These conditions may include:

Improvement or deterioration of power generation goals as indicated by the WPIS

Improvement or deterioration of plant safety as indicated by the WPIS

Changes in the operating status of plant equipment (such as activation/deactivation of automatically controlled systems) as indicated by the WPIS or other information sources in the MCR

Experimental manipulations address the full range of HSI display devices (with the exception of alarms) that cue operators to seek additional information about plant state.

Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate human factors issues related to directed search through large sets of displays. Qualitative information is gathered through protocol analysis or debriefing discussions with the participants. The intention is to identify characteristics of the design concepts that lead to confusion, errors, and slow or awkward actions by the subject.

Objective dependent measures may include:

How many displays are accessed before selecting the correct display or displays?

Which displays are selected and in what order (the navigation path)?

The degree to which the relevant information is located

The success in returning from search to a designated location

Time required to complete the task
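The sketch below shows how these measures might be derived from a logged display-access path; the log format and display names are assumptions for illustration.

```python
# Illustrative derivation of navigation measures from a display-access log.

def navigation_measures(path, target, home):
    """path: ordered display IDs accessed; target: correct display; home: starting display."""
    return {
        "displays_accessed_before_target": path.index(target) if target in path else len(path),
        "navigation_path": path,
        "target_located": target in path,
        "returned_to_start": bool(path) and path[-1] == home,
    }

log = ["plant-overview", "system-summary", "component-detail", "plant-overview"]
print(navigation_measures(log, target="component-detail", home="plant-overview"))
```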

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of display navigation aids that support the operator in information gathering. The qualitative information gathered through protocol analysis or debriefing discussions is analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance of the display navigation task, including display navigation aids that:

Inform the operator via the WPIS that detailed information can be retrieved from the workstation

Direct the operator via the WPIS to relevant categories of information or display space locations in the workstation

Support the operator in scanning through potential information fields and selecting the required data


Quantitative measures of the participants' performance, such as number of displays accessed and task completion time, are used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process. The HSI design needs to be at a phase where design concepts exist for display formats, navigation aids, and display selection mechanisms. A set of detailed workstation displays is needed to provide meaningful display navigation tasks. Preliminary decisions regarding workstation display system hardware need to be made prior to these evaluations.

Test Bed Requirements:

Physical Form - The displays are representative of the style used in the AP600 HSI in terms of appearance, including display format and use of windows. The workstation displays are computer-based.

Information Content - Information developed for displays is sufficient to generate a significant number of workstation displays. Participants have to navigate through a substantial set of displays to establish a fair test. The plant parameters presented on the displays do not have to be AP600-specific. The values presented for plant parameters do not have to be realistic because the participants are required to retrieve but not interpret the data.

Dynamics - Static displays may be used. In some cases, display animation, such as blinking and flashing, may be used. The workstation display selection mechanisms need to be operational.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the WPIS and the workstation displays.

7.1.3 Evaluation Issue 3: Directed Search for Information within the Workstation Displays Based on a Request

Do the workstation displays support efficient navigation to locate specific information?


Relevant HSI Resources:

Workstation displays and display navigation features

Specific Concerns:

How accurately and efficiently can operators, when given a specific request, locate and select the correct workstation display?

What types of navigation errors are made with the workstation displays?

Approach

The workstation functional and physical displays are intended to support the operator in searching for specific parameter values and other indicators of plant status that are not part of the default displays. This is the case of directed search of the workstation displays. In many cases, this search is directed by a request from a supervisor (or other technical staff) or by a procedure. From this request, the operator navigates through the displays to determine the status of the requested process or parameter. This directed search must be efficient and not detract from other duties. The intent of this experiment is to test operators' ability to use the workstation display system to efficiently perform display navigation and selection tasks.

Participants are given a parameter or process name and asked to use the workstation displays to determine the current value and then return to the display from which they began.

Concept Testing Hypothesis

The workstation display system supports the operator in efficiently determining the current value of plant parameters and processes not represented in the default displays.

Experimental Manipulations

Manipulations involve the complexity of navigating through the display system. In some cases, the required navigation is brief, and in other cases, the most complex navigation is required.

Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate human factors issues related to navigation through large sets of displays. Qualitative information is gathered through protocol analysis or debriefing discussions with the participants. The intention is to identify characteristics of the design concepts that lead to confusion, errors, and slow or awkward actions by the subject.

Objective dependent measures may include:

How many displays are accessed before selecting the correct display or displays?

Which displays are selected and in what order (the navigation path)?

The degree to which the relevant information is located
The success in returning from search to a designated location
Time required to complete the task

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of display navigation aids. These aids are for the workstation displays, including display hierarchy/network structure, menu design, cross-references between displays, audit trails of display navigation paths, display space "landmarks" and orientation aids, content overlap of related displays, and user interface mechanisms for display selection.

The qualitative information gathered through protocol analysis or debriefing discussions is analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Functional requirements are developed to address those design characteristics that have significant effects on the participants' performance of the display navigation task.

The quantitative measures of the participants' performance may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design is at a phase where design concepts exist for display formats, navigation aids, and display selection mechanisms. Preliminary decisions regarding display system hardware need to be made prior to conducting this evaluation.

Test Bed Requirements:

Physical form - The displays are representative of the style used in the AP600 HSI in terms of appearance, including display format and use of windows. The workstation displays are computer-based.


Information content - Information developed for displays is sufficient to generate a significant number of workstation displays, since participants must navigate through a substantial set of displays to establish a fair test of the display system. The plant parameters presented on the displays do not have to be AP600-specific or realistic.

Dynamics - Static displays may be used. In some cases, display animation, such as blinking and flashing, may be used. The workstation display selection mechanisms need to be operational.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation displays.

7.1.4 Evaluation Issue 4: Maintaining Crew Awareness of Plant Condition

Do the HSI features effectively support crew awareness of plant conditions?

Relevant HSI Resources:

WPIS
Workstation displays
Paper-based/computer-based operating and administrative procedures

Specific Concerns:

Does the HSI:

Support the operating crew in maintaining awareness of plant conditions and their implications?

Support the crew in maintaining awareness of each other's actions, intents, and information needs?

Support effective and efficient shift turnover?

Support new personnel entering the MCR to develop an awareness of plant conditions and their implications?


Approach

This evaluation addresses three situations for crew awareness:

Orientation of a new person entering the MCR
Shift turnover
Ongoing detection and monitoring by the crew

Ongoing detection and monitoring by the crew requires that the crew members maintain awareness of plant conditions and their implications for operational goals. It also requires that crew members be aware of information that is relevant to other operators' responsibilities.

The design of the HSI supports each operator in:

Detecting and monitoring parameters relevant to his own task
Identifying parameters that are relevant to other crew members
Checking that those parameters relevant to other crew members are being addressed

In this evaluation, participants carry out the activities associated with a new person entering the MCR, shift turnover, and ongoing detection and monitoring using defined plant scenarios.

Concept Testing Hypothesis

The HSI supports the crew in maintaining awareness of the plant condition.

Experimental Manipulations

Tests are conducted for normal, abnormal, and emergency plant states using defined scenarios. Plant conditions include:

Normal states, plant maneuver in progress
Normal states, with certain equipment indicated as unavailable
Normal states, with regular changes in actuation and termination of automated systems
Normal states, with parameters trending toward abnormal
Outage state, for tag-outs or tests in progress

Abnormal states
Emergency states

Participants walk through these scenarios using a test bed that may consist of static displays and mockups of the workstation consoles and the WPIS. Alternative design concepts may be tested for displays, workstation console, the WPIS, and the relative position of these components in the main control area. Factors that affect crew awareness include:

Display content and format
Operator's view of the WPIS
Operator's view of other operators and their workstations
Operator's ability to communicate data verbally
Operator's ability to communicate data by other means

Arrangements of workstation consoles and the WPIS may affect the operators' ability to view the WPIS, other operators, and their workstations and to communicate verbally. The effect of these arrangements on crew awareness of the plant is evaluated. The effect of alternative display concepts on crew awareness of the plant also may be evaluated.

Dependent Measures and Evaluation Criteria

Qualitative information is gathered using protocol analysis or debriefing discussions with the participants. The intent is to identify characteristics of the design concepts that lead to confusion, errors, and slow or awkward use. In addition, participants assess plant conditions at the end of each trial to evaluate their degree of understanding of plant condition.

Additional measures to collect for the tests of shift turnover may include (a scoring sketch follows the lists below):

Time to complete shift turnover
Number of required plant parameters addressed
Number and types of omission errors made
Accuracy errors made in reviewing plant parameters

Additional measures to collect for ongoing detection and monitoring may include:

Identification of plant parameters relevant to the operator's responsibilities
Identification of plant parameters relevant to the responsibilities of other crew members
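For the shift turnover measures listed above, omissions and accuracy errors can be tallied against a checklist prepared from the scenario definition. The Python sketch below is one possible scoring scheme; the data structures, names, and tolerance are assumptions for illustration only.

    def score_turnover(required_params, reviewed, scenario_values, rel_tol=0.02):
        """Tally turnover measures against a predefined checklist.
        required_params -- parameters that must be addressed at turnover
        reviewed        -- dict of parameter -> value stated during turnover
        scenario_values -- dict of true values from the scenario definition
        (All structures are illustrative assumptions.)"""
        addressed = [p for p in required_params if p in reviewed]
        omissions = [p for p in required_params if p not in reviewed]
        # Accuracy errors: stated value deviates from the scenario value by
        # more than the allowed relative tolerance.
        accuracy_errors = [
            p for p in addressed
            if abs(reviewed[p] - scenario_values[p]) > rel_tol * abs(scenario_values[p])
        ]
        return {
            "n_parameters_addressed": len(addressed),
            "omission_errors": omissions,
            "accuracy_errors": accuracy_errors,
        }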

Implications of Results

The primary purpose of this evaluation is to contribute to the development of functional requirements for:

The design of the workstation and WPIS displays
The design and layout of the workstation consoles and the WPIS


Functional requirements related to the design of the workstation and WPIS displays address the organization and format of plant data for supporting crew awareness of plant condition.

These functional requirements address the design of the summary and default displays and the presentation order of the data for shift turnover. Functional requirements related to the design and layout of the workstation consoles and the WPIS address design characteristics that support the communication of information. These functional requirements address operator headphones to facilitate communication, design of workstation consoles to support use by two or more people, and mechanisms for coordinating views of the data among operators.

This evaluation also contributes to the development of functional requirements for the operating and administrative procedures and logs that contribute to coordinating information among crew members.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where design concepts exist for display formats and content, workstation console design, WPIS design, and MCR layout. Preliminary decisions regarding display system hardware need to be made prior to conducting this evaluation.

Test Bed Requirements:

Physical form - The displays are representative of the style used in the plant HSI in terms of appearance, including display format and use of windows. The workstation displays are computer-based.

Information content - Information developed for displays needs to be sufficient to generate a significant number of workstation displays. The plant parameters presented on the displays do not have to be AP600-specific. The values presented for plant parameters must be realistic.

Dynamics - A dynamic simulation of plant behavior is not required. Static displays may be used. For tests of ongoing plant monitoring, a series of static displays may be used. Display animation, such as blinking and flashing, may be used. The workstation display selection mechanisms need to be operational.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation functional and physical displays.

7.2 EVALUATIONS FOR INTERPRETATION AND PLANNING

The purpose of evaluations in this subsection is to provide confidence that the HSI design supports situation assessment and response planning. The focus is on situations that require a response to plant disturbances and significant plant accidents. Responding to plant disturbances covers the full range of cognitive processing stages. The emphasis of this set of evaluations is on identifying plant disturbances, assessing their implications for plant functions and goals, and selecting, evaluating, and, if necessary, adapting a recovery procedure.

The following set of evaluations is designed to test whether the HSI features, individually and in combination, support operator response to single-fault, multiple-fault, and severe accident events. They test the ability of the HSI to support both rule-based and knowledge-based performance, including supervisory control of automated systems during emergencies. In addition, they address the ability of the HSI to support crew problem solving and coordination during plant disturbances.

Evaluation issues are the following:

Issue 5: Does the alarm system convey information in a way that enhances operator awareness and understanding of plant conditions?

Issue 6: Does the physical and functional organization of plant information on the workstation displays enhance diagnosis of plant condition and the planning/selection of recovery paths?

Issue 7: Does the integration of the alarms, WPIS, workstation and procedures support the operator in responding to single-fault events?

Issue 8: Does the integration of the alarms, WPIS, workstation and procedures support the operator in interpretation and planning during multiple-fault events?

Issue 9: Does the integration of the alarms, WPIS, workstation and procedures support the crew in interpretation and planning during multiple-fault events?


Issue 10: Does the integration of the alarms, WPIS, workstation and procedures support the crew in interpretation and planning during severe accidents?

7.2.1 Evaluation Issue 5: Detecting and Understanding Disturbances Using Alarms

Does the alarm system convey information in a way that enhances operator awareness and understanding of plant condition?

Relevant HSI Resources:

Overview alarm system
Alarm support displays

Specific Concerns:

Does the alarm system overview organize alarm messages in a way that facilitates the operator's understanding of the alarm state and its implications for the plant's operational goals?

Does the presentation format, including visual coding techniques intended to establish relative salience (that is, salience coding), enable rapid detection and interpretation of alarm messages?

Does the alarm system prioritization scheme facilitate the operator's understanding of the relative importance of alarm conditions?

Does the alarm system enable operators to identify and interpret the implications of lower priority alarms?

Approach

The assumption of this evaluation is that a well-structured alarm system presents the most important alarm messages and organizes alarms in a way that is meaningful to the operator.

Redundant and less important messages do not appear. Participants should be able to perceive the alarm messages in patterns related to types of plant faults, recognize high-priority goal violations, and remain aware of the number and general content of lower priority alarms.

A time-step sequence of alarm patterns, corresponding to an evolution of an accident scenario, is presented to a subject. The subject is asked to indicate the alarm messages that are presented, and their priority for response (from most to least important). The subject is also asked to describe the implications of the alarms to plant safety and productivity goals and any causal interrelationship among alarms. Next, the subject is asked to identify other alarm conditions that may have existed but were not displayed because the prioritization scheme suppressed them. Lastly, the display of lower priority alarms is presented, and the subject is asked to describe the implications of these alarms.

Concept Testing Hypothesis

The alarm system supports identification and prioritization of alarm messages.

Experimental Manipulations

In the concept testing phase, the alarm pattern for each time step is presented for a fixed length of time and then removed. The premise is that if the alarm system is well organized, it results in meaningful alarm patterns. An individual rapidly identifies the alarms present and recalls them once they are removed. (A sketch of such a fixed-exposure trial follows.)
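One way to drive such a trial is a simple loop that renders each time step's alarm pattern for a fixed exposure, blanks the display, and then collects the subject's recall. The console-based Python sketch below is illustrative only; the exposure time, rendering, and response capture are placeholders for the actual test bed.

    import time

    def run_recall_trial(scenario_steps, exposure_s=10.0):
        """Present each time step's alarm pattern for a fixed interval,
        remove it, then record the subject's recall (illustrative only)."""
        recalls = []
        for step, pattern in enumerate(scenario_steps):
            print(f"--- time step {step} ---")
            for message in pattern:
                print(message)
            time.sleep(exposure_s)      # fixed presentation interval
            print("\n" * 40)            # remove the pattern from view
            response = input("Alarms observed, in priority order: ")
            recalls.append([m.strip() for m in response.split(",")])
        return recalls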

Underlying plant upsets vary in severity from single malfunction events to multiple-failure events. Alarm messages vary in number and level of abstraction (such as from equipment state to goal state). Upsets include:

Cases where a single fault leads to a cascade of alarms, where the objective is to determine whether the subject can correctly assess the interrelation between the original fault and the consequent disturbances

Multiple-fault cases, where the objective is to determine the ability of the subject to identify, prioritize, and track the implications of multiple, functionally unrelated alarms

Cases where lower priority alarm queues of varying types and number exist

Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate the effectiveness of the alarm organization and prioritization scheme in supporting operators in identifying, prioritizing, and assessing the implications of alarms. The intention is to identify characteristics of the design concepts that lead to confusion, missed alarms, misinterpretation of alarm messages, misinterpretation of interrelation among alarms, or incorrect/incomplete understanding of alarm priorities.


This is assessed through objective performance measures, as well as the participant's subjective assessment obtained during debriefing interviews.

Objective dependent measures may include:

The number of alarms correctly identified
The subject's assessment of alarm priorities compared to the priorities that are assigned during the development of the test scenario
The extent to which the implications of the alarms for present and future plant state are correctly assessed
The extent to which the causal interrelation among alarms is recognized
The ability to infer lower priority alarms
The ability to interpret displayed and lower priority alarms

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the alarm system. Quantitative measures (such as the number of alarm messages identified and successful identification of alarm implications) may be used to evaluate and refine alarm organization and prioritization concepts. Issues include organization of alarms, number of slots available in parallel for alarm messages, alarm prioritization rules used, and meaning of alarm messages. Participants' comments regarding lower priority alarm messages may be used to assess the alarm prioritization schemes. For example, if operators indicate that the lower priority alarms contain important information that needs to be more available, then the alarm prioritization scheme may be revised. Qualitative results (such as comments regarding salience coding, alarm message format, and lower priority alarm queue format) may also be used to refine display format functional requirements.
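The kind of organization and prioritization scheme at issue can be pictured with a small sketch: candidate alarms pass through suppression rules that remove redundant consequences, and the highest-priority survivors fill a fixed number of parallel display slots, with the remainder held in the lower priority queue. The rule, field names, and slot count below are illustrative assumptions, not the AP600 alarm system design.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Alarm:
        alarm_id: str
        priority: int                    # 1 = most important
        caused_by: Optional[str] = None  # id of a causally prior alarm

    def prioritize(candidates, n_slots=8):
        """Illustrative scheme: suppress alarms that are direct consequences
        of another active alarm, fill the display slots in priority order,
        and queue the remainder as lower priority alarms."""
        active = {a.alarm_id for a in candidates}
        survivors = [a for a in candidates
                     if a.caused_by is None or a.caused_by not in active]
        survivors.sort(key=lambda a: a.priority)
        return survivors[:n_slots], survivors[n_slots:]  # displayed, queued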

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where preliminary design concepts, with respect to alarm system organization and alarm priority rules, exist.


Test Bed Requirements:

Physical form - The alarm display test bed accurately reflects the spatial organization of alarm messages, message wording, and alarm prioritization rules. The alarm display may be presented on cathode ray tubes (CRTs), plasma panels, or on paper.

Information content - The information content is essential for evaluating the usefulness of the alarm messages and prioritization schemes. Only a subset of the alarm messages is required for this test to be performed. However, a complete set of alarm messages must exist for each plant upset condition tested. The information content need not be from the AP600, but should be representative of the AP600.

Dynamics - A series of static displays corresponding to discrete time steps during an accident scenario may be used. A dynamic plant simulation is not needed to drive the alarm system.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the alarm system concept.

7.2.2 Evaluation Issue 6: Interpretation and Planning Using Workstation Displays

Does the physical and functional organization of plant information on the workstation displays enhance interpretation of plant condition and the planning/selection of recovery paths?

Relevant HSI Resources:

Physical and functional displays of operator workstation

Specific Concerns:

Do the functional displays support the operator in assessing goal satisfaction?

Do the functional displays support the operator in assessing whether currently active processes are performing correctly?

Do the functional displays support the operator in assessing whether automated systems are performing correctly?


Are the implications of plant state for operational goals conveyed effectively via functional displays?

Do the displays support the operator in identifying the plant condition that caused an alarm?

Do the displays support the operator in understanding interrelations among systems and processes?

Do the displays support operator understanding of interrelations among observed disturbances due to process interactions?

Do the displays support the operator in assessing validity of data?

Is equipment status conveyed effectively via the physical displays?

Do the functional displays support the operator in assessing the availability of alternative processes for achieving a given goal (success path monitoring)?

Do the functional displays support the operator in making choices among alternative processes (success path choice)?

Do the displays support the operator in assessing the effect of the selected recovery path on other plant goals (side effects)?

Are operators able to effectively coordinate physical and functional displays?

Approach:

The purpose of this evaluation is to determine whether operators can efficiently extract the necessary information from the physical and functional displays of the operator workstation.

An individual is asked to interpret, track, and indicate a response strategy for an evolving plant upset by examining workstation displays. In the concept testing phase, the displays may be static representations that correspond to discrete time steps through the evolving upset. A set of probe questions is used to test the ability of participants to extract information from the displays.

Initial plant conditions are described to the subject. The subject is then presented with an alarm message, either verbally or via a static display. The subject is then taken through a series of discrete time steps through the evolving plant upset. At each time step, the subject accesses physical and functional displays to answer a set of questions about plant state and its implications.


Concept Testing Hypothesis

The functional and physical displays support the operator in interpreting plant state and planning recovery action.

Experimental Manipulations

Underlying plant upsets vary in severity from single-fault events, for which diagnosis and planning is straightforward, to multiple-fault events, for which diagnosis and recovery planning is complex.

Upsets involving complex diagnosis include multiple-failure mode accidents in which important plant indications are disguised or obscured. Upsets involving complex recovery path planning require monitoring of side effects to evaluate undesirable effects on other parts of the plant (conflicting goals). Cases include sensor failures and invalid data, as well as automated system failures that require decisions regarding manual intervention. Alternative display concepts may be tested and compared.

Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate the ability of the physical and functional displays to support interpretation and planning. Qualitative information is gathered through protocol analysis of participants' comments during the testing and debriefing interviews following the test. Responses to questions about the plant state and status of operational goals are also evaluated.

Questions to participants may include their perceptions of:

Existing plant disturbances
Causes and interrelations among these disturbances
Consequences of these disturbances for plant operational goals
Alternative processes available to achieve plant operational goals
Status of automated systems and whether manual intervention is required
Competing goals that need to be satisfied (such as side effects)
Appropriate recovery actions that must be taken

The participants' responses are compared to a predefined set of correct responses.


Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the workstation functional and physical displays. The philosophy behind the functional and physical display system is based on Rasmussen's abstraction hierarchy. A major goal of the display system is to support functional reasoning and knowledge-based decision-making.

The objective is to reduce errors such as fixation effects, in which operators concentrate on one set of symptoms to the exclusion of other more relevant symptoms; and missing side effects, in which operators fail to notice that their chosen recovery strategy may have negative consequences for other plant systems. A primary focus of concept testing is to provide feedback on the effectiveness of the display system in fostering a broad view and supporting knowledge-based reasoning. Specific attention is paid to cases where operators make errors in plant state assessment (such as fixation errors); cases where operators fail to understand the status of automated systems and/or to anticipate automatic system actions; and cases where they make planning errors (such as missing side effects). These are used to test and revise the functional requirements for the display system.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where preliminary designs for content and layout of the functional and physical displays are available.

Test Bed Requirements:

Physical form - The plant displays are generated on a VDU screen using rapid display prototyping software. Some animation is displayed. However, other static media, such as color drawings, could be used. Display representations convey design features such as salience coding and grouping of data.

Information content - Displays for the plant functions and systems that are relevant to the plant upsets used in the study need to be prepared. For the issues being tested, the displays need not be AP600-specific.

Dynamics - Because plant upsets may be presented as a series of discrete time-steps, a near full-scale simulation of plant dynamics is not required.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation functional and physical displays.


7.2.3 Evaluation Issue 7: Interpretation and Planning During Single-Fault Events Using Alarms, Workstation, WPIS, and Procedures

Does the integration of the alarms, WPIS, workstation, and procedures support the operator in responding to single-fault events?

Relevant HSI Resources:

WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures

Specific Concerns:

Does the integration of alarms, displays, controls, and procedures support the operator in:

Obtaining detailed information concerning alarm messages
Retrieving the appropriate procedure in response to plant condition
Performing actions indicated in procedures
Assessing goal threats and achievement

Approach

The purpose of this test is to provide confidence that operators can use the alarms, procedures, displays, and controls as intended to respond to straightforward, single-fault plant upsets. Operator response is primarily procedure-based. In Rasmussen's terminology, this corresponds to rule-based behavior. This experiment focuses on the performance of individual operators (such as a reactor operator) and does not focus on the interaction of multiple operators. The study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling emergency events. The subject is presented with an alarm or set of alarms. Then the HSI is used to select the appropriate procedure, select the appropriate plant displays, and execute the procedure.

Concept Testing Hypothesis

The integrated HSI supports operators in handling single-fault events.


Experimental Manipulations

Test scenarios are based on a variety of plant faults for a variety of plant conditions (normal, abnormal, and emergency).

Dependent Measures and Evaluation Criteria

This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting operator response to single-fault plant upsets. This is assessed through objective performance measures, as well as the participant's subjective assessment obtained during debriefing interviews.

Subject decisions and actions are analyzed using decision tracing and analysis of task completion time. The evaluation focuses on errors of intent and execution for both control/display navigation and plant control.

The participants' performance in responding to the plant upset is compared to an ideal response path defined by experts. Performance may be assessed in terms of (a comparison sketch follows this list):

Successful task completion (such as selection of proper procedures and displays, and the proper execution of procedures)

Task completion time

Errors (such as incorrect intentions and incorrect execution of actions)

Inefficiencies (such as delays or wasted actions, including excessive transitions between displays, induced by HSI design)
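A decision-trace comparison of this kind can be partially automated once the expert-defined ideal response path is written down as an ordered list of expected actions. The Python sketch below shows one simple way to extract omissions, wasted actions, and repeat display visits from a logged action sequence; the log format and names are illustrative assumptions, not the actual analysis procedure.

    def compare_to_ideal(observed_actions, ideal_path):
        """Compare a logged sequence of operator actions (display selections,
        procedure steps, control actions) against the expert-defined ideal
        response path (illustrative only)."""
        ideal_set = set(ideal_path)
        omitted = [a for a in ideal_path if a not in observed_actions]
        wasted = [a for a in observed_actions if a not in ideal_set]
        # A rough indicator of excessive transitions: repeat selections
        repeat_visits = len(observed_actions) - len(set(observed_actions))
        return {
            "task_completed": not omitted,
            "omitted_steps": omitted,
            "wasted_actions": wasted,
            "repeat_visits": repeat_visits,
        }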

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements to support the integration of the different HSI features. The focus is on the points of interface among the different HSI features and how effectively they work in combination to support rule-based performance.

The results are analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Functional requirements are developed to address those design characteristics that have significant effects on the subject's performance.


Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays, and procedures.

Test Bed Requirements:

Physical Form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (for example, mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale; for example, the WPIS and alarm system could be simulated on a VDU.

Information Content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - Static displays may be used. The displays need not be AP600-specific.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.

Performance Testing

Verification

Design features of the hardware and displays are examined and evaluated against functional requirements using a checklist-type procedure. This evaluation focuses on the functional requirements that were defined to support integration of HSI features, especially those developed during the concept testing phase of this evaluation. This test is conducted with equipment that emulates production prototype hardware for the workstation. Deviations from the functional requirements are documented and then evaluated.
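A checklist-type procedure of this kind reduces to walking a list of functional requirements and recording any deviation for later evaluation. The Python sketch below assumes a hypothetical requirement list and inspection callback; it is illustrative only, not the actual verification procedure.

    def run_verification_checklist(requirements, satisfied):
        """Walk the checklist; 'satisfied' is a callable returning True if
        the prototype equipment meets a requirement. Deviations are
        collected for documentation and evaluation (illustrative only)."""
        deviations = []
        for req_id, description in requirements:
            if not satisfied(req_id):
                deviations.append((req_id, description))
        return deviations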


Validation

This test is a validation that the integrated HSI supports trained operators in responding to single-fault events.

Requirement: Prompt and correct interpretation of alarm messages
Measures:
Operator report of fault and implications
Task completion time

Requirement: Prompt retrieval of detailed information from workstation regarding alarm messages
Measures:
Successful retrieval of required information
Information retrieval time

Requirement: Prompt and correct selection of procedure
Measures:
Successful retrieval of procedure
Procedure selection time

Requirement: Prompt and correct selection of controls and displays
Measures:
Successful retrieval of controls and displays
Control and display selection time

Requirement: Prompt and correct assessment of goal threats and goal achievement
Measures:
Operator assessment of goal threats and goal achievement
Task completion time


For each scenario used in the validation study, a description is created of how the operator must respond to the event. It includes a description of the alarms that are identified, how they are interpreted, what workstation displays are accessed, what conclusions about plant state and implications for operational goals are drawn, what procedures must be accessed, and what control actions are taken. Operator performance is compared against this description. The performance criterion is the correct response. Criteria for task completion time are determined at a later point.

Experimental Manipulations

The types of plant upsets presented are the same as in the concept testing phase.

Stage of Development of the HSI

This test is conducted after the design of the WPIS, alarm system, and workstation hardware, software, and information content has been completed. This test is conducted using a near full-scope simulator consisting of equipment that emulates the HSI hardware.

Test Bed Requirements:

Physical form - The hardware emulates HSI equipment in the relevant respects.

Information content - The information content of the HSI is representative of AP600 interfaces in content and format.

Dynamics - A high-fidelity, near full-scope AP600 MCR simulator is used.

Participant Characteristics (Validation)

Participants are experienced operators who have a basic understanding of the AP600 control requirements. They also have familiarity with the operation of the AP600 HSI.

7.2.4 Evaluation Issue 8: Interpretation and Planning During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures

Does the integration of alarms, WPIS, workstation, and procedures support the operator in interpretation and planning during multiple-fault events?



Relevant HSI Resources:

WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures

Specific Concerns:

Does the integration of alarms, displays, and procedures support the operator in:

Diagnosing multiple-fault plant conditions

Planning/selecting the most appropriate recovery path when multiple safety goals need to be considered

Assessing the effect of the selected recovery path on other plant goals (side effects)

Supervising automated systems and determining when manual intervention is required

Approach

The purpose of this test is to provide confidence that operators can use the alarms, procedures, displays, and diagnostic aids to select and maintain the appropriate response path in multiple-fault situations. The test assesses:

System understanding for diagnostically complex cases
Success path planning for cases where the recovery path is complex

Operator response is guided by emergency response procedures, although knowledge-based skills are required for interpreting plant status indications and for evaluating the performance of automatic control systems, the validity of process data, alternative response paths, and the effectiveness of the current procedure.

This test focuses on the performance of individual operators (such as a reactor operator) and does not focus on the interaction of multiple operators. The study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling emergency events.

The subject(s) is presented with a complex alarm condition and selects and executes the appropriate response using procedures and displays of the HSI.


Concept Testing Hypothesis

The integrated HSI supports operators in handling multiple-fault events.

Experimental Manipulations

A variety of multiple-fault plant conditions are included to test:

System understanding in diagnostically complex cases (such as masked symptoms and obscured evidence)

Success path planning in complex cases (such as complex constraints, side effects, and conflicting goals)

Ability to provide supervisory control of automatic control systems, to assess when intervention is required, and to take over effectively

Dependent Measures and Evaluation Criteria

This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting operator response to multiple-fault plant upsets. Particular attention is focused on the ability of the integrated HSI to support knowledge-based reasoning. This is assessed through objective performance measures, think-aloud protocols during task performance, and the participant's subjective assessment obtained during debriefing interviews.

Subject decisions and actions are analyzed using decision tracing and analysis of task completion time.

The subject's performance in responding to the plant upset is compared to an ideal response path defined by experts. Performance may be assessed in terms of:

Successful task completion (such as selection of proper procedures and displays, and proper execution of the procedure)

Task completion time

Errors (such as incorrect intentions and incorrect execution of actions)

Inefficiencies (such as delays or wasted actions, including excessive transitions between displays, induced by HSI design)

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the integrated HSI to support operator response to multiple-fault events. The focus is on the points of interface among the different HSI features and how effectively they work in combination to support knowledge-based performance.

The results are analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Particular attention is paid to the interpretation of plant state and response planning by the participants. Instances of errors of intention are analyzed in detail to determine HSI characteristics that might have contributed to the error and improvements that could be made to the HSI to reduce this type of error.

This evaluation leads to the following types of recommendations:

Ways of presenting alarm and procedure information that assist operators in determining the appropriate priority of multiple alarm messages

Ways of presenting information on the WPIS, the workstation displays, and in the procedures to reduce the likelihood of operator fixation on a single fault

Ways of presenting information on physical and functional plant displays, the WPIS, and in procedures to assist operators in determining the cause and consequences of plant component malfunctions

Ways of presenting alarm and procedure information that assist operators in determining appropriate goals for plant recovery

Ways of presenting information on physical and functional plant displays, the WPIS, and in procedures to maintain operator awareness of side effects (consequences of a plant recovery path that may violate other safety goals)

Ways of presenting information on physical and functional plant displays, the WPIS, and in procedures to support operator supervisory control of automated systems, and to assist operators in identifying when manual intervention is required

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays, and procedures.

Test Bed Requirements:

Physical form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (such as mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale. For example, the WPIS and alarm system could be simulated on a VDU.

Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - Static displays may be used. The displays need not be AP600-specific.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.

7.2.5 Evaluation Issue 9: Interpretation and Planning by Crew During Multiple-Fault Events Using Alarms, Workstation, WPIS, and Procedures

Does the integration of alarms, WPIS, workstation, and procedures support the crew in interpretation and planning during multiple-fault events?

Relevant HSI Resources:

WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures


Specific Concerns:

Does the integration of alarms, displays, and procedures support the crew in:

Communicating relevant plant state information
Developing and maintaining a shared understanding of plant state
Allocating and coordinating goals and responsibilities
Maintaining awareness of the goals and activities of other crew members
Detecting performance errors of other crew members
Engaging in group problem solving
Maintaining successful role separation (that is, the supervisor is able to maintain a broad view while leaving the detailed monitoring and control activities to control operators)

Approach

The purpose of this test is to provide confidence that the integrated HSI supports crew communication and coordination in responding to multiple-fault situations. This study examines crew performance on the same types of plant upsets described in Evaluation Issue 8. The difference is that the emphasis of this study is on crew interaction and joint problem-solving. The study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling emergency events. The subject(s) is presented with a complex alarm condition and selects and executes the appropriate response using procedures and displays of the HSI.

Participants are presented with complex multiple-fault events to test several facets of crew communication and coordination. The primary experimental manipulations are:

The type of event presented

Whether each of the individuals forming a crew are "participants," or whether only one is the subject and the others are confederates whose actions are determined by scripts

Whether the crews are observed responding to the event uninterrupted, or whether the simulation is frozen at specified points and the participants are asked questions relating to their knowledge of plant state, the activities of the other individuals, and the implications for safety goal achievement

Concept Testing Hypothesis

The integrated HSI supports crew communication and coordination during multiple-fault events.

Experimental Manipulations

During concept testing, two experimental conditions are used. In one condition, multiple individuals participate as a crew in the study, and their interaction and coordination are observed. This condition is more realistic. The second condition is more controlled. In the second condition, one individual is the subject of the study. One or more additional individuals are used to complete the crew, but these additional individuals are part of the experiment team (that is, experiment confederates). Their actions are determined by a script designed to create critical crew interaction situations with the individual who is serving as the subject. For example, a confederate might fail to take an action or may take an action that is incorrect. In this second condition, the question of interest is whether the subject detects the error, brings it to the attention of the confederate, and attempts to resolve the situation.

At various points in the event, the simulation is frozen, and the participants in the study are asked a series of questions designed to assess (a scoring sketch follows this list):

Awareness of plant state
Awareness of the response plan being followed
Awareness of the activities of the other operator(s)

Awareness of the goals and activities of other crew members
Awareness of the impact of the activities of the other operator(s) on their activity, and vice versa
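Responses to these freeze probes can be scored against the scenario's ground truth at each freeze point, giving an awareness score per category (plant state, response plan, other operators' activities). The Python sketch below is one possible scoring routine; the question format and category names are illustrative assumptions, not the actual study instrument.

    def score_freeze_probe(responses, answer_key):
        """Score one simulation-freeze questionnaire. 'answer_key' maps a
        question id to its correct answer and awareness category; returns
        the fraction correct per category (illustrative only)."""
        tallies = {}
        for qid, (correct_answer, category) in answer_key.items():
            hits, total = tallies.get(category, (0, 0))
            hits += int(responses.get(qid) == correct_answer)
            tallies[category] = (hits, total + 1)
        return {c: hits / total for c, (hits, total) in tallies.items()}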

Dependent Measures and Evaluation Criteria

This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting crew communication and coordination for interpretation and planning.

This is assessed through objective performance measures, think-aloud protocols during task performance, and the participant's subjective assessment obtained during debriefing interviews.


Subject decisions and actions are analyzed using decision tracing and analysis of task completion time. The subject's performance in responding to the plant upset is compared to an ideal response path defined by experts. Particular attention is focused on analysis of crew communication and coordination activities.

Objective dependent measures may include:

Whether relevant plant status information was communicated
Whether participants maintained a shared understanding of plant state
Whether participants successfully allocated and coordinated goals and responsibilities
Whether participants successfully maintained role separation
Whether participants were able to detect performance errors made by other crew members (such as errors intentionally made by confederates)
Whether participants engaged in group problem-solving and obtained consensus on interpretations and planned decisions

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the integrated HSI to support crew communication and coordination for interpretation and planning.

The results are analyzed to identify design features that lead to confusion, errors, and slow or awkward actions by the participants. Particular attention is paid to the ability of the integrated HSI to support development of a shared plant state interpretation, efficient task allocation and coordination, effective role separation, and group problem-solving and decision-making. Instances of breakdowns in communication or task coordination are analyzed in detail to determine HSI characteristics that might have contributed to the error and improvements that could be made to the HSI to reduce this type of error.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays and procedures.


Test Bed Requirements:

Physical form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (such as mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale. For example, the WPIS and alarm system could be simulated on a VDU.

Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - A dynamic simulation is required to drive the HSI. The plant simulation and displays need not be AP600-specific.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the AP600 HSI features.

7.2.6 Evaluation Issue 10: Interpretation and Planning by Crew During Severe Accidents Using the Technical Support Center, Alarms, Workstation, WPIS, and Procedures

Does the integration of the alarms, WPIS, workstation and procedures support the crew in interpretation and planning during severe accidents?

Relevant HSI Resources:

WPIS
Alarm system
Workstation displays
Computer-based and/or paper-based procedures

Specific Concerns:

Does the HSI present information in ways that support interpretation of plant state under degraded plant information conditions?


Does the HSI enable the crew to assess data quality and recognize when plant parameter measures are unreliable?

Does the integrated HSI support the formulation of a response strategy in cases where procedural guidance is not available?

Does the HSI encourage efficient use of information found inside and outside the MCR?


Does the HSI provide confidence of effective communication between crew members and personnel located outside the MCR (such as the technical support center)?


Does the HSI support effective group decision-making?

Approach

The purpose of this test is to provide confidence that the HSI design supports response to severe accident events. Severe accidents place increased cognitive demands on the crew in several respects. First, because plant sensors can become unreliable, interpreting plant state is more difficult. Second, conditions may arise beyond the scope of emergency operating procedures (EOPs), requiring a response strategy to be developed. In Rasmussen's terminology, this means that there is greater emphasis on knowledge-based performance during severe accidents. A third complication is a greater need for communication and coordination with a variety of personnel outside the MCR, including personnel in the technical support center and the offsite emergency response facility.

This study is performed using a crew size consistent with the AP600 MCR manning assumptions for handling severe accident events.

The participants are presented with a severe accident scenario. Additional personnel resources (such as the technical support center or the offsite emergency response facility) are added, based on the time-frame and manning assumptions for the AP600. Decision-trace methodology is used to trace the information access activities, communication, goal formulation, response strategy planning, and decision-making activities of the MCR crew and outside support personnel.

Concept Testing Hypothesis

The integrated HSI supports the coordination of people and information required for plant state interpretation and response strategy planning activities during severe plant accidents.



Experimental Manipulations

A variety of severe accident conditions are included to test:

System understanding in diagnostically complex cases (for example, masked symptoms and obscured evidence due to degraded sensors)

Success path planning in complex cases (such as complex constraints, side effects, and conflicting goals)

Communication and coordination between the MCR staff, the technical support center, and offsite emergency center staff

Dependent Measures and Evaluation Criteria

This evaluation uses a breadboard design to investigate the effectiveness of the integrated HSI in supporting interpretation and planning during severe accidents. Particular attention is focused on the ability of the integrated HSI to support knowledge-based reasoning. This is assessed through objective performance measures, think-aloud protocols during task performance, and the participant's subjective assessment obtained during debriefing interviews.

Subject decisions and actions are analyzed using decision-tracing and analysis of task completion time. The subject's performance in responding to the plant upset is compared to an ideal response path defined by experts.

Objective dependent measures may include:

Successful task completion (such as the selection of proper procedures and displays, and the proper execution of the procedure)

Task completion time

Errors (such as incorrect intentions and incorrect execution of actions)

Inefficiencies (such as delays or wasted actions, including excessive transitions between displays, induced by the HSI design)

Implications of Results The purpose of this evaluation is to contribute to the development of functional requirements for *.he integrated HSI to support severe accident management. The study identifies errors Evaluation Issues and Descriptions August 1996 m:\\3114w.wpf:1b-082296


and inefficiencies induced by the design of the HSI that may affect emergency response during severe accidents. This evaluation leads to the following types of recommendations:

Ways of presenting information that promote the efficient formation and testing of hypotheses regarding plant state

Ways of verifying that multiple information sources are used effectively

Ways of promoting effective communication between crew members and personnel located outside the MCR

Ways of enhancing group problem-solving, including understanding plant conditions, planning, and coordinating actions

Ways of promoting effective group decision-making

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where preliminary design concepts exist for the WPIS, the alarm system, workstation displays, and procedures. In addition, preliminary concepts for the technical support center and offsite emergency response center manning, responsibilities, and resources need to be available.

Test Bed Requirements:

Physical Form - The WPIS, alarm system, workstation displays, and computerized procedure displays are representative of the AP600 HSI in terms of appearance. This includes display format, use of windows, display navigation mechanisms, and links among the different HSI resources (such as mechanisms linking alarm messages to particular workstation displays or procedures). The HSI features need not be high-fidelity with respect to physical scale. For example, the WPIS and alarm system could be simulated on a VDU.

Information content - A set of displays is developed to cover the set of faults included in the test, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - A dynamic plant simulation is required to drive the HSI. The plant simulation and displays need not be AP600-specific.


Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants have familiarity with the AP600 HSI features.

7.3 EVALUATIONS FOR CONTROLLING PLANT STATE

The purpose of evaluations in this subsection is to provide confidence that the HSI supports the operator in making changes in the plant state, including:

Control activities that are operator-paced

Control tasks that require coordination of multiple procedures

Control activities that are event-paced

Control activities that require coordination among multiple individuals

Control activities that require consideration of preconditions, side effects, and post-conditions of control actions

The controlling plant state class of evaluation issues includes the following:

Issue 11: Do the HSI features support the operator in performing simple, operator-paced control tasks?

Issue 12: Do the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and post-conditions?

Issue 13: Do the HSI features support the operator in performing control tasks that require multiple procedures?

Issue 14: Do the HSI features support the operator in performing event-paced control tasks?

Issue 15: Do the HSI features support the operator in performing control tasks that require coordination among crew members?

7.3.1 Evaluation Issue 11: Simple Operator-Paced Control Tasks

Do the HSI features support the operator in performing simple, operator-paced control tasks?


Relevant HSI Resources:

Workstation displays and display navigation features

Soft controls

Computer-based and/or paper-based procedures

Specific Concerns:

Are the procedures well-coordinated with the workstation displays to allow efficient location and execution of control actions?

Do the workstation displays support the operator in efficiently locating relevant displays and executing control actions?

Are the soft controls provided in the workstation adequate for supporting operator execution of control actions (including providing adequate feedback on actuation of control actions)?

Approach

Control maneuvers (such as taking systems out of operation, or switching systems) represent a primary activity operators perform during normal and abnormal operations. The purpose of this test is to verify that the AP600 HSI can support operators in performing straightforward control maneuvers (that is, maneuvers that can be accomplished by a single operator, are operator-paced, and do not involve consideration of preconditions, side effects, or post-conditions). The HSI is evaluated by recording a number of performance measures

while participants attempt to perform a series of straightforward control tasks.

Concept Testing Hypothesis

The workstation displays and the soft controls provided in the workstation support operator execution of control actions. These controls minimize errors, provide appropriate feedback on control actuation, and allow the operator to quickly correct actions identified as erroneous.

Experimental Manipulations

Mechanisms for VDU-based (soft) controls are tested for various control actions (such as initiation/termination, tuning, or mode selection) and under varying task conditions, including the presence of time pressure and task distractions. Also, the coordination of controls with the displays that provide feedback for control actions is tested.


Dependent Measures and Evaluation Criteria

This evaluation uses breadboard designs to investigate human factors issues related to the selection of displays and controls and the execution of soft controls. Qualitative information is gathered through protocol analysis or debriefing of participants. The intention is to identify characteristics of the design concepts that lead to confusion, errors, and slow responses by participants in attempting to make an appropriate control action.

Objective dependent measures may include:

Efficiency of navigation - The number of displays traversed to locate a relevant display is compared to the ideal navigation path specified by design engineers.

Degree of coordination of displays and procedures - The number of shifts in displays to accomplish a procedure (particularly shifts back and forth between sets of displays; display thrashing) is recorded. This is compared to an optimal standard (such as each display supports several procedure steps; display shifts follow a logical progression and occur at logical breaks in procedure step grouping; and there is no display thrashing).

Number and type of execution errors - Ideally, number, type, and severity (that is, plant control versus navigation) of execution errors observed with the AP600 HSI are compared with number and type of execution errors observed for identical control tasks in a typical MCR (under identical conditions).

Ability to correct control actions not executed correctly.

Anthropometric problems with the soft controls (if any). Any problems locating, activating, or obtaining feedback on soft control activation are recorded.

Time required to complete task.
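Several of these measures reduce to simple computations over the logged sequence of displays a participant visits. A minimal sketch, assuming a hypothetical visit log and an ideal path supplied by the design engineers:

```python
def navigation_efficiency(visited, ideal_path):
    """Ratio of ideal path length to displays actually traversed (1.0 = ideal)."""
    return len(ideal_path) / max(len(visited), 1)

def thrashing_count(visited):
    """Count A-B-A patterns: immediate returns to the previous display,
    the 'display thrashing' signature described above."""
    return sum(1 for i in range(len(visited) - 2) if visited[i] == visited[i + 2])

# Hypothetical display identifiers for illustration.
visited = ["overview", "rcs", "overview", "rcs", "cvcs"]
ideal = ["overview", "rcs", "cvcs"]
print(navigation_efficiency(visited, ideal))  # 0.6
print(thrashing_count(visited))               # 2 back-and-forth returns
```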

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the physical and functional displays and the soft controls embedded within them. Specifically, the results may guide the design of the display navigation scheme, soft control representation, screen interaction devices, and control selection and actuation.

The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, errors, and slowness. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance on the control task.

The quantitative measures may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where design concepts exist for display formats, navigation aids, screen interaction devices, and soft control mechanisms. Preliminary decisions regarding display system hardware are made.

Test Bed Requirements:

Physical form - Computer-based displays are used to simulate the workstation displays. This includes soft controls that have high physical fidelity.

Information content - A set of workstation displays is developed to cover the set of control tasks tested, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - Static displays may be adequate (that is, no changes in parameter values over time). The only dynamic characteristics required are changes in the display required to provide feedback of soft control actuation (that is, that the control was actuated and that the desired change in plant state took place).

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants have familiarity with the operation of the workstation displays.

7.3.2 Evaluation Issue 12: Conditional Operator-Paced Control Tasks

Do the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and post-conditions?

Relevant HSI Resources:

Workstation displays

WPIS

Computer-based and/or paper-based procedures

Soft controls

Specific Concerns:

Do the WPIS and workstation displays support the operator in identifying violations of preconditions, side effects of control actions, and post-conditions that result from control actions?

Are the plant specifications (computer-based or paper-based) well coordinated with the workstation displays to allow efficient identification of violations of preconditions, side effects, and post-conditions that result from control actions?

Approach

Control maneuvers can become complicated when operators need to consider action preconditions, subtle side effects, and necessary post-conditions (such as an action that results in a violation of plant specifications). The purpose of this test is to verify that the AP600 HSI can support operators in performing control maneuvers where preconditions, side effects, and/or post-conditions need to be considered. The tagging out of plant components is an example of this type of situation.

The proposed approach to test these issues is to record a number of performance measures while participants attempt to perform a series of control maneuvers that require consideration of preconditions, side effects, and post-conditions.
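To make the scenario logic concrete, a seeded precondition or post-condition check for a tag-out action can be expressed as rules over a plant-state snapshot. The sketch below uses hypothetical component names and rules, not actual AP600 plant or Technical Specification requirements:

```python
# Hypothetical plant state for illustration; not actual plant data or TS rules.
plant_state = {
    "rhr_pump_A": "running",
    "rhr_pump_B": "standby",
    "mode": "cold_shutdown",
}

def tagout_violations(component, state):
    """Return descriptions of precondition/post-condition problems raised by
    tagging out `component`, given the current plant state."""
    problems = []
    if state.get(component) == "running":
        problems.append(f"precondition: {component} is running and must be stopped first")
    # Post-condition style check: would the redundant train still be available?
    redundant = {"rhr_pump_A": "rhr_pump_B", "rhr_pump_B": "rhr_pump_A"}.get(component)
    if redundant and state.get(redundant) != "standby":
        problems.append(f"post-condition: no operable redundant train ({redundant})")
    return problems

print(tagout_violations("rhr_pump_A", plant_state))
# -> ['precondition: rhr_pump_A is running and must be stopped first']
```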

Concept Testing Hypothesis

The WPIS and workstation displays support the operator in identifying violations of preconditions, side effects of control actions, and post-conditions that result from control

actions. Further, the plant Technical Specifications (TS) (computer-based or paper-based) are well coordinated with the workstation displays to support this same function.

Experimental Manipulations

This evaluation uses breadboard designs of the workstation displays and control displays to investigate human factors issues related to completing control tasks that can lead to violations. Participants are given control tasks such as tagging out a component.

Participants are not told whether a violation of plant specifications (or other violation) is likely. They have the WPIS displays, which provide them with an overview of current plant status, the workstation displays, and the plant specifications.

The evaluation manipulates the complexity of the control task in terms of the number of displays that must be accessed. The control tasks presented to participants sample from the following situations:

No violations occur from completion of the control task

Preconditions for performing the task are violated

Negative side effects for other ongoing activities occur through completion of the control task

Completion of the control task results in plant systems being unavailable and/or operational goals being violated

Completion of the task results in plant TS violations

Dependent Measures and Evaluation Criteria

The subject is asked to perform each of a series of control tasks. Objective dependent measures may include whether:

There are any violations of preconditions for performing the task

Performing the task results in negative side effects for other ongoing activities

Completion of the task results in plant systems being unavailable and/or operational goals being violated

Completion of the task results in plant TS violations

The primary dependent measures are the subject's ability to detect and, if possible, take action to avoid violations of any kind. Subject comments and reactions to the breadboard HSI features are also solicited during the debriefing following completion of the control tasks.

The intention is to identify characteristics of the design concepts that lead to confusion, difficulty, errors, or slowness. Other measures are the following:

Task completion time

Display navigation paths

Number of inappropriate displays selected

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the WPIS, the physical and functional displays, the controls and control displays, and the plant TS. Specifically, the results may guide development of functional

requirements to enable operators to maintain a broad perspective of the plant and the interrelations between their actions and other ongoing activities.

The qualitative information gathered from concept testing is analyzed to identify design features that lead participants to miss preconditions, side effects, or post-conditions associated with their control task. Functional requirements are developed to address those design characteristics that have significant effects on the participants' performance on the control task.

The quantitative measures are used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The HSI design needs to be at a phase where design concepts exist for the WPIS displays, workstation displays, navigation aids, plant TS, rudimentary procedures, and soft control mechanisms.

Test Bed Requirements:

Physical form - Computer-based displays are used to simulate the WPIS displays. Computer-based displays are also used to simulate the process data displays.

Information content - A set of workstation displays is developed to cover the set of control tasks tested, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - Static displays may be adequate (that is, no changes in parameter values over time). The only dynamic characteristics required are changes in the WPIS and workstation displays needed to indicate consequences of control actions on plant state. These can be simulated by replacing one static display with another. For example, after a control action, a new static WPIS display is presented that provides revised indications of plant state. The display need not be AP600-specific.


Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants must have familiarity with the operation of the workstation displays.

7.3.3 Evaluation Issue 13: Control Using Multiple, Simultaneous Procedures

Do the HSI features support the operator in performing control tasks that require multiple procedures?

Relevant HSI Resources:

Workstation displays

WPIS

Computer-based and/or paper-based procedures

Specific Concerns:

Does the design of the procedure display interface prevent operators from getting lost in nested procedures?

Does the design of display devices support the concurrent use of multiple independent (not nested) procedures?

Does the coordination of procedure displays with physical and functional displays allow effective use of plant displays during concurrent use of multiple procedures?

Approach

Operators may be required to access more than one procedure at a time. There are typically two general cases of multiple-procedure use. One case is the use of independent, concurrent procedures. For example, an operator may be involved in both an operating procedure and a maintenance procedure. The second case is the use of nested procedures, where the first procedure refers the operator to a second procedure. In this case, the operator typically completes the second procedure and then returns to complete the first procedure. Given that these cases exist, the design of the procedure display interface must allow operators to accomplish several feats during control tasks:

Perform steps of procedures with minimal disruptions due to manipulations and adjustments of other procedures and corresponding plant displays

Maintain an awareness of the dependent nature of nested procedures

Maintain an awareness of the procedures that are "open" and which steps remain to be completed

Therefore, success in these tasks requires considerable coordination between displays and procedures. The intent of this evaluation is to test an operator's ability to use multiple procedures in the two general cases described. Participants are given each multiple-procedure case and asked to work through the procedures. Performance is evaluated in terms of the subject's ability to maintain an awareness of the status of these procedures and their implications for plant state.
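The bookkeeping these cases imply (which procedures are open, which are nested, and which steps remain) can be made concrete with a small tracker: a stack for the nested chain plus a table of open procedures and remaining steps. A minimal sketch with hypothetical procedure names, intended only to illustrate the concept:

```python
class ProcedureTracker:
    """Track open procedures: a nested chain plus independent open procedures."""

    def __init__(self):
        self.open_procs = {}   # procedure name -> list of steps remaining
        self.nest_stack = []   # nested chain; top is the procedure in progress

    def open_independent(self, name, steps):
        self.open_procs[name] = list(steps)

    def open_nested(self, name, steps):
        # Referenced from the current procedure; caller resumes on completion.
        self.open_procs[name] = list(steps)
        self.nest_stack.append(name)

    def complete_step(self, name, step):
        self.open_procs[name].remove(step)
        if not self.open_procs[name]:            # procedure finished
            del self.open_procs[name]
            if self.nest_stack and self.nest_stack[-1] == name:
                self.nest_stack.pop()            # return to the referring procedure

    def status(self):
        return dict(self.open_procs)

# Hypothetical procedures for illustration.
t = ProcedureTracker()
t.open_independent("power_change", ["step1", "step2"])
t.open_nested("realign_system", ["stepA"])
t.complete_step("realign_system", "stepA")       # nested procedure done; stack pops
print(t.status())  # {'power_change': ['step1', 'step2']}
```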

Concept Testing Hypothesis

The procedures and workstation displays support the operator's ability to access and use multiple (independent and nested) procedures and to maintain an awareness of the status of these procedures and their implications for plant state.

Experimental Manipulations

This evaluation uses breadboard designs to investigate alternative procedure display selection concepts. Important characteristics may include bookmarks and other navigation aids; logical branch displays to identify open procedures and steps; control logic for accessing corresponding plant displays; and windowing features for depicting open procedures.

The approach for testing the case of multiple independent procedures is to have participants work through a plant procedure(s) (such as a normal operating procedure and maintenance or surveillance procedures) that is unrelated to the procedure currently accessed. For example, while the operator is executing a procedure for a change in plant power, he is asked to perform a surveillance procedure that requires a plant system to be realigned. The subject is allowed some flexibility for determining which procedure steps to perform first (that is, determining when to switch from one procedure to the other). The subject is asked, at various predefined points, to identify which procedures are open and the implications of these open procedures for plant state. Performance is evaluated in terms of correct access of procedures and correct response to questions regarding open procedures.

Inefficiencies in procedure and display use (such as excessive search and manipulation of displays) are noted.

The approach for testing the nested procedure case involves having participants work through a scenario that requires the use of nested procedures. The subject accesses the required procedure and a corresponding set of plant displays. Using the procedure and plant displays, the subject explains how each procedure step would be executed. The subject accesses other nested procedures as required. At various predefined points in the scenario, the subject is asked to identify which procedures are open and the implications of these open procedures for plant state. The participant's actions and comments are recorded on videotape. Performance is evaluated in terms of correct access of nested procedures and correct response to questions regarding open procedures. Inefficiencies in procedure and display use (such as excessive search and manipulation of displays) are noted.

After each test condition, the subject is debriefed to identify features of the procedure display system that made the task difficult. In the case of multiple independent procedures, the subject is questioned to determine whether constraints of the procedure display system affected the order in which the independent procedures were performed.

Dependent Measures and Evaluation Criteria

Qualitative results include assessment of difficulties, delays, and inefficiencies induced by the

design or performance of the procedure display system. Qualitative results also include participants' comments concerning characteristics of the display perceived to make the task difficult, and comments regarding how the design of the display system affected the way independent procedures were executed.

Objective dependent measures may include:

Number of errors made in procedure and display navigation and selection

Correct identification of "in-progress" procedures and steps awaiting completion

Subject responses are compared to a predefined set of most-correct responses. Performance is analyzed using protocol analysis to identify errors of intent (such as the subject identified the wrong procedure but retrieved it correctly) and errors of execution (such as the subject identified the correct procedure but made a mistake while retrieving it). The subject's stated intentions and actual behavior are recorded and analyzed to identify the causes of these errors.
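Once the subject's stated intention and actual retrieval are both recorded, the intent-versus-execution distinction described above can be coded mechanically. A minimal sketch, using hypothetical procedure identifiers:

```python
def classify_error(intended, retrieved, correct):
    """Classify one procedure-selection trial per the scheme above.

    intended:  procedure the subject said they were going to retrieve
    retrieved: procedure actually retrieved
    correct:   procedure the scenario called for
    """
    if retrieved == correct:
        return "correct"
    if intended != correct:
        return "error of intent"    # wrong procedure identified in the first place
    return "error of execution"     # right intention, mistake while retrieving

# Hypothetical procedure identifiers for illustration.
print(classify_error("ES-0.1", "ES-0.1", "ES-0.1"))  # correct
print(classify_error("ES-1.2", "ES-1.2", "ES-0.1"))  # error of intent
print(classify_error("ES-0.1", "ES-1.2", "ES-0.1"))  # error of execution
```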

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the procedures, particularly the computer-based procedures, and the physical and functional displays. The performance measures may be used to assess the relative merits of different procedure display concepts. The results may be used to identify procedure display concepts that lead to fewer navigation errors and a better awareness and understanding of the status of active procedure steps.


The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, difficulties, errors, and slowness. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance on the control task.

The quantitative measures are used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

Procedures need to be defined for the specific scenarios addressed. Operator actions for accessing additional procedures need to be defined. Plant displays corresponding to the procedure steps need to be established.

Test Bed Requirements:

Physical Form - Procedures, procedure selection, and retrieval aids included in the test bed are high-fidelity. Wording of text, labels, and titles is accurate. Character sizes and salience coding are well thought out and well executed. Menus are structured well. Procedures, procedure selection information, and plant displays are displayed in the same mode as in the MCR, either on VDUs or plasma panels.

Information Content - Scenarios for these tests are well defined and credible.

Procedures for these scenarios are complete, including full, properly formatted text for procedure steps. Clear criteria for referring the subject to other procedures are very important. Data values for plant parameters are credible, but need not be actual values derived from a simulation.

Dynamics - Computerized plant procedures must have as many operational properties as possible (such as scrolling, bookmarking, or electronic links between procedures). Individual static displays of the plant may be used. The human-machine interfaces for retrieving and displaying procedures and plant displays need to be operational. The displays need not be AP600-specific.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants have familiarity with the operation of the workstation displays.


7.3.4 Evaluation Issue 14: Event-Paced Control Tasks

Do the HSI features support the operator in performing event-paced control tasks?

Relevant HSI Resources:

Workstation displays

Soft controls

WPIS

Computer-based and/or paper-based procedures

Specific Concerns:

Do the workstation displays support the operator in locating relevant displays and executing control actions at a rate that allows the operator to keep pace with the event?

Are the computer-based (or paper-based) control procedures well coordinated with the workstation displays to allow the operator to keep pace with the event?

In cases where event dynamics are slow (that is, long response time to reach desired state for a step in the procedure), do the displays and computer-based control procedures allow the operator to go on to perform the next steps (that is, suspend a step) and return to complete the pending step at the appropriate time?

Do the soft controls support the operator in execution of control actions and evaluation of feedback in pace with the event?

Approach

Many operator activities performed during normal and abnormal operation involve dynamic control tasks (such as plant startup, plant mode changes, and load changes) where the operator activity keeps pace with plant dynamics. Keeping pace often refers to the operator's ability to execute actions quickly enough to stay ahead of the dynamics of a plant state progression. However, problems can also arise when the event moves slowly. In this case, the operator may have to suspend one procedural step (such as wait for a parameter value to reach a threshold) but not other steps subsequent to it. The operator may continue to complete steps subsequent to the suspended step. This decision creates a need for the operator to remember to complete a step that is no longer cued by the procedures.

Therefore, when the condition is satisfied (such as the value reaches the threshold), the operator returns to the step and executes it.
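A computer-based procedure system could support this suspend-and-return pattern by tracking the wait condition attached to each suspended step and re-cueing the operator when it is met. A minimal sketch of that bookkeeping, with hypothetical step and parameter names:

```python
class PendingStepMonitor:
    """Re-cue suspended procedure steps once their wait conditions are met."""

    def __init__(self):
        self.pending = []   # (step_id, parameter, threshold) tuples

    def suspend(self, step_id, parameter, threshold):
        # Operator moves on to subsequent steps; this one waits on the process.
        self.pending.append((step_id, parameter, threshold))

    def poll(self, plant_values):
        """Check current plant values; return steps that are now ready."""
        ready = [(s, p, th) for (s, p, th) in self.pending
                 if plant_values.get(p, float("-inf")) >= th]
        self.pending = [x for x in self.pending if x not in ready]
        return [s for (s, p, th) in ready]

# Hypothetical step and parameter names for illustration.
monitor = PendingStepMonitor()
monitor.suspend("step_12", "sg_level_pct", 50.0)
print(monitor.poll({"sg_level_pct": 44.0}))  # [] - still waiting
print(monitor.poll({"sg_level_pct": 51.5}))  # ['step_12'] - cue operator to return
```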


The purpose of this test is to verify that the AP600 HSI can support operators in performing event-paced control tasks. The proposed approach to test these issues is to have participants attempt to perform a series of event-paced control tasks and to record a number of performance measures.

Concept Testing Hypothesis

The workstation displays, the computer-based procedures, and soft controls support the operator in efficiently locating relevant displays and executing control actions in pace with process dynamics. In cases where event dynamics are slow, the operator is able to perform subsequent steps (that is, suspend a step) and return to complete the pending step at the appropriate time.

Experimental Manipulations

The proposed approach to test these issues is to have participants attempt to perform a series of event-paced control tasks. Two types of control tasks are used. The first type of control task involves rapid process dynamics (such as manual feedwater control during startup),

requiring skilled operator response to keep up with process dynamics. The second type of control task involves slow dynamics (processes with a long response time to reach the desired state) so that operators are required to initiate processes, go on to other activities, and then return to confirm that the process goal states are achieved and complete pending steps.

Dependent Measures and Evaluation Criteria

Qualitative results include assessment of difficulties, delays, and inefficiencies induced by the design or performance of the control and display systems. Qualitative results also include comments from the participants concerning characteristics of the displays or controls that were perceived to make the task difficult, and comments regarding how the design of the display system affected the way independent procedures were executed.

Objective dependent measures may include:

Ability of the operator to keep up with process dynamics (Are process goal states reached in adequate time? Are process limits exceeded?). Evaluation criteria require determination of time limits within which control tasks must be completed, and control boundaries that must not be exceeded (such as trip setpoints)

Number and types of errors of execution (Are steps omitted? Does the operator fail to return to complete pending steps? Are boundary limits exceeded?)

Whether operators detect and correct errors of execution when they occur

Anthropometric problems with the soft controls (if any)

Degree of coordination of displays and procedures - The number of shifts in displays to accomplish a procedure (particularly shifts back and forth between sets of displays; display thrashing) is recorded
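The keep-pace criteria above can be checked directly against the logged parameter trace for a trial: whether the goal band was reached within the time limit, and whether any trip setpoint was exceeded along the way. A minimal sketch with hypothetical values:

```python
def keep_pace_score(trace, goal, time_limit, trip_setpoint):
    """Evaluate one event-paced trial.

    trace: list of (t_seconds, value) samples for the controlled parameter
    goal:  (low, high) target band the operator must reach
    """
    low, high = goal
    # First time the parameter enters the goal band, if ever.
    goal_time = next((t for t, v in trace if low <= v <= high), None)
    return {
        "goal_reached_in_time": goal_time is not None and goal_time <= time_limit,
        "limit_exceeded": any(v >= trip_setpoint for _, v in trace),
    }

# Hypothetical feedwater-control trial for illustration.
trace = [(0, 30.0), (20, 42.0), (40, 49.0), (60, 51.0)]
print(keep_pace_score(trace, goal=(50.0, 55.0), time_limit=90, trip_setpoint=70.0))
# -> {'goal_reached_in_time': True, 'limit_exceeded': False}
```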

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the controls and the physical and functional displays. The performance measures may be used to assess the relative merits of different control and display concepts, including the following:

Information on the ease of locating relevant displays, controls, and procedures in pace with process dynamics, and improvements that may be needed

Information on whether the displays and procedures are well coordinated, allowing operators to keep pace with process dynamics, and improvements that may be needed

Effect of the HSI on errors of execution and the ability to detect and correct errors of execution, and improvements that may be needed

Adequacy of anthropometric characteristics of soft controls (such as size, shape, or saliency) for supporting event-paced control activities, and improvements that may be needed

The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, difficulties, errors, and slowness. Functional requirements are developed to address those design characteristics that are found to have significant effects on the participants' performance on the control task.

The quantitative measures may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The following components need to be available:

Workstation displays for the event-paced control tasks selected; a dynamic prototype of the workstation that includes a set of realistic displays and navigation mechanisms to test the adequacy of navigation

A dynamic prototype of the workstation that includes soft controls that have high physical form fidelity (that is, size, shape, saliency, actuation feedback characteristics)

Procedures (either paper-based or computer-based)

A high-fidelity plant simulation that models the plant dynamics for the event-paced control tasks selected is necessary to drive the displays. The displays need not be AP600-specific.

Test Bed Requirements:

Physical form - Computer-based dynamic displays are used to simulate the workstation displays. This includes soft controls that have high physical fidelity.

Information content - A set of workstation displays is developed to cover the set of control tasks tested, as well as to provide a set of realistic displays to test the adequacy of navigation.

Dynamics - A dynamic plant simulation is required to drive the workstation displays to simulate the plant dynamics involved in the event-paced control tasks selected. The displays need not be AP600-specific.

Participant Characteristics

Participants may include designers, engineers, operator trainers, and operators. Participants need to have familiarity with the operation of the workstation displays.

7.3.5 Evaluation Issue 15: Control Tasks Requiring Crew Coordination

Do the HSI features support the operator in performing control tasks that require coordination among crew members?


Relevant HSI Resources:

Workstation displays

Soft controls

WPIS

Computer-based and/or paper-based procedures

Specific Concerns:

Do the WPIS displays, workstation displays, and procedures allow an operator to:

Maintain awareness of control actions of other personnel working in parallel

Provide a common frame of reference and promote common mental models of plant state

Anticipate the consequences of the control actions of other personnel working in parallel

Coordinate activities with other personnel working in parallel

Develop control strategies that take into account the control actions of other personnel working in parallel (that is, build on the activities of the other personnel rather than work at cross-purposes)

Monitor performance of others to verify actions and identify and correct errors

Allocate tasks among crew members as plant conditions change to improve efficiency of performance, provide assistance, and/or avoid reaching undesirable plant states

Approach

The purpose of this test is to verify that the AP600 HSI can support operators in performing control tasks that require coordination among multiple individuals. Coordination supports increased error checking, more efficient use of human resources, and better response to changing plant conditions.

Participants are placed in control tasks requiring coordination to test several facets of crew coordination. The primary experimental manipulations are:

Type of control task presented, including simultaneous but related and simultaneous but unrelated procedures

Whether each of the operators forming a crew is a participant or whether only one is the subject and the others are confederates whose actions are determined by scripts

Whether the crews are observed performing the control task uninterrupted or whether the simulation is frozen at specified points and the operators are asked questions relating to their knowledge of the activities of the other operators and their consequences

Concept Testing Hypothesis


The WPIS displays, workstation displays, and the computer-based procedures support crew coordination in control tasks by enabling crew members to:

Maintain awareness of control actions of other personnel working in parallel (that is, they provide a common frame of reference and promote common mental models of plant state)

Anticipate the consequences of the control actions of other personnel working in parallel

Coordinate activities with other personnel working in parallel

Develop control strategies that take into account the control actions of other personnel working in parallel (that is, build on the activities of the other personnel rather than work at cross-purposes)

Allocate tasks flexibly (dynamically) among themselves in order to improve efficiency of performance and/or avoid reaching undesirable plant states.

Experimental Manipulations

This evaluation uses concept designs of the workstation, the WPIS, and the procedures to test the human factors issues related to crew coordination in control tasks. The approach is to have participants attempt to perform a series of control maneuvers that require coordination among multiple individuals, and examine whether operators are able to maintain awareness of the activities of others and coordinate with them effectively. This requires setting up a dynamic test bed that enables multiple operators to interact with plant processes simultaneously. Two experimental conditions are proposed. In one, multiple operators participate as a crew in the study and their interaction and coordination is observed. This condition is more realistic. In the second condition, which is more controlled, one operator is the subject of the study. One or more operators are used to complete the crew required to perform the control task, but these additional individuals are part of the experimental team (that is, experiment confederates). Their actions are determined by a script designed to create critical crew interaction situations with the operator who is serving as the subject. For example, the confederate might take an action that has undesirable effects on the process controlled by the subject. The subject detects this error, brings it to the other operator's attention, and attempts to resolve the situation.


At various points in the control maneuver, the simulation is frozen, and the operators participating in the study are asked a series of questions designed to assess:

Awareness of the activities of the other operator(s)

Awareness of the impact of the activities of the other operator(s) on their activity, and vice versa

Ability to anticipate the future consequences of their activities on the activities of the other operator(s) and vice versa

Ability to formulate coordination strategies that build on the activities of the other operators rather than working at cross-purposes
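Responses to these freeze probes can be scored against ground truth taken from the simulation log, in the spirit of freeze-probe situation awareness techniques. A minimal sketch, assuming hypothetical probe questions and keyed answers:

```python
def score_freeze_probe(responses, ground_truth):
    """Score one simulation freeze: fraction of probe questions about the
    other operators' activities answered consistently with the sim log."""
    hits = sum(1 for q, answer in responses.items()
               if ground_truth.get(q) == answer)
    return hits / max(len(ground_truth), 1)

# Hypothetical probe content for illustration.
ground_truth = {
    "other_op_current_task": "realigning CVCS",
    "impact_on_own_task": "charging flow will drop",
}
responses = {
    "other_op_current_task": "realigning CVCS",
    "impact_on_own_task": "no impact",
}
print(score_freeze_probe(responses, ground_truth))  # 0.5
```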

Dependent Measures and Evaluation Criteria

Qualitative results include assessment of difficulties, delays, and inefficiencies induced by the design or performance of the control and display systems. Qualitative results also include

comments from the participants concerning characteristics of the displays or controls that they felt made the task difficult, and comments regarding any effect that the design of the display system had on the way they executed independent procedures.

Objective dependent measures may include:

The adequacy of performance on the control task (that is, the ability to achieve goal states in adequate time and to avoid limit violations)

The degree of crew communication and coordination (that is, using decision-trace methodology)

The responses to questions relating to operator awareness of activities of the other operators and their consequences

Criteria may be set for adequate response (such as maximum time to achieve goal state, or no limit violations). Another criterion is the subject's success in detecting and describing the actions of crew members that conflict with his actions.

Implications of Results

The purpose of this evaluation is to contribute to the development of functional requirements for the design of the controls, the WPIS, and the physical and functional displays. The performance measures may be used to assess the relative merits of different control and display concepts.

The qualitative information gathered from concept testing is analyzed to identify design features that lead to confusion, difficulties, errors, and slowness. Functional requirements are developed to address those design characteristics that had significant effects on the participants' performance on the control task.

The quantitative measures may be used as baselines to compare alternative designs and evaluate performance benefits achieved through subsequent refinements of design concepts.

Stage of Development of the HSI

This test is conducted during the functional requirements phase of the HSI design process.

The following components need to be available: multiple workstations to support multiple operators working in parallel, workstation displays for the control tasks selected, and procedures (either paper-based or computer-based).

A high-fidelity plant simulation that models the plant dynamics for the coordination control tasks selected is necessary to drive the displays.

Test Bed Requirements:

Physical form - Multiple workstations are high-fidelity with respect to physical form and layout in the MCR. A WPIS is high-fidelity in physical form (such as size, location relative to workstations, and display characteristics).

Information content - A set of WPIS and workstation displays is developed to cover the set of control tasks tested.

Dynamics - A dynamic plant simulation is required to drive the WPIS and workstation displays to simulate the plant dynamics involved in the operator coordination control tasks selected. The plant simulation need not be AP600-specific.

Participant Characteristics

Participants can include designers, engineers, operator trainers, and operators. Participants have familiarity with the operation of the workstation displays.

7.4 EVALUATIONS FOR CONFORMANCE TO HUMAN FACTORS ENGINEERING DESIGN GUIDELINES

The purpose of these evaluations is to provide confidence that the HSI features satisfy

relevant HFE design guidelines and operator requirements for comfort and ease of use.

The HFE evaluation issue follows:

Issue 16: Do the HSI components satisfy relevant HFE criteria for acceptability?

7.4.1 Evaluation Issue 16: Conformance to HFE Guidelines

Do the HSI components satisfy relevant HFE criteria for acceptability?

Approach

This evaluation corresponds to the HFE design verification task described in Reference 2.

The objective of the HFE design verification is to verify that all aspects of the HSI (for example, controls, displays, procedures, and data processing) are consistent with accepted HFE guidelines, standards, and principles. Reference 2 provides a description of the activities performed as part of the HFE design verification task.

7.5 EVALUATIONS FOR VALIDATION OF INTEGRATED HSI

This evaluation corresponds to the integrated system validation described in Reference 2.

The purpose of this evaluation is to provide confidence that the integration of the HSI features satisfies the design mission of supporting safe and efficient operation of the AP600 in a variety of plant conditions.

The evaluation issue for validation of the integrated man-machine interface system (M-MIS) is:

Issue 17: Does the integration of HSI components satisfy requirements for validation of MCR functions and integrated performance capabilities?


7.5.1 Evaluation Issue 17: Validation of Integrated HSI

Does the integration of HSI components satisfy requirements for validation of MCR functions and integrated performance capabilities?

Relevant HSI Resources:

Plant Information System (including functional and physical displays of plant processes)

Alarm system

Computerized procedure system and/or paper-based procedures

Dedicated and soft (computer-based) controls

WPIS

Qualified data processing system

Specific Concerns:

Does the integration of HSI components in the MCR support operator performance requirements for normal, abnormal, and emergency conditions?

Approach

This evaluation corresponds to the integrated system validation task described in Reference 2.

The objective of integrated system validation is to ensure that the functions and tasks allocated to the plant personnel can be accomplished with the HSI design implementation.

Explicitly included in the integrated system validation is validation of the AP600 EOPs.

An implementation plan is developed that specifies a methodology for integrated system validation prior to test performance.


8 REFERENCES

1. Kerch, S., Roth, E. M., and Mumaw, R. J., Man-in-the-Loop Test Plan Description, WCAP-14396, Rev. 0, 1996.

2. Roth, E. M. and Kerch, S., Programmatic Level Description of the AP600 Human Factors Verification and Validation Plan, WCAP-14401, 1995.

3. O'Hara, J. M. and Wachtel, J., 1991, "Advanced Control Room Evaluation: General Approach and Rationale" in "Proceedings of the Human Factors Society 35th Annual Meeting," pp. 1243-1247, (Santa Monica, CA, Human Factors Society).

4. Woods, D. D. and Roth, E. M., 1988, "Cognitive Systems Engineering" in Helander, M. (ed.), "Handbook of Human-Computer Interaction," pp. 3-43, (New York, NY, Elsevier Science Publishing Co., Inc.).

5. Helander, M. (ed.), 1988, "Handbook of Human-Computer Interaction," (New York, NY, Elsevier Science Publishing Co., Inc.).

6. Woods, D. D., 1992, "Process Tracing Methods for the Study of Cognition Outside of the Experimental Psychology Laboratory" in Klein, G., Calderwood, R., and Orasanu, J. (eds.), "Decision Making in Action: Models and Methods," (Norwood, NJ, Ablex).

7. Woods, D. D. and Sarter, N. B., 1992, "Evaluating the Impact of New Technology on Human-Machine Cooperation," in Wise, J. A., Hopkins, V. D., and Stager, P. (eds.), pp. 133-158, (Berlin, Germany, Springer-Verlag).

8. Stubler, W. F., Roth, E. M., and Mumaw, R. J., 1992, "Integrating Verification and Validation with the Design of Complex Man-Machine Systems," in Wise, J. A., Hopkins, V. D., and Stager, P. (eds.), pp. 159-172, (Berlin, Germany, Springer-Verlag).

9. Meister, D., 1987, "Systems Design, Development and Testing" in Salvendy, G. (ed.), "Handbook of Human Factors," pp. 17-42, pp. 1271-1297, (New York, NY, John Wiley & Sons).

10. Rasmussen, J., 1986, "Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering," (New York, North-Holland).

11. Woods, D. D., Wise, J. A., and Hanes, L. F., 1982, "Evaluation of Safety Parameter Display Concepts," NP-2239, (Palo Alto, CA, Electric Power Research Institute).

12. Woods, D. D. and Roth, E. M., 1982 (unpublished study, Proprietary), "Operator Performance in Simulated Process Control Emergencies," (Pittsburgh, PA, Westinghouse Science and Technology Center).

13. Woods, D. D. and Roth, E. M., 1986, "The Role of Cognitive Modeling in Nuclear Power Plant Personnel Activities," NUREG-CR-4532, Volume 1, (Washington, DC, U.S. Nuclear Regulatory Commission).

14. Roth, E. M., Mumaw, R. J., and Lewis, P. M., 1994, "An Empirical Investigation of Operator Performance in Cognitively Demanding Simulated Emergencies," NUREG/CR-6208, (Washington, DC, U.S. Nuclear Regulatory Commission).

15. Roth, E. M. and Woods, D. D., 1988, "Aiding Human Performance: I. Cognitive Analysis" in "Le Travail Humain," Volume 51 (1), pp. 39-64.

16. Woods, D. D. and Hollnagel, E., 1987, "Mapping Cognitive Demands in Complex Problem Solving Worlds" in "International Journal of Man-Machine Studies," Volume 26, pp. 257-275, (New York, Academic Press).

17. Woods, D. D., 1988, "Coping with Complexity: The Psychology of Human Behavior in Complex Systems" in Goodstein, L. P., Andersen, H. B., and Olsen, S. E. (eds.), "Tasks, Errors, and Mental Models," (London, UK, Taylor & Francis).

18. Mumaw, R. J., Swatzler, D., Roth, E. M., and Thomas, W. A., 1994, "Cognitive Skill Training for Decision Making," NUREG/CR-6126, (Washington, DC, U.S. Nuclear Regulatory Commission).

19. Vicente, K. J., Burns, C. M., Mumaw, R. J., and Roth, E. M., 1996, "How Do Operators Monitor a Nuclear Power Plant? A Field Study" in "Proceedings of the 1996 American Nuclear Society International Topical Meeting on Nuclear Plant Instrumentation, Control and Human-Machine Interface Technologies," pp. 1127-1134, (NPIC&HMIT '96, La Grange Park, Illinois, American Nuclear Society).

20. Woods, D. D., Roth, E. M., and Pople, H. E., Jr., 1987, "Cognitive Environment Simulation: An Artificial Intelligence System for Human Performance Assessment," NUREG-CR-4862, (Washington, DC, United States Nuclear Regulatory Commission).

21. Reason, J. T., 1990, "Human Error," (Cambridge, UK, Cambridge University Press).

22. Taylor, J. H., O'Hara, J., Lucks, W. J., Parry, G. W., Cooper, S. E., Roth, E. M., Bley, D. C., and Wreathall, J., 1996, "Frame-of-Reference Manual for ATHEANA: A Technique for Human Error Analysis," Tech. Rep. L-2415/96-1, (Upton, New York, Brookhaven National Laboratory).

23. Stubler, W. F., Roth, E. M., and Mumaw, R. J., 1991, "Evaluation Issues for Computer-Based Control Rooms" in "Proceedings of the Human Factors Society 35th Annual Meeting," pp. 383-387, (Santa Monica, CA, Human Factors Society).

24. Mumaw, R. J., Roth, E. M., and Stubler, W. F., 1991, "An Analytic Technique for Framing Control Room Evaluation Issues" in "Proceedings of the IEEE International Conference on Systems, Man and Cybernetics," pp. 1355-1360, (Charlottesville, VA, The Institute of Electrical and Electronic Engineers).

25. Woods, D. D., 1982, "Application of Safety Parameter Display Evaluation Project to Design of Westinghouse SPDS," Appendix E to "Emergency Response Facilities Design and V&V Process," WCAP-10170, Non-Proprietary, submitted to the U.S. Nuclear Regulatory Commission in support of their review of the Westinghouse Generic Safety Parameter Display System, (Pittsburgh, PA, Westinghouse Electric Corp.).

26. Roth, E. M., "Description of the Operator Decision-Making Model and Function Based Task Analysis Methodology," WCAP-14695, 1996.

27. Endsley, M. R., 1995, "Toward a Theory of Situation Awareness in Dynamic Systems" in "Human Factors," Volume 37 (1), pp. 32-64.

28. Hollnagel, E., Pederson, O. M., and Rasmussen, J., 1981, "Notes on Human Performance Analysis," Tech. Rep. RISO-M-2285, (Roskilde, Denmark, RISO National Laboratory).

29. Roth, E. M., Bennett, K. B., and Woods, D. D., 1987, "Human Interaction with an Intelligent Machine" in "International Journal of Man-Machine Studies," Volume 27, pp. 479-525.

30. Norman, D. A., 1981, "Categorization of Action Slips" in "Psychological Review," Volume 88, pp. 1-15.