ML20117F092

Man-In-The-Loop Test Plan Description
ML20117F092
Person / Time
Site: 05200003
Issue date: 08/31/1996
From: Kerch S, Reid J
WESTINGHOUSE ELECTRIC COMPANY, DIV OF CBS CORP.
To:
Shared Package
ML20117F084 List:
References
WCAP-14396, WCAP-14396-R01, WCAP-14396-R1, NUDOCS 9609030298
Download: ML20117F092 (37)


Text

AP600 DOCUMENT COVER SHEET
Form 58202G(5/94) [t:\\xxxx.wpf:1x]    TDC:    IDS:
AP600 CENTRAL FILE USE ONLY: 0058.FRM    RFS#:    RFS ITEM #:
AP600 DOCUMENT NO.: OCS-TS-001    REVISION NO.: 0    ASSIGNED TO: Reid
Page 1 of 34
ALTERNATE DOCUMENT NUMBER: WCAP-14396, Rev. 1
WORK BREAKDOWN #: 3.3.2.4.5
DESIGN AGENT ORGANIZATION: Westinghouse Electric
TITLE: Man-in-The-Loop Test Plan Description
ATTACHMENTS:
DCP #/REV. INCORPORATED IN THIS DOCUMENT REVISION:
CALCULATION / ANALYSIS REFERENCE:
ELECTRONIC FILENAME: m:\\2158w.wpf:1b    ELECTRONIC FILE FORMAT: WordPerfect    ELECTRONIC FILE DESCRIPTION:
(C) WESTINGHOUSE ELECTRIC CORPORATION 1996

( ) WESTINGHOUSE PROPRIETARY CLASS 2
This document contains information proprietary to Westinghouse Electric Corporation; it is submitted in confidence and is to be used solely for the purpose for which it is furnished and returned upon request. This document and such information is not to be reproduced, transmitted, disclosed or used otherwise in whole or in part without prior written authorization of Westinghouse Electric Corporation, Energy Systems Business Unit, subject to the legends contained hereof.

( ) WESTINGHOUSE PROPRIETARY CLASS 2C
This document is the property of and contains proprietary information owned by Westinghouse Electric Corporation and/or its subcontractors and suppliers. It is transmitted to you in confidence and trust, and you agree to treat this document in strict accordance with the terms and conditions of the agreement under which it was provided to you.

(X) WESTINGHOUSE CLASS 3 (NON-PROPRIETARY)

1 @ DOE DESIGN CERTIFICATION PROGRAM - GOVERNMENT LIMITED RIGHTS STATEMENT [See page 2)

Copynght statement A license is reserved to the U.S. Govemment under contract DE-AC03-90SF18495.

DOE CONTRACT DELIVERABLES (DELIVERED DATA)

@ Subject to spec:6ed excephons, disclosure of this data is restncted until September 30,1995 or Desig 90SF18495, whichever is later.

EPRI CONFIDENTIAL: NOTICE: 1 U 20 3 4

5 CATEGORY: A*.B CO D E

F0 2 O ARC FOAKE PROGRAM - ARC LIMITED RIGHTS STATEMENT [See page 2)

Copynght statement A license is reserved to the U.S. Govemment under contract DE-FC02-NE34267 and subcontract ARC-93-3-SC-001.

O ARC CONTRACT DELIVERABLES (CONTRACT DATA)

Subject to specified excephons, disclosure of thss data is restncted under ARC Subcontract ARC-93-3-SC-001.

ORIGINATOR: S. P. Kerch    SIGNATURE / DATE
AP600 RESPONSIBLE MANAGER: J. B. Reid    SIGNATURE / APPROVAL DATE

* Approval of the responsible manager signifies that the document is complete, all required reviews are complete, the electronic file is attached and the document is released for use.

AP600 DOCUMENT COVER SHEET, Page 2
Form 58202G(5/94)

LIMITED RIGHTS STATEMENTS

DOE GOVERNMENT LIMITED RIGHTS STATEMENT
(A) These data are submitted with limited rights under government contract No. DE-AC03-90SF18495. These data may be reproduced and used by the government with the express limitation that they will not, without written permission of the contractor, be used for purposes of manufacture nor disclosed outside the government; except that the government may disclose these data outside the government for the following purposes, if any, provided that the government makes such disclosure subject to prohibition against further use and disclosure:
(I) This 'Proprietary Data' may be disclosed for evaluation purposes under the restrictions above.
(II) The 'Proprietary Data' may be disclosed to the Electric Power Research Institute (EPRI), electric utility representatives and their direct consultants, excluding direct commercial competitors, and the DOE National Laboratories under the prohibitions and restrictions above.
(B) This notice shall be marked on any reproduction of these data, in whole or in part.

ARC LIMITED RIGHTS STATEMENT:
This proprietary data, furnished under Subcontract Number ARC-93-3-SC-001 with ARC, may be duplicated and used by the government and ARC, subject to the limitations of Article H-17.F. of that subcontract, with the express limitations that the proprietary data may not be disclosed outside the government or ARC, or ARC's Class 1 & 3 members, or EPRI, or be used for purposes of manufacture without prior permission of the Subcontractor, except that further disclosure or use may be made solely for the following purposes: This proprietary data may be disclosed to other than commercial competitors of Subcontractor for evaluation purposes of this subcontract under the restriction that the proprietary data be retained in confidence and not be further disclosed, and subject to the terms of a nondisclosure agreement between the Subcontractor and that organization, excluding DOE and its contractors.

DEFINITIONS
CONTRACT/DELIVERED DATA - Consists of documents (e.g. specifications, drawings, reports) which are generated under the DOE or ARC contracts and which contain no background proprietary data.

EPRI CONFIDENTIALITY / OBLIGATION NOTICES
NOTICE 1: The data in this document is subject to no confidentiality obligations.

NOTICE 2: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is forwarded to recipient under an obligation of Confidence and Trust for limited purposes only. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited except as agreed to in advance by the Electric Power Research Institute (EPRI) and Westinghouse Electric Corporation. Recipient of this data has a duty to inquire of EPRI and/or Westinghouse as to the uses of the information contained herein that are permitted.

NOTICE 3: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is forwarded to recipient under an obligation of Confidence and Trust for use only in evaluation tasks specifically authorized by the Electric Power Research Institute (EPRI). Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited except as agreed to in advance by EPRI and Westinghouse Electric Corporation. Recipient of this data has a duty to inquire of EPRI and/or Westinghouse as to the uses of the information contained herein that are permitted. This document and any copies or excerpts thereof that may have been generated are to be returned to Westinghouse, directly or through EPRI, when requested to do so.

NOTICE 4: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. It is being revealed in confidence and trust only to Employees of EPRI and to certain contractors of EPRI for limited evaluation tasks authorized by EPRI. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited. This document and any copies or excerpts thereof that may have been generated are to be returned to Westinghouse, directly or through EPRI, when requested to do so.

NOTICE 5: The data in this document is proprietary and confidential to Westinghouse Electric Corporation and/or its Contractors. Access to this data is given in Confidence and Trust only at Westinghouse facilities for limited evaluation tasks assigned by EPRI. Any use, disclosure to unauthorized persons, or copying of this document or parts thereof is prohibited. Neither this document nor any excerpts therefrom are to be removed from Westinghouse facilities.

EPRI CONFIDENTIALITY / OBLIGATION CATEGORIES
CATEGORY "A" - (See Delivered Data) Consists of CONTRACTOR Foreground Data that is contained in an issued report.
CATEGORY "B" - (See Delivered Data) Consists of CONTRACTOR Foreground Data that is not contained in an issued report, except for computer programs.
CATEGORY "C" - Consists of CONTRACTOR Background Data except for computer programs.
CATEGORY "D" - Consists of computer programs developed in the course of performing the Work.
CATEGORY "E" - Consists of computer programs developed prior to the Effective Date or after the Effective Date but outside the scope of the Work.
CATEGORY "F" - Consists of administrative plans and administrative reports.

WESTINGHOUSE NON-PROPRIETARY CLASS 3

WCAP-14396, Revision 1

MAN-IN-THE-LOOP TEST PLAN DESCRIPTION

S. Kerch
E. Roth
R. Mumaw

August 1996

AP600 Document No. OCS-TS-001

Approved: R. M. Vijuk

WESTINGHOUSE ELECTRIC CORPORATION
Energy Systems Business Unit
Nuclear Technology Division
P.O. Box 355
Pittsburgh, Pennsylvania 15230-0355

(C) 1996 Westinghouse Electric Corporation
All Rights Reserved

TABLE OF CONTENTS

Section  Title  Page

1.0 SCOPE AND PURPOSE OF DOCUMENT  1-1
2.0 CONCEPT TESTING AS PART OF HUMAN FACTORS ENGINEERING (HFE)/HUMAN SYSTEM INTERFACE (HSI) DESIGN PROCESS  2-1
  2.1 Role of Concept Testing in the HSI Design Process  2-1
  2.2 Overview of Concept Tests to be Performed  2-1
  2.3 Testbeds for Concept Testing  2-2
  2.4 Test Participants  2-3
  2.5 Number of Participants  2-4
3.0 FORMAL V&V OF FINAL HSI DESIGN  3-1
  3.1 Crew Sampling  3-2
  3.2 Scenario Sampling  3-2
4.0 MAN-IN-THE-LOOP CONCEPT TEST DESCRIPTIONS  4-1
  4.1 Soft Controls  4-1
    Concept Test 1. Simple control action execution and feedback - Preliminary design concepts  4-1
    Concept Test 2. Simple control action execution and feedback - AP600 Functional Design  4-3
    Concept Test 3. Keeping pace with plant dynamics  4-3
  4.2 Workstation Displays  4-4
    Concept Test 4. Ability to navigate displays to find information  4-4
    Concept Test 5. Coordination of physical and functional displays  4-5
  4.3 Computerized Procedures  4-6
    Concept Test 6. Usability of computerized procedures  4-6
    Concept Test 7. Coordination of computerized procedures with workstation displays and soft controls  4-8
  4.4 Wall Panel Information System (WPIS)  4-9
    Concept Test 8. WPIS support for situation assessment  4-9
    Concept Test 9. Ability of WPIS to support multiple crew member situation awareness  4-10
  4.5 Alarm System  4-11
    Concept Test 10. Alarm system organization and prioritization scheme  4-11
5.0 REFERENCES  5-1

LIST OF TABLES

Table  Title  Page

2-1 List of Concept Tests  2-5
2-2 Mapping of Concept Tests to SSAR Evaluation Issues  2-6

LIST OF FIGURES

Figure  Title  Page

2-1 The Role of Concept Testing in the HSI Design Process  2-7

LIST OF ACRONYMS

CMT   Core Makeup Tank
EOP   Emergency Operating Procedure
HSI   Human System Interface
LOCA  Loss-of-Coolant Accident
NPP   Nuclear Power Plant
PCD   Process Control Division
PRA   Probabilistic Risk Assessment
PRHR  Passive Residual Heat Removal
PWR   Pressurized Water Reactor
SGTR  Steam Generator Tube Rupture
SSAR  Standard Safety Analysis Report
TS    Technical Specifications
V&V   Verification & Validation
VDU   Video Display Unit
WPIS  Wall Panel Information System

1.0 SCOPE AND PURPOSE OF DOCUMENT

This document describes the Man-in-the-Loop tests that are planned as part of the AP600 human system interface (HSI) design process. It is intended as an adjunct to the HSI Design Verification and Validation (V&V) process provided in Chapter 18 (Section 18.11) of the Standard Safety Analysis Report (SSAR).

This document provides information supporting preparation for and scheduling of the Man-in-the-Loop tests to be conducted as part of the HSI design process. The document:

- Specifies the set of Man-in-the-Loop tests that are to be conducted as part of the AP600 HSI design process
- Provides methodological details supporting preparation for conducting tests, including:
  - Number and type of participants that will be required for different tests
  - Hardware/software requirements for prototyping the HSI resources to be tested and collecting human performance data
  - HSI resources that will need to be designed, and specific displays that will need to be created to support testing

In addition, this document clarifies which of the set of 17 Evaluation Issues specified in Chapter 18 of the SSAR are to be addressed as part of the HSI design process, and which are to be conducted as part of the formal human factors V&V to be performed on the final AP600 HSI design.

This document does not provide detailed test procedures for the Man-in-the-Loop tests. At this stage in the AP600 HSI design process, it is not possible to develop complete and detailed specifications for all tests to be performed, specifically because the AP600 HSI has not been fully designed and test details will depend on the HSI design and the operational philosophy developed for how the HSI resources are to be used. Detailed test procedures will be prepared for each test when the test is to be conducted.

This is intended to be a living document. Details of the Man-in-the-Loop tests are expected to be updated as the design evolves and feedback from Man-in-the-Loop tests is obtained.

There are two testbeds available for implementing and testing AP600 HSI design concepts: (1) a low-fidelity testbed that has been implemented in the AP600 Main Control Room Test Facility at the Energy Center, and (2) a higher-fidelity, dynamic testbed being implemented at the Waltz Mill site.


The AP600 Main Control Room Test Facility established at the Westinghouse Energy Center was used to conduct Concept Test 1 (the soft control test). Refer to "Effects of Control Lag and Interaction Mode on Operator's Use of Soft Controls," OCS-J1-008, for the report on the results of this test.

Several HSI resources included in the AP600 control room are relatively mature designs that are in the process of being implemented as retrofits in conventional pressurized water reactor (PWR) plants. Examples of mature HSI resources include the computerized procedures, the alarm system, and the workstation display system. These HSI resources have been installed in a high-fidelity training simulator at the Westinghouse Waltz Mill site under a separate program funded by a utility customer for acceptance testing of the HSI resources and requalification training for their operator crews. This facility will be used to conduct Man-in-the-Loop tests of the HSI resources under dynamic plant simulation-driven conditions.


2.0 CONCEPT TESTING AS PART OF HUMAN FACTORS ENGINEERING (HFE)/HSI DESIGN PROCESS

2.1 Role of Concept Testing in the HSI Design Process

Figure 2-1 shows the phases in the HSI design process and the role that concept testing plays in the design process. Concept testing is performed as part of the functional design phase of the HSI design process. It is during the functional design phase that the core conceptual design for an HSI resource and the corresponding functional requirements are developed. An integral part of this phase is rapid prototyping and testing of design concepts.

Concept testing during the functional design phase serves two purposes:

1) It provides input to aid designers in resolving design issues for which there is no well-established human factors guidance. An example might be assessing whether a system response time is adequate to support operator performance on a time-critical task. A second example might be aiding a designer in selecting between alternative design concepts that have different strengths and weaknesses, such as determining whether it is preferable to include soft controls in information displays or to dedicate a separate video display unit (VDU) for control stations.

2) It establishes the adequacy of the design concept and functional requirements that are produced in the functional design stage. A main objective of concept testing is to establish that the conceptual design resulting from the functional design stage is adequate to support operator performance in the range of situations anticipated to arise. A major element in designing a concept test is to specify test scenarios representative of the range of situations and task complexities that can arise in actual events, ensuring that the HSI conceptual design is adequate to support operator performance in those complex cases.

2.2 Overview of Concept Tests to be Performed

The starting point for the concept tests to be performed is the Human Factors V&V process description provided in Chapter 18 of the SSAR, where 17 Evaluation Issues are defined. The first 15 Evaluation Issues are organized around particular human performance issues. Issues 16 and 17 correspond to formal verification and validation of the final HSI design, respectively.

The concept tests described as part of Evaluation Issues 1-15 in the SSAR constitute the basis for the Man-in-the-Loop tests that are proposed as part of the HSI design process. These tests are consistent with the rapid prototyping and testing activities described under Element 7 of the proposed NRC Human Factors Engineering Program Review Model for Advanced Nuclear Power Plants (O'Hara, Higgins & Stubler, 1994). In Chapter 18 of the SSAR these concept tests are organized around human performance issues, ensuring that all major human performance issues are covered. The present document takes the same performance issues to be evaluated and reorganizes them around specific HSI resources. This was done to enable concept testing to begin early in the HSI design process, before all HSI resources have been fully designed and integrated.

The HSI resources addressed in these Man-in-the-Loop tests are:

- Soft controls
- Workstation displays
- Computerized procedures
- Wall panel information system (WPIS)
- Alarm system

Human performance issues addressed in Evaluation Issues 1-15 of SSAR Chapter 18 were reviewed and reorganized around the specific HSI elements associated with those issues. Based on this analysis, Man-in-the-Loop concept tests were defined that could be conducted to examine human performance issues associated with individual HSI resources, and Man-in-the-Loop concept tests that examined the coordination of two or more HSI resources. The goal was to define Man-in-the-Loop tests that were narrowly focused, enabling early testing of HSI design concepts.

Table 2-1 lists the 10 concept tests proposed as part of the AP600 HSI design process. Table 2-2 shows the correspondence between the concept tests and the evaluation issues presented in Chapter 18 of the SSAR. The majority of evaluation issues identified in Chapter 18 of the SSAR are addressed in one or more of the 10 concept tests proposed. The exceptions are Evaluation Issues 7-10 and 15. These evaluation issues examine the impact of all the HSI resources on crew performance using a high-fidelity dynamic plant simulation. These issues are beyond the scope of this revision to the test plan, but will be included in a future revision.

2.3 Testbeds for Concept Testing

The HSI resources included in the AP600 control room vary in their design maturity levels. For the alarm system, computerized procedures, and workstation displays, much of the functional design activity has already been completed as part of earlier Westinghouse projects, and concrete examples of these HSI resources exist for conventional pressurized water reactor (PWR) plants. Other HSI resources to be included in the AP600, specifically the WPIS and soft controls, have not been previously implemented in Westinghouse Nuclear Power Plant (NPP) control room designs. These HSI resources are in an earlier stage of the functional design process. Early concept testing will focus on the less mature HSI resources, since this is the area where input from concept testing can be most beneficial.

An AP600 Main Control Room Test Facility, established at the Westinghouse Energy Center, included capabilities for rapid prototyping of HSI design concepts. This facility enabled execution of the first soft control test (Concept Test 1) under low-fidelity conditions. The AP600 Main Control Room Test Facility has been moved to the Waltz Mill site (April 1996) to take advantage of the high-fidelity SNUPPS/Beznau plant simulator located at that site.


Versions of the advanced alarm system and workstation display system have been installed in the high-fidelity SNUPPS/Beznau training simulator at Waltz Mill. This facility is being developed under a separate program funded by a utility customer for the purposes of acceptance testing of the HSI resources and requalification training for their operator crews. Current plans are to conduct these acceptance tests in the simulator at Waltz Mill during June through August 1996. This high-fidelity simulation facility will enable Westinghouse to conduct AP600 HSI resource tests under dynamic, plant simulation-driven conditions.

While the application being implemented at Waltz Mill is a conventional PWR plant rather than a passive plant design, key elements important to human performance (e.g., characteristics of the HSI, plant dynamics, operational philosophy) are sufficiently similar in the two types of plants that the results of concept tests performed at the Waltz Mill simulator facility will be valid for the AP600 plant.

2.4 Test Participants

In the human factors literature, a distinction is made between two types of evaluations:

1. Heuristic Evaluations - where experts in HSI design, human factors, and operations review prototypes that illustrate the conceptual design

2. User Tests - where individuals from the target user population (i.e., actual NPP control room operators) are observed as they try to use a prototype under controlled conditions

Several human factors studies have been conducted examining the relative value of Heuristic Evaluations and User Tests (Jeffries, Miller, Wharton, & Uyeda, 1991; Jeffries & Desurvire, 1992). Generally these techniques are complementary, and a test plan combining both evaluations is advocated. Heuristic Evaluations identify serious problems early; User Tests are more definitive in establishing design adequacy.

In the AP600 design, the distinction between Heuristic Evaluations and User Tests is obscured because no one has operating experience on AP600 plants. There is a proposal to use Westinghouse personnel from the training, procedures development, and HSI design groups who have NPP operational experience as test participants in the early concept tests. In later concept tests, operators from existing NPPs will be used, since they bring a complementary experience base and perspective.

2.5 Number of Participants

A key issue is the number of participants that should be included in each concept test. Recently, research has explicitly examined the proportion of usability problems detected as a function of the number of users tested or heuristic evaluators employed (Virzi, 1992; Nielsen & Landauer, 1993). A general finding is that the relationship between the number of test participants and the proportion of usability problems identified is well modeled as a Poisson process.


These studies suggest that five participants in an evaluation or test are typically sufficient to detect most (70 to 90 percent) of the serious HSI problems (Virzi, 1992). This is particularly true with an iterative evaluation approach where repeated evaluations are conducted until the number of problems identified is very low.

As an initial estimate, there will be at least five participants in each concept test. In later Man-in-the-Loop evaluations, the number of participants in a study may be adjusted to achieve a specific level of thoroughness in detecting problems that may exist.

The math model parameters for the best-fit Poisson curve specific to AP600 Man-in-the-Loop testing will be better estimated as the evaluations proceed. Once these parameters are estimated, it is possible to specify a priori the number of participants that need to be included in a Man-in-the-Loop test to identify a specific percentage of the usability problems that exist in an HSI design (Nielsen & Landauer, 1993). For example, the number of participants needed in the Man-in-the-Loop test in order to catch 95 percent of the usability problems could be specified.
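As an illustration of how such a model can be used, the sketch below applies the cumulative-detection form popularized by Nielsen & Landauer (1993), in which the expected fraction of problems found by n participants is 1 - (1 - p)^n for a per-participant detection probability p. It is a minimal sketch; the value p = 0.30 is purely illustrative, not an AP600-specific estimate, and the actual parameters would be fit from the evaluation data as described above.

```python
import math

def proportion_found(n_participants, p_detect):
    """Expected fraction of usability problems found by n participants,
    assuming each participant independently reveals a problem with probability p."""
    return 1.0 - (1.0 - p_detect) ** n_participants

def participants_needed(target_fraction, p_detect):
    """Smallest number of participants expected to uncover the target fraction
    of problems, obtained by inverting the same cumulative-detection model."""
    return math.ceil(math.log(1.0 - target_fraction) / math.log(1.0 - p_detect))

p = 0.30  # illustrative per-participant detection probability (not an AP600 estimate)
print(f"5 participants find about {proportion_found(5, p):.0%} of the problems")
print(f"participants needed to catch 95%: {participants_needed(0.95, p)}")
```

With p = 0.30 this reproduces the ballpark cited above: five participants find roughly 80 percent of the problems, and about nine are needed to reach 95 percent.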

TABLE 2-1: LIST OF CONCEPT TESTS

Soft Controls
- Concept Test 1: Simple control action execution and feedback (Preliminary design concepts) - Completed September 1994
- Concept Test 2: Simple control action execution and feedback (Prototype based on functional requirements document)
- Concept Test 3: Keeping pace with plant dynamics

Workstation Displays
- Concept Test 4: Navigating displays to find information
- Concept Test 5: Coordinating physical and functional displays

Computerized Procedures
- Concept Test 6: Usability of computerized procedures
- Concept Test 7: Coordinating procedures, displays, and soft controls

Wall Panel Information System
- Concept Test 8: Support for situation assessment (single person)
- Concept Test 9: Support for multiple crew member performance

Alarm System
- Concept Test 10: Alarm organization and prioritization scheme

TABLE 2-2: MAPPING OF CONCEPT TESTS TO SSAR EVALUATION ISSUES
(Concept test columns: Soft Controls / Workstation Displays / Computerized Procedures / WPIS / Alarm System)

1. Passive Monitoring: Test 8 (WPIS)
2. Directed Search (WPIS): Test 8 (WPIS)
3. Directed Search (Request): Test 4 (Workstation Displays)
4. Group Situation Assessment: Test 9 (WPIS)
5. Alarm System: Test 10 (Alarm System)
6. Interpretation and Planning: Test 5 (Workstation Displays)
7. Response to Single-Fault Events: (none)
8. Response to Multiple-Fault Events: (none)
9. Interpretation and Planning - Crew: (none)
10. Interpretation and Planning - with TSC: (none)
11. Simple, Operator-Paced Control: Tests 1, 2 (Soft Controls); Tests 6, 7 (Computerized Procedures)
12. Conditional Operator-Paced Control: Test 2 (Soft Controls); Tests 6, 7 (Computerized Procedures)
13. Multiple Procedures: Tests 6, 7 (Computerized Procedures)
14. Event-Paced Control: Test 3 (Soft Controls); Tests 6, 7 (Computerized Procedures)
15. Control Tasks with Crew Coordination: (none)

[Figure 2-1: The Role of Concept Testing in the HSI Design Process. Flowchart: Function-Based Task Analysis feeds M-MIS Functional Design (producing design concepts and functional requirements), which feeds M-MIS Implementation (hardware/software). Concept Tests, i.e., Man-in-the-Loop tests of a concrete example of the functional design (a rapid prototype or an actual M-MIS for a similar plant), feed back into the functional design to resolve design issues and establish the adequacy of the design concept and functional requirements.]

3.0 FORMAL V&V OF FINAL HSI DESIGN

Evaluations 16 and 17 in Chapter 18 of the SSAR represent the formal HFE/HSI design V&V. They will be performed when an AP600 plant has been purchased. The tests will be conducted using an AP600 dynamic, high-fidelity training simulator.

The formal V&V evaluations are not covered in this document; however, a high-level description of how the Man-in-the-Loop Validation Test (Evaluation Issue 17 in the SSAR) will be conducted is provided to clarify the proposed approach.

As the evaluations proceed, a better estimate will be made of the math model parameters for the best-fit Poisson curve specific to AP600 Man-in-the-Loop testing. Once these parameters are estimated, it is possible to specify a priori the number of participants needed in a Man-in-the-Loop test to identify a specific percentage of the usability problems that exist in an HSI design (Nielsen & Landauer, 1993). For example, one could specify the number of participants that need to be included in the Man-in-the-Loop test to catch 95 percent of the usability problems.

The purpose of the final V&V test is to establish that crews can achieve mission goals using the AP600 HSI, and that no serious human factors deficiencies remain. The methodological approach proposed is analogous to the methodology currently used to validate emergency operating procedures (EOPs) (cf. NUREG-0899; NUREG-1358; WCAP-11638; WCAP-10599; INPO 83-006).

The current process for validating EOPs employs a formal process for identifying, tracking, and resolving potential problems. An interdisciplinary observation team, which includes specialists in plant design, operations, procedures, and human factors, is brought together to observe operator crews go through pre-defined scenarios on a high-fidelity training simulator. The scenarios are selected to exercise different procedure branches and include a variety of complicated situations. The observation team documents any problems identified on discrepancy forms. These discrepancy forms are then reviewed, and recommendations for resolving the problems are made.

Westinghouse proposes to use an analogous approach for final validation of the AP600 HSI. Operator crews will be observed as they go through a predefined set of scenarios on an AP600-specific, high-fidelity training simulator. An interdisciplinary observation team will observe crew performance and document any problems observed. Since the AP600 HSI design will have undergone a series of tests prior to final validation, few problems are expected to be detected in the final validation test. Any problems that are detected will be evaluated for severity, and recommendations for resolving them will be made.


To ensure a thorough evaluation, an emphasis will be placed on:

- A representative sample of crews
- A representative sample of scenarios that reflect events known to be important to maintaining plant safety, and events known to be demanding from a human factors/cognitive perspective
- Use of redundant and convergent measures

More details on the performance measures included in the final validation are provided in Chapter 18 of the SSAR.

3.1 Crew Sampling

Participants in the final validation test will be the control room operators and related staff from the first AP600 plant to be built. All individuals undergoing training to operate the AP600 plant will participate in the validation study. This will include individuals training to be plant control room operators and the management and administrative staff who will be receiving training for licensing purposes. Inclusion of all personnel licensed to run the AP600 plant in the validation test should ensure a representative sample that includes a wide range of experience and skill levels.

Final validation will be conducted while personnel are undergoing training on the AP600 plant and its operation.

3.2 Scenario Sampling

A multi-dimensional set of criteria will be used to define the set of scenarios included in the final validation, ensuring a wide sampling of situations. The dimensions considered include:

- A range of operational modes, including normal and emergency operations
- Design basis events specific to AP600
- AP600-specific design features
- Scenarios including human performance actions identified as critical to plant safety
- Situations that produce cognitive challenges identified in the V&V process description in the SSAR, including AP600-specific scenarios that may:
  - Complicate situation assessment by providing degraded or conflicting plant state information
  - Complicate response (e.g., require balancing of multiple goals; require manual takeover of automatic systems)
  - Complicate performance by increasing personnel communication/coordination requirements
  - Introduce high physical or mental workload

While it is premature to define the set of scenarios that will be included in the final AP600 validation, a preliminary list of the types of scenarios that need to be considered has been developed based on discussions with engineers and training instructors familiar with the AP600 design:

1) Major Plant Transitions - Major transitions, such as significant heatup or cooldown transitions, cover multiple plant modes and require transitions for several types of equipment. These are typically high-workload operations that also have significant surveillance requirements.

2) Reactor Startup - This component of plant startup is one of the most difficult and sensitive operations. The AP600 needs to be able to demonstrate its effectiveness here.
3) Power Production System Changes - During power operations, a critical role for the control room is to make transitions between power levels (e.g., transition to full power). In addition to the traditional reasons why this maneuver is important, the AP600 design introduces gray rods for certain power transitions.

4) Mode 1 High-Power Operations - In Mode 1, an important maneuver is the load-follow evolution. The AP600 relies on gray rods and boron control for this set of operations.

5) Maintenance Support at Full Power (tag-out) - Tag-out is an important element of operations. AP600 validation needs to show that testing and maintenance are well supported, and that there are adequate ties to the Technical Specification document.

6) Shutdown Operations - The control room has a number of important duties during shutdown. One element of operations important in the AP600 is the transition from RCS intact to RCS open (mid-loop).

7) Accidents and Emergency Operations - Emergency operations are always an important part of V&V, and coverage of a range of events is important. AP600 design issues indicate that the following events may be appropriate to include in the final validation:

- Loss of main feedwater where passive residual heat removal (PRHR) is required
- Sequence involving core makeup tank (CMT) actuation
- Steam generator tube rupture (SGTR)
- Very small loss-of-coolant accident (LOCA) with CMT recirculation
- Instrumentation and Control (I&C) failures of the Protection and Safety Monitoring System (PMS) (crossover to DAS)
- Shutdown events

The actual set of scenarios included in the final validation will be defined by an interdisciplinary team, including input from AP600 design engineers, EOP developers, training instructors, HSI designers, human factors specialists, and human reliability analysis/probabilistic risk assessment (PRA) analysts.

4.0 MAN-IN-THE-LOOP CONCEPT TEST DESCRIPTIONS

Below are descriptions of the concept tests to be conducted as part of the HSI design process. The descriptions are organized around specific HSI resources.

Each concept test includes a description of the issue(s) to be evaluated, the test method to be employed, the testbed characteristics required to run the test, the type of materials required, the performance measures to be recorded, and the expected test outcome (i.e., the types of conclusions and recommendations expected).

The evaluation issue(s) specified in the SSAR that are addressed by the concept test are presented in parentheses under the Issue heading.

These test descriptions are not intended to replace detailed test procedures. A detailed test procedure should be prepared for each test close to the time that the test is to be conducted.

4.1 Soft Controls

Concept Test 1: Simple control action execution and feedback - Preliminary design concepts

This test was completed in September 1994.

Issue: The ability of the soft controls provided in the workstation to adequately support the operator in executing simple, operator-paced control actions. (Evaluation Issue 11)

The initial test evaluated preliminary soft control design concepts that drew on current soft-control designs and the soft control capabilities of the Process Control Division (PCD) graphics software.

The concept test assessed the ability of the soft controls to support operators in selecting and bringing up a control station, taking the control action, and receiving feedback with respect to whether the control action was actuated as intended, whether the control action had the desired effect on plant state, and, where the control was not actuated as intended, the reason why.

Specific issues include:

- Ability of the soft control design to support a range of control activities that:
  - Arise in discrete and continuous control actions
  - Require simultaneous control of several controls
  - Require sequential execution of controls
  - Require monitoring automated systems and taking manual control as necessary

- Ability of the operator to:
  - Evaluate current control state, target, and limit values
  - Determine possible and desired control actions
  - Enable effective control when there are time lags
  - Obtain and interpret feedback on actuation of a control action
  - Identify and recover from errors
  - Determine when an action cannot be performed because desired conditions are not satisfied (such as interlocks or limits on rates)

Test Method: The participant was asked to perform simple, operator-paced control actions. For example, the participant was asked to open a valve, increase flow to some target value, perform a simple proceduralized sequence, and monitor the activity of an automated system and take manual control if necessary. The participant was required to bring up the appropriate control(s), take the control action(s), indicate whether the control actions were actuated, and indicate whether the desired plant state was achieved.

Testbed Characteristics: This test was performed in the AP600 Main Control Room Test Facility, located in the Westinghouse Energy Center. Conceptual designs were developed as workstation-based prototypes. The set of displays and controls were dynamic in functionality (activating soft controls resulted in appropriate changes in the displays) but were not tied to a high-fidelity plant simulation.

Rudimentary math models that enabled changes to occur in displayed plant parameters based on control actions were used to drive the displays and soft controls. The math model used to drive the display allowed specification of different functional relationships between control actions and their effect on plant parameters (linear relation, logarithmic relation) and different lags to mimic different types of controls.
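The following sketch illustrates the kind of rudimentary display-driving model described above: a selectable linear or logarithmic control-to-parameter relation combined with a configurable lag. It is a minimal, hypothetical example; the class name, parameter values, and the choice of a first-order lag are assumptions for illustration, not the actual prototype software.

```python
import math

class DisplayParameterModel:
    """Illustrative model: a displayed plant parameter approaches the value
    implied by the soft-control demand through a configurable first-order lag,
    using either a linear or a logarithmic control-to-parameter relation."""

    def __init__(self, gain=1.0, relation="linear", time_constant_s=5.0, initial_value=0.0):
        self.gain = gain
        self.relation = relation      # "linear" or "log"
        self.tau = time_constant_s    # lag used to mimic different control types
        self.value = initial_value    # currently displayed parameter value

    def target(self, control_demand):
        # Functional relationship between control action and plant parameter
        if self.relation == "log":
            return self.gain * math.log1p(max(control_demand, 0.0))
        return self.gain * control_demand

    def step(self, control_demand, dt_s):
        # Advance the displayed value one update cycle toward the target
        self.value += (self.target(control_demand) - self.value) * (dt_s / self.tau)
        return self.value

# Example: a sluggish, valve-like control driving a displayed flow indication
flow = DisplayParameterModel(gain=2.0, relation="linear", time_constant_s=10.0)
for t in range(5):
    print(f"t={t}s displayed flow = {flow.step(control_demand=50.0, dt_s=1.0):.1f}")
```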

Type of Materials: A set of control scenarios was defined that exemplified the types of control activities expected to arise in the AP600 plant (such as discrete control tasks and continuous control tasks).

Performance Measures: To assess the performance issues, response times were recorded, sequences of actions were documented, and subjective ratings were obtained. The subjective ratings provided participant feedback on design adequacy.

Outcome: Test results provided data to support development of the functional requirements for soft controls for AP600.

Concept Test 2: Simple control action execution and feedback - AP600 Functional Design

This test is similar to Concept Test 1, but is performed later in the functional design process. The test will be conducted using prototype displays that are representative of the AP600 soft controls functional design. This test establishes the adequacy of the AP600 soft control functional design and functional requirements for simple, self-paced control tasks.

Issue: The types of issues to be addressed will be similar to the issues addressed in Concept Test 1. The specific issues addressed will depend on the results of Concept Test 1 and the set of design issues that arise during the AP600 soft control functional design process for which Man-in-the-Loop concept test data is desired. The test will address the issues described in Evaluation Issues 11 and 12.

Test Method: The participant will be asked to perform simple, operator-paced control actions. The participant will be required to bring up the appropriate control(s), take the control action(s), indicate whether the control actions were actuated and whether the desired plant state was achieved, and, in cases where the control action failed, indicate why it failed.

Testbed Characteristics: This test will be performed in the AP600 Main Control Room Test Facility. The set of displays and controls will be dynamic in functionality (i.e., activating soft controls will result in appropriate changes in the displays) but need not be tied to a high-fidelity plant simulation.

Type of Materials: A set of control scenarios will be defined that exemplify the different types of control activities that are expected to arise in the AP600 plant (such as discrete control tasks, continuous control tasks, and tasks that involve simultaneous control of multiple controls). These scenarios will include cases where the control action is blocked (due to interlocks, limits on rates, and equipment malfunction).

Performance Measures: To assess the performance issues identified above, response times will be recorded, sequences of actions will be documented, and subjective ratings will be obtained. Subjective ratings allow for participant feedback on design adequacy.

Outcome: Test results will establish the adequacy of the functional requirements for soft controls for AP600 simple control tasks.

Concept Test 3: Keeping pace with plant dynamics

Issues: The soft controls should support the operator in executing control actions and evaluating feedback in pace with the event. (Evaluation Issue 14)

The workstation physical and functional displays should support the operator in locating relevant displays and executing soft controls at a rate that allows operators to keep pace with the event. (Evaluation Issue 14)

This evaluation determines whether the soft control design that comes out of the first evaluation supports performance in a more realistic setting where operators are required to keep pace with a plant evolution.

Test Method: The participants will be asked to perform several control evolutions involving both manual and supervisory control of automated systems. Examples will include cases where the plant dynamics are rapid and cases where the plant dynamics are slow. They will include cases where the control maneuvers go as planned, cases where components fail, and cases where automated systems fail and manual takeover is required.

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill. A workstation-based prototype that is tied to a plant simulation will be used, allowing simulation of several common plant evolutions, for example a plant startup or a change in power level.

Type of Materials: The prototype will present process and control displays built around the plant evolution scenarios identified.

Performance Measures: To assess each participant's ability to use the soft controls efficiently, without error, and while maintaining pace with the simulated event, the sequence of control actions and the time required for each control action will be recorded. To ensure that the participant was able to control the plant evolution, key plant simulation parameters will also be recorded (for example, whether key plant parameters stayed within the target range and limits were avoided). Performance will be videotaped. In addition, subjective judgments will be elicited to identify and rate the severity of problems and to assess subjective workload.
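As a concrete illustration of the plant-parameter check described above, the sketch below scores a recorded parameter trace against a target band and an absolute limit. The function name, band values, and trace are hypothetical; the real test would use the simulator's logged parameters and the target ranges defined for each scenario.

```python
def score_parameter_trace(trace, target_low, target_high, limit):
    """Summarize a recorded plant-parameter trace: fraction of samples inside
    the target band and whether the absolute limit was ever exceeded."""
    in_band = sum(target_low <= x <= target_high for x in trace)
    return {
        "fraction_in_target_band": in_band / len(trace),
        "limit_exceeded": any(x > limit for x in trace),
    }

# Hypothetical steam generator level trace (percent) during a simulated evolution
trace = [51.0, 53.5, 55.2, 58.9, 61.4, 57.0, 54.3]
print(score_parameter_trace(trace, target_low=50.0, target_high=60.0, limit=70.0))
```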

Outcome: This test will assess whether the AP600 soft-control functional design is adequate to support operators in keeping pace with dynamic plant evolutions.

4.2 Workstation Displays

Concept Test 4: Ability to navigate displays to find information

Issues:

- Operator accuracy and efficiency, when given a specific request, in locating and selecting the correct workstation display where that information is located (Evaluation Issue 3)
- Effective operator use of zoom and pan capabilities to locate information
- The coding scheme should indicate active display points where more information is available (e.g., through zoom and pan, or through transitions to another display)
- The coding scheme should be adequate to locate information within a display


Test Method: The participant will be asked to locate a particular piece of information requiring navigation through the displays. Specific information requests will be defined to exercise various available navigation mechanisms.

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill.

Type of Materials: The test will be performed using physical displays and possibly functional displays for the plant simulated in the high-fidelity simulator at Waltz Mill. The displays, while not AP600-specific, will be representative of the AP600 Functional Requirements for workstation displays.

Performance Measures: To assess the performance issues, the following will be recorded:

- Whether the information is located
- Response times
- The navigation path taken to get to the target display
- The navigation path length to get to the target information, compared to the optimal navigation path identified a priori by the designer (see the sketch below)

In addition, subjective judgments will be elicited from the participants regarding the adequacy of display coding conventions and navigation mechanisms.
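To make the path-length comparison concrete, the sketch below computes the designer's optimal path by breadth-first search over a display link structure and reports the ratio of optimal to observed path length. The display names and link structure are hypothetical, not actual AP600 display identifiers.

```python
from collections import deque

def shortest_path_length(display_links, start, target):
    """Minimum number of navigation steps from start to target,
    found by breadth-first search over the display network."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for nxt in display_links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    raise ValueError("target display not reachable")

def navigation_efficiency(observed_path, display_links):
    """Ratio of optimal path length to the path actually taken
    (1.0 means the participant navigated optimally)."""
    optimal = shortest_path_length(display_links, observed_path[0], observed_path[-1])
    return optimal / (len(observed_path) - 1)

# Hypothetical display hierarchy and an observed navigation sequence
links = {"overview": ["rcs", "feedwater"], "rcs": ["rcs-pressure"], "feedwater": []}
print(navigation_efficiency(["overview", "feedwater", "overview", "rcs", "rcs-pressure"], links))
```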

Outcome: This test will assess the adequacy of the AP600 Functional Requirements for display coding and navigation.

Concept Test 5: Coordination of physical and functional displays

Issue: This test addresses the use and coordination of physical and functional displays in supporting situation awareness and response planning (Evaluation Issue 6), including the types of information expected to be drawn from physical and functional displays, when these displays should be accessed, and how physical and functional displays are to be coordinated.

Specific issues include whether the workstation physical and functional displays support the operator in:

- Distinguishing situations where physical displays should be examined from situations where functional displays should be examined
- Understanding interrelationships among systems and processes
- Assessing whether currently active processes are performing correctly
- Assessing whether automated systems are performing correctly
- Assessing goal satisfaction
- Identifying implications of plant state for plant goals
- Assessing the availability of alternative processes for achieving a given goal
- Making choices among alternative means for achieving a given goal (recovery path)
- Assessing the effect of the selected recovery path on other plant goals (side effects)

Test Method: Participants will be presented with several scenarios that require forming a situation assessment and determining a course of action based on examination of physical and functional displays. They will be asked to assess plant state, determine implications for plant goals, determine a course of action, and assess the effect of the selected course of action on other plant goals (determine whether there are any side effects).

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill.

Type of Materials: The test will be performed using physical and functional displays for the plant simulated in the high-fidelity simulator at Waltz Mill. The displays, while not AP600-specific, will be representative of the AP600 Functional Requirements for workstation displays.

Performance Measures: The main measure of performance will be accuracy of response to questions related to situation assessment and response determination and evaluation. The displays that are accessed and the order in which they are accessed will also be recorded and examined to determine whether the participants accessed the physical and functional displays that contained relevant information. Subjective judgments will also be elicited from participants regarding the adequacy of the physical and functional displays for supporting plant-state interpretation and planning.
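A minimal sketch of how the display-access records and question responses might be scored per scenario (the notion of a predefined "relevant display" set and the example values are assumptions made for illustration):

```python
def scenario_scores(accessed_displays, relevant_displays, answers, answer_key):
    """Score one scenario: question accuracy and coverage of relevant displays.

    accessed_displays: ordered list of display names the participant opened
    relevant_displays: set of displays judged a priori to contain the
                       information needed for the scenario
    answers / answer_key: participant responses vs. expected responses
    """
    accuracy = sum(a == k for a, k in zip(answers, answer_key)) / len(answer_key)
    coverage = len(set(accessed_displays) & relevant_displays) / len(relevant_displays)
    return {"accuracy": accuracy, "relevant_display_coverage": coverage}

# Hypothetical example with generic answer labels:
print(scenario_scores(
    accessed_displays=["RCS physical", "heat removal functional"],
    relevant_displays={"heat removal functional", "RCS physical", "CVCS physical"},
    answers=["assessment A", "action B"],
    answer_key=["assessment A", "action C"]))
```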

Outcome: This test will assess the adequacy of the AP600 Functional Requirements for the use and coordination of physical and functional displays.

4.3 Computerized Procedures

Concept Test 6: Usability of computerized procedures

Issue: The computerized procedure design should be adequate to support operators in performing tasks that involve following procedures.


Current Westinghouse EOPs are paper-based and employ a two-column format. Operating experience with current procedures has revealed a number of situations that are difficult for operators to handle and can lead to error, including:

- Complex logical statements and branching found in the response not obtained (RNO) column
- Pending procedure steps
- Nested procedures, in which a separate procedure must be completed before returning to complete the current procedure
- Multiple, independent procedures that must be executed in parallel (for example, when a procedure attachment is handed to one control room operator to perform while the rest of the crew continues with the main procedure)

It is important to determine how effectively computerized procedures handle these difficult situations.

It is also important to determine whether computerized procedures introduce new difficulties not found in current paper-based procedures.


The current Westinghouse HSI design for computer-based procedures offers a new medium with new support features, including support for procedure steps and foldout pages that require continuous monitoring, place-keeping aids, and automatic assessment of whether plant-state requirements specified in the procedure step are met. These support features should facilitate operator performance and reduce the potential for error. However, user tests are required to ensure that no new problems are introduced by the change in technology. The test will determine the impact of a narrower field of view, the impact of display navigation requirements on the ability to look ahead and look back in the procedure, and the impact of the computer automatically determining whether a procedure step is met, which results in a more passive review role for the operator.


This study addresses the types of issues raised in Evaluation Issues 11 through 14. These issues address:

- How the HSI features support the operator in performing control tasks that require assessment of preconditions, side effects, and postconditions (Evaluation Issue 12)
- How the design of the procedure display interfaces prevents operators from getting lost in nested procedures (Evaluation Issue 13)
- How the design of procedure display interfaces supports the concurrent use of multiple, independent (not nested) procedures (such as specific alignment procedures that appear as EOP attachments and are normally handed to the reactor operator or balance-of-plant operator to accomplish) (Evaluation Issue 13)
- How operators are able to perform the next steps (suspend a step) and return to complete the pending step at the appropriate time, in a case where event dynamics are slow (Evaluation Issue 14)


Test Method: Participants will be observed as they respond to several simulated emergency events using computerized procedures. The test objectives will be met by observing utility crews using computerized procedures in the high-fidelity simulator at Waltz Mill as part of acceptance testing of the computerized procedure or as part of crew requalification training.

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill.

Type of Materials: The test will be performed using computerized procedures for the plant simulated in the high-fidelity simulator at Waltz Mill. The computerized procedure, while not AP600-specific, will be representative of the AP600 Functional Requirements for computerized procedures.

Performance Measures: How operators follow procedure steps and keep up with the simulated event will be assessed.

Outcome: The test will assess the adequacy of the AP600 Functional Requirements for computerized procedures.

Concept Test 7: Coordination of computerized procedures with workstation displays and soft controls

Issues:

- The workstation displays and soft controls should be coordinated with procedures to allow efficient location and execution of control actions. (Evaluation Issues 11 and 13)
- The HSI features should support the operator in performing control tasks that require assessment of preconditions, side effects, and postconditions. (Evaluation Issue 12)
- Procedures should be coordinated with workstation physical and functional displays, allowing the operator to keep pace with the event. (Evaluation Issue 14)
- Workstation physical and functional displays and procedures should allow the operator to perform the next steps and return to complete the pending step at the appropriate time in cases where event dynamics are slow. (Evaluation Issue 14)
- The soft controls should support the operator in executing control actions and evaluating feedback in pace with the event. (Evaluation Issue 14)


Test Method: Participants will be asked to perform several simulated tasks using computerized procedures in combination with functional and physical displays and soft control mechanisms. They will be required to take control actions using soft controls.

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill.

The plant simulation will be modified so that control actions can be executed using soft controls.

Type of Materials: The test will be performed using displays and computerized procedures for the plant simulated in the high-fidelity simulator at Waltz Mill. The displays and computerized procedures, while not AP600-specific, will be representative of the AP600 Functional Requirements for workstation displays, soft controls, and computerized procedures. Several scenarios will be defined that exemplify the types of system monitoring and control executions that are performed using these procedures. The scenarios should include cases where plant dynamics are rapid and extensive manual action is required, cases where plant dynamics are slow, and cases where supervisory control of automated systems and manual takeover from automated systems is required. Ideally, there should be overlap between the scenarios included in Concept Tests 3 and 7.

Performance Measures:

To assess the performance issues identified above, the following will be recorded:

- Whether the right display is brought up, correct information located, and correct control action taken
- Response times
- Ability to detect and correct execution errors

In addition to these objective measures of performance, subjective judgments will be obtained from participants regarding HSI design adequacy.
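A sketch of one way each procedure-driven control step could be logged and summarized for these measures (the record fields and step labels are illustrative assumptions, not part of the functional requirements):

```python
from dataclasses import dataclass

@dataclass
class ProcedureStepRecord:
    step_id: str                  # procedure step being executed (hypothetical label)
    correct_display: bool         # right display brought up
    correct_information: bool     # correct information located
    correct_control_action: bool  # correct control action taken
    response_time_s: float        # time from step entry to completed action
    error_made: bool
    error_corrected: bool         # meaningful only when error_made is True

def summarize(records):
    """Aggregate step records into the objective measures listed above."""
    n = len(records)
    return {
        "fraction_fully_correct": sum(
            r.correct_display and r.correct_information and r.correct_control_action
            for r in records) / n,
        "mean_response_time_s": sum(r.response_time_s for r in records) / n,
        "errors_detected_and_corrected": sum(
            r.error_made and r.error_corrected for r in records),
    }

# Hypothetical data for two steps:
records = [ProcedureStepRecord("step 4", True, True, True, 42.0, False, False),
           ProcedureStepRecord("step 5", True, False, True, 88.0, True, True)]
print(summarize(records))
```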

Outcome: This test will assess the adequacy of coordination between workstation displays and soft control mechanisms and procedures.

4.4 Wall Panel Information System (WPIS)

Concept Test 8: WPIS support for situation assessment

Issue: Completeness

The WPIS display should present sufficient information for the operator to maintain awareness of the plant state during different modes of operation. (Evaluation Issue 1)

The WPIS should support the operator in getting more detail about plant status and system availability by directed search of the workstation functional and physical displays. (Evaluation Issue 2)

Initial concern will be with the completeness of the information presented on the WPIS. Also, this test will determine if the WPIS modes are appropriate.

Test Method: Participants will view a WPIS display and be asked to make a series of judgments or decisions regarding current plant state, system availability, and the conduct of a shift turnover.

Testbed Characteristics: Conceptual WPIS designs will be developed as workstation-based prototypes. Testing actual hardware is not a concern at this point. This test may be conducted either in the low-fidelity simulation in the AP600 Main Control Room Test Facility or the high-fidelity simulator at Waltz Mill.

Type of Materials: WPIS displays will be developed to show plant state for the full range of plant modes to be addressed by the WPIS. For each mode represented, a range of plant states will be presented.

Performance Measures: The ability of participants to accurately determine plant state based on information in the WPIS will be assessed. In addition to objective measures of performance, subjective judgments will also be elicited from participants regarding the adequacy of the WPIS design concept.

Outcome: This test will assess the adequacy of the AP600 WPIS functional design for supporting single individuals in maintaining plant situation awareness.

Concept Test 9: Ability of the WPIS to support multiple crew member situation awareness

Issue: The WPIS should support multiple crew member awareness of plant conditions (Evaluation Issue 4).

Specifically:

- The HSI should support the crew in maintaining awareness of plant conditions and their implications.
- The HSI should support the crew in maintaining awareness of each other's actions, intents, and information needs.
- The HSI should support effective and efficient shift turnover.
- The HSI should support new personnel entering the control room in developing an awareness of plant conditions and their implications.

It is critical that the WPIS supports operators in maintaining awareness of the plant state as they operate the plant. It is also critical to show that the WPIS supports operators in maintaining awareness of each other's activities. Evidence of changes in operator behavior motivated by the information presented on the WPIS should be noted. These behavior changes will indicate that the WPIS is communicating effectively in that critical information is salient and timely.

Test Method: The WPIS prototype will be located where it affords easy access for a multi-person crew. For each scenario, the WPIS will indicate an important event that should alter crew operations.

The ability of the crew to assess plant state and perform maneuvers that require shared plant knowledge will be assessed.

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill.

The simulator will be modified so that the WPIS will be driven by dynamic, simulated plant data.

Type of Materials: The test will be performed using a WPIS for the plant simulated in the high-fidelity simulator at Waltz Mill. While the scenarios employed will not be AP600-specific, the WPIS will be representative of the AP600 Functional Requirements for a WPIS.

Performance Measures: Objective measures of the ability of crews to assess plant state and to perform maneuvers that require shared knowledge of plant state will be obtained. In addition, subjective judgments will be obtained from participants regarding the adequacy of the WPIS functional design.

Outcome: This test will assess the adequacy of the AP600 WPIS functional design for supporting group situation awareness.

4.5 Alarm System

Concept Test 10: Alarm System organization and prioritization scheme

Issue: The Alarm System should organize alarm messages so that it facilitates operator understanding of the alarmed state and its implications for the plant operational goals. (Evaluation Issue 5)

Specific issues include:

- The Alarm System presentation format should enable rapid detection and interpretation of alarm messages. (Evaluation Issue 5)
- The Alarm System prioritization scheme should facilitate operator understanding of the relative importance of alarm conditions. (Evaluation Issue 5)
- The Alarm System prioritization scheme, including the management of redundant or low-importance alarm messages, should facilitate operator understanding of the relative importance of alarm conditions. (Evaluation Issue 5)
- The Alarm System should enable operators to identify and interpret the implications of lower priority alarms. (Evaluation Issue 5)

Test Method: Participants will be observed as they respond to the Alarm System during simulated emergencies. The participant will be asked to indicate the alarm messages presented and their priority for response, from most to least important. The participant will also be asked to describe the implications of the alarms for plant safety and productivity goals and any causal interrelationships among alarms. In addition, the participant will be asked to identify other alarm conditions that may have existed but were not displayed on the main alarm panel. Following this, the display of additional alarms will be presented, and the participant will be asked to describe the implications of these alarms.

Testbed Characteristics: This test will be performed in the high-fidelity simulator at Waltz Mill.

Type of Materials: The test will be performed using scenarios for the plant simulated in the high-fidelity simulator at Waltz Mill. The Alarm System, while not AP600-specific, will be representative of the AP600 Alarm System Functional Requirements. A set of plant upset scenarios will be defined that vary in severity from single malfunction events to multiple failure events. Alarm messages will vary in number and level of abstraction. Upsets will include cases where a single fault leads to a cascade of alarms (where the objective is to determine whether the participant can correctly assess the interrelation between the original fault and the consequent disturbances), multiple fault cases (where the objective is to determine the participant's ability to identify, prioritize, and track the implications of multiple functionally unrelated alarms), and cases where additional alarm queues of varying types (redundant versus low priority alarms) and number exist.

Performance Measures: The objective of this test is to determine whether operators are able to correctly interpret and assess the implications of alarm messages. This is assessed through objective performance measures and the participants' subjective assessments.

Objective dependent measures will be collected, including the:

- Number of alarms correctly identified
- Participant's assessment of alarm priorities compared to the priorities assigned during the development of the test scenario
- Extent to which the alarm implications for present and future plant state are correctly assessed
- Extent to which the causal interrelationship among alarms is recognized
- Ability to infer alarms not displayed on the main alarm panel
- Ability to interpret alarms

In addition, subjective judgments will be elicited from participants regarding the adequacy of the Alarm System Functional Design.
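As an illustration of how the priority-ranking comparison might be quantified (a sketch only; the rank-agreement statistic and the alarm labels are choices made here for illustration, not requirements of the plan):

```python
def rank_agreement(participant_order, reference_order):
    """Spearman-style rank agreement between the participant's priority ordering
    and the ordering assigned during scenario development.

    Both lists must contain the same alarms, ordered from most to least
    important. Returns a value in [-1, 1]; 1.0 means identical orderings.
    """
    n = len(reference_order)
    ref_rank = {alarm: i for i, alarm in enumerate(reference_order)}
    part_rank = {alarm: i for i, alarm in enumerate(participant_order)}
    d2 = sum((ref_rank[a] - part_rank[a]) ** 2 for a in reference_order)
    return 1 - (6 * d2) / (n * (n ** 2 - 1))

# Hypothetical alarms, ordered most to least important:
reference = ["SG level low-low", "RCS pressure low", "CCW pump trip", "instrument air low"]
participant = ["RCS pressure low", "SG level low-low", "CCW pump trip", "instrument air low"]
print(round(rank_agreement(participant, reference), 2))  # 0.8
```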

Outcome: This test will assess the adequacy of the AP600 Alarm System Functional Design.

